Studies of uncontrolled air traffic patterns, phase 1
NASA Technical Reports Server (NTRS)
Baxa, E. G., Jr.; Scharf, L. L.; Ruedger, W. H.; Modi, J. A.; Wheelock, S. L.; Davis, C. M.
1975-01-01
The general aviation air traffic flow patterns at uncontrolled airports are investigated and analyzed, and traffic pattern concepts are developed to minimize the midair collision hazard in uncontrolled airspace. An analytical approach to evaluate midair collision hazard probability as a function of traffic densities is established which is basically independent of path structure. Two methods of generating space-time interrelationships between terminal area aircraft are presented; one is a deterministic model to generate pseudorandom aircraft tracks, the other is a statistical model in preliminary form. Some hazard measures are presented for selected traffic densities. It is concluded that the probability of encountering a hazard should be minimized independently of any other considerations and that the number of encounters involving visible-avoidable aircraft should be maximized at the expense of encounters in other categories.
Deterministic models for traffic jams
NASA Astrophysics Data System (ADS)
Nagel, Kai; Herrmann, Hans J.
1993-10-01
We study several deterministic one-dimensional traffic models. For integer positions and velocities we find the typical high- and low-density phases separated by a simple transition. If positions and velocities are continuous variables, the model shows self-organized criticality driven by the slowest car.
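The integer-valued case can be illustrated with a minimal sketch of a deterministic cellular-automaton traffic model on a ring (a Nagel-Schreckenberg-type parallel update without random braking); the parameter values and initialization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def deterministic_step(x, v, vmax, L):
    """One parallel update of a deterministic CA traffic model on a ring of L cells:
    accelerate by one, brake to the gap to the car ahead, then move."""
    order = np.argsort(x)                          # keep cars ordered by position
    x, v = x[order], v[order]
    gaps = (np.roll(x, -1) - x - 1) % L            # empty cells in front of each car
    v = np.minimum(v + 1, np.minimum(vmax, gaps))  # no random braking: fully deterministic
    return (x + v) % L, v

# Example: 30 cars on a 100-cell ring with vmax = 5
rng = np.random.default_rng(1)
x = np.sort(rng.choice(100, size=30, replace=False))
v = np.zeros(30, dtype=int)
for _ in range(200):
    x, v = deterministic_step(x, v, vmax=5, L=100)
print("mean speed:", v.mean())
```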
A queuing model for road traffic simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrouahane, N.; Aissani, D.; Bouallouche-Medjkoune, L.
We present in this article a stochastic queuing model for road traffic. The model is based on the M/G/c/c state-dependent queuing model, and is inspired by the deterministic Godunov scheme for road traffic simulation. We first propose a variant of the M/G/c/c state-dependent model that works with density-flow fundamental diagrams rather than density-speed relationships. We then extend this model in order to consider upstream traffic demand as well as downstream traffic supply. Finally, we show how to model a whole road by concatenating road sections as in the deterministic Godunov scheme.
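For context, the deterministic Godunov scheme referred to here advances cell densities using demand (sending) and supply (receiving) functions derived from the fundamental diagram. The following is a minimal sketch with an assumed triangular density-flow diagram and illustrative parameter values; it is not the paper's stochastic M/G/c/c formulation.

```python
import numpy as np

# Assumed triangular fundamental diagram (illustrative units): free-flow speed VF [km/h],
# congested wave speed W [km/h], jam density RHO_MAX [veh/km].
VF, W, RHO_MAX = 100.0, 20.0, 150.0
RHO_C = W * RHO_MAX / (VF + W)        # critical density where the two branches meet

def flow(rho):
    return np.minimum(VF * rho, W * (RHO_MAX - rho))

def demand(rho):                       # sending capacity of a cell
    return flow(np.minimum(rho, RHO_C))

def supply(rho):                       # receiving capacity of a cell
    return flow(np.maximum(rho, RHO_C))

def godunov_step(rho, dt, dx, q_in, q_out_cap):
    """Advance the densities of concatenated road sections by one Godunov time step."""
    q = np.minimum(demand(rho[:-1]), supply(rho[1:]))   # flows across interior interfaces
    inflow = np.minimum(q_in, supply(rho[0]))           # upstream traffic demand
    outflow = np.minimum(demand(rho[-1]), q_out_cap)    # downstream traffic supply
    flows_in = np.concatenate(([inflow], q))
    flows_out = np.concatenate((q, [outflow]))
    return rho + dt / dx * (flows_in - flows_out)

# Example: a 20-cell road, free upstream and congested downstream, fed at 1500 veh/h
rho = np.concatenate((np.full(10, 20.0), np.full(10, 120.0)))
dt, dx = 5.0 / 3600.0, 0.2             # 5 s steps, 200 m cells (CFL: dt <= dx / VF)
for _ in range(200):
    rho = godunov_step(rho, dt, dx, q_in=1500.0, q_out_cap=1800.0)
```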
NASA Astrophysics Data System (ADS)
Treiber, Martin; Kesting, Arne; Helbing, Dirk
2006-07-01
We investigate the adaptation of the time headways in car-following models as a function of the local velocity variance, which is a measure of the inhomogeneity of traffic flow. We apply this mechanism to several car-following models and simulate traffic breakdowns in open systems with an on-ramp as bottleneck and in a closed ring road. Single-vehicle data and one-minute aggregated data generated by several virtual detectors show a semiquantitative agreement with microscopic and flow-density data from the Dutch freeway A9. This includes the observed distributions of the net time headways for free and congested traffic, the velocity variance as a function of density, and the fundamental diagram. The modal value of the time headway distribution is shifted by a factor of about 2 under congested conditions. Macroscopically, this corresponds to the capacity drop at the transition from free to congested traffic. The simulated fundamental diagram shows free, synchronized, and jammed traffic, and a wide scattering in the congested traffic regime. We explain this by a self-organized variance-driven process that leads to the spontaneous formation and decay of long-lived platoons even for a deterministic dynamics on a single lane.
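The car-following backbone and the headway-adaptation idea can be sketched as follows. The IDM acceleration is the standard formula; the coupling of the time headway to the local velocity variance shown here is only an illustrative stand-in for the adaptation mechanism studied in the paper, and all parameter values are assumptions.

```python
import math

def idm_acceleration(v, dv, s, v0=33.0, T=1.2, a=1.0, b=1.5, s0=2.0, delta=4):
    """Intelligent Driver Model: v own speed, dv = v - v_leader (approach rate), s gap [m]."""
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** delta - (s_star / s) ** 2)

def adapted_time_headway(var_v, T_min=1.1, T_max=2.2, v_scale=5.0):
    """Illustrative adaptation: increase the time headway with the local velocity
    standard deviation, saturating at T_max (purely an assumed functional form)."""
    frac = min(math.sqrt(var_v) / v_scale, 1.0)
    return T_min + frac * (T_max - T_min)

# Example: a follower at 20 m/s, closing at 2 m/s, 30 m gap, in inhomogeneous traffic
T = adapted_time_headway(var_v=9.0)
acc = idm_acceleration(v=20.0, dv=2.0, s=30.0, T=T)
print(f"adapted T = {T:.2f} s, acceleration = {acc:.2f} m/s^2")
```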
Appropriate time scales for nonlinear analyses of deterministic jump systems
NASA Astrophysics Data System (ADS)
Suzuki, Tomoya
2011-06-01
In the real world, there are many phenomena that are derived from deterministic systems but which fluctuate with nonuniform time intervals. This paper discusses the appropriate time scales that can be applied to such systems to analyze their properties. The financial markets are an example of such systems wherein price movements fluctuate with nonuniform time intervals. However, it is common to apply uniform time scales such as 1-min data and 1-h data to study price movements. This paper examines the validity of such time scales by using surrogate data tests to ascertain whether the deterministic properties of the original system can be identified from uniform sampled data. The results show that uniform time samplings are often inappropriate for nonlinear analyses. However, for other systems such as neural spikes and Internet traffic packets, which produce similar outputs, uniform time samplings are quite effective in extracting the system properties. Nevertheless, uniform samplings often generate overlapping data, which can cause false rejections of surrogate data tests.
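The surrogate data tests mentioned here compare a statistic of the observed series against an ensemble of surrogates that preserve the linear correlation structure but destroy any deterministic nonlinearity. A common construction, the phase-randomized Fourier surrogate, can be sketched as follows; this is a generic illustration, not the paper's specific test.

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """FT surrogate: keep the amplitude spectrum, randomize the Fourier phases.
    This destroys deterministic temporal structure while preserving linear correlations."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(X))
    phases[0] = 0.0                      # keep the mean unchanged
    if len(x) % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x))

# Example: surrogate of a short deterministic (logistic-map) series
x = np.empty(1024); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
x_surr = phase_randomized_surrogate(x)
```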
DOT National Transportation Integrated Search
2009-08-01
Federal Aviation Administration (FAA) air traffic flow management (TFM) : decision-making is based primarily on a comparison of deterministic predictions of demand : and capacity at National Airspace System (NAS) elements such as airports, fixes and ...
Classification and unification of the microscopic deterministic traffic models.
Yang, Bo; Monterola, Christopher
2015-10-01
We identify a universal mathematical structure in microscopic deterministic traffic models (with identical drivers), and thus we show that all such existing models in the literature, including both the two-phase and three-phase models, can be understood as special cases of a master model by expansion around a set of well-defined ground states. This allows any two traffic models to be properly compared and identified. The three-phase models are characterized by the vanishing of leading orders of expansion within a certain density range, and as an example the popular intelligent driver model is shown to be equivalent to a generalized optimal velocity (OV) model. We also explore the diverse solutions of the generalized OV model that can be important both for understanding human driving behaviors and algorithms for autonomous driverless vehicles.
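The optimal velocity (OV) family referred to here relaxes a vehicle's speed toward a gap-dependent optimal velocity; generalized OV models add further terms (for instance in the relative velocity). A minimal sketch of the basic OV model, with an illustrative optimal-velocity function and assumed parameters:

```python
import numpy as np

def optimal_velocity(gap, v0=30.0, s_c=25.0, width=10.0):
    """A commonly used sigmoidal optimal-velocity function (illustrative parameters)."""
    return 0.5 * v0 * (np.tanh((gap - s_c) / width) + np.tanh(s_c / width))

def ov_acceleration(v, gap, kappa=0.6):
    """Basic OV model: dv/dt = kappa * (V_opt(gap) - v)."""
    return kappa * (optimal_velocity(gap) - v)

# Example: a car at 10 m/s with a 20 m gap relaxes toward V_opt(20 m)
print(ov_acceleration(v=10.0, gap=20.0))
```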
Automated Flight Routing Using Stochastic Dynamic Programming
NASA Technical Reports Server (NTRS)
Ng, Hok K.; Morando, Alex; Grabbe, Shon
2010-01-01
Airspace capacity reduction due to convective weather impedes air traffic flows and causes traffic congestion. This study presents an algorithm that reroutes flights in the presence of winds, en route convective weather, and congested airspace based on stochastic dynamic programming. A stochastic disturbance model incorporates capacity uncertainty into the reroute design process. A trajectory-based airspace demand model is employed for calculating current and future airspace demand. The optimal routes minimize the total expected traveling time, weather incursion, and induced congestion costs. They are compared to weather-avoidance routes calculated using deterministic dynamic programming. The stochastic reroutes have a smaller deviation probability than their deterministic counterparts when both reroutes have similar total flight distance. The stochastic rerouting algorithm takes into account all convective weather fields at all severity levels, while the deterministic algorithm only accounts for convective weather systems exceeding a specified level of severity. When the stochastic reroutes are compared to the actual flight routes, they have similar total flight time, and both have about 1% of travel time crossing congested en route sectors on average. The actual flight routes induce slightly less traffic congestion than the stochastic reroutes but intercept more severe convective weather.
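A minimal sketch of the kind of backward recursion such a stochastic dynamic program uses, over a trellis of candidate waypoints: the disturbance model here (a blockage probability that adds an expected penalty to each transition) and all names are illustrative assumptions, not the study's formulation.

```python
import numpy as np

def stochastic_route_dp(stage_costs, p_blocked, blocked_penalty):
    """Backward dynamic programming over stages of candidate waypoints.

    stage_costs[k][i, j]: deterministic travel cost from node i at stage k to node j at stage k+1.
    p_blocked[k][i, j]:   probability that this transition is impacted by convective weather.
    Expected transition cost = cost + p * penalty (illustrative disturbance model)."""
    J = np.zeros(stage_costs[-1].shape[1])            # terminal cost-to-go
    policy = []
    for k in reversed(range(len(stage_costs))):
        q = stage_costs[k] + p_blocked[k] * blocked_penalty + J[None, :]
        policy.append(np.argmin(q, axis=1))           # best next waypoint from each node
        J = np.min(q, axis=1)
    policy.reverse()
    return J, policy

# Example with two stages of 2x2 transitions (purely illustrative numbers)
costs = [np.array([[1.0, 2.0], [2.0, 1.0]]), np.array([[1.5, 3.0], [2.5, 1.0]])]
p_bad = [np.array([[0.0, 0.4], [0.1, 0.0]]), np.array([[0.2, 0.0], [0.0, 0.3]])]
J, policy = stochastic_route_dp(costs, p_bad, blocked_penalty=10.0)
```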
NASA Technical Reports Server (NTRS)
Simpson, Robert W.
1991-01-01
Brief summaries are given of research activities at the Massachusetts Institute of Technology (MIT) under the sponsorship of the FAA/NASA Joint University Program. Topics covered include hazard assessment and cockpit presentation issues for microburst alerting systems; the situational awareness effect of automated air traffic control (ATC) datalink clearance amendments; a graphical simulation system for adaptive, automated approach spacing; an expert system for temporal planning with application to runway configuration management; deterministic multi-zone ice accretion modeling; alert generation and cockpit presentation for an integrated microburst alerting system; and passive infrared ice detection for helicopter applications.
Density waves in granular flow
NASA Astrophysics Data System (ADS)
Herrmann, H. J.; Flekkøy, E.; Nagel, K.; Peng, G.; Ristow, G.
Ample experimental evidence has shown the existence of spontaneous density waves in granular material flowing through pipes or hoppers. Using Molecular Dynamics Simulations we show that several types of waves exist and find that these density fluctuations follow a 1/f spectrum. We compare this behaviour to deterministic one-dimensional traffic models. If positions and velocities are continuous variables the model shows self-organized criticality driven by the slowest car. We also present Lattice Gas and Boltzmann Lattice Models which reproduce the experimentally observed effects. Density waves are spontaneously generated when the viscosity has a nonlinear dependence on density which characterizes granular flow.
Transformation formulas relating geodetic coordinates to a tangent-to-Earth plane coordinate system
NASA Technical Reports Server (NTRS)
Credeur, L.
1981-01-01
Formulas and their approximations were developed to map geodetic position to an Earth-tangent plane with an airport-centered rectangular coordinate system. The transformations were developed for use in a terminal area air traffic model with deterministic aircraft traffic. The exact and approximation equations are those used in the terminal configured vehicle's precision microwave landing system navigation experiments.
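For reference, the standard construction of an airport-centered tangent-plane (East-North-Up) coordinate system from geodetic coordinates looks like the following; this is the generic WGS-84 formulation, not necessarily the exact formulas of the 1981 report.

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0                  # semi-major axis [m]
F = 1.0 / 298.257223563        # flattening
E2 = F * (2.0 - F)             # first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Geodetic latitude/longitude [deg] and height [m] to Earth-centered Cartesian."""
    lat, lon = np.radians(lat), np.radians(lon)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)      # prime vertical radius of curvature
    return np.array([(n + h) * np.cos(lat) * np.cos(lon),
                     (n + h) * np.cos(lat) * np.sin(lon),
                     (n * (1.0 - E2) + h) * np.sin(lat)])

def geodetic_to_tangent_plane(lat, lon, h, lat0, lon0, h0):
    """East-North-Up coordinates relative to an airport-centered tangent plane at (lat0, lon0, h0)."""
    d = geodetic_to_ecef(lat, lon, h) - geodetic_to_ecef(lat0, lon0, h0)
    la, lo = np.radians(lat0), np.radians(lon0)
    east = np.array([-np.sin(lo), np.cos(lo), 0.0])
    north = np.array([-np.sin(la) * np.cos(lo), -np.sin(la) * np.sin(lo), np.cos(la)])
    up = np.array([np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo), np.sin(la)])
    return np.array([east @ d, north @ d, up @ d])

# Example: an aircraft 10 km north-east of a hypothetical airport reference point
print(geodetic_to_tangent_plane(37.70, -122.30, 900.0, 37.62, -122.38, 4.0))
```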
Chaotic Ising-like dynamics in traffic signals
Suzuki, Hideyuki; Imura, Jun-ichi; Aihara, Kazuyuki
2013-01-01
The green and red lights of a traffic signal can be viewed as the up and down states of an Ising spin. Moreover, traffic signals in a city interact with each other, if they are controlled in a decentralised way. In this paper, a simple model of such interacting signals on a finite-size two-dimensional lattice is shown to have Ising-like dynamics that undergoes a ferromagnetic phase transition. Probabilistic behaviour of the model is realised by chaotic billiard dynamics that arises from coupled non-chaotic elements. This purely deterministic model is expected to serve as a starting point for considering statistical mechanics of traffic signals. PMID:23350034
Improvement of driving safety in road traffic system
NASA Astrophysics Data System (ADS)
Li, Ke-Ping; Gao, Zi-You
2005-05-01
A road traffic system is a complex system in which humans participate directly. In this system, human factors play a very important role. In this paper, a kind of control signal is designated at a given site (i.e., signal point) of the road. Under the effect of the control signal, the drivers will decrease their velocities when their vehicles pass the signal point. Our aim is to transition the traffic flow states from disorder to order and thereby improve traffic safety. We have tested this technique for the two-lane traffic model that is based on the deterministic Nagel-Schreckenberg (NaSch) traffic model. The simulation results indicate that the traffic flow states can be transitioned from disorder to order. Different order states can be observed in the system and these states are safer.
Automated Guideway Network Traffic Modeling
DOT National Transportation Integrated Search
1972-02-01
In the literature concerning automated guideway transportation systems, such as dual mode, a great deal of effort has been expended on the use of deterministic reservation schemes and the problem of merging streams of vehicles. However, little attent...
NASA Astrophysics Data System (ADS)
Zhang, Xunxun; Xu, Hongke; Fang, Jianwu
2018-01-01
Along with the rapid development of unmanned aerial vehicle technology, multiple vehicle tracking (MVT) in aerial video sequences has received widespread interest for providing the required traffic information. Due to camera motion and complex backgrounds, MVT in aerial video sequences poses unique challenges. We propose an efficient MVT algorithm via a driver behavior-based Kalman filter (DBKF) and an improved deterministic data association (IDDA) method. First, a hierarchical image registration method is put forward to compensate for the camera motion. Afterward, to improve the accuracy of the state estimation, we propose the DBKF module by incorporating driver behavior into the Kalman filter, where an artificial potential field is introduced to reflect the driver behavior. Then, to implement the data association, a local optimization method is designed instead of global optimization. By introducing an adaptive operating strategy, the proposed IDDA method can also deal with the situation in which vehicles suddenly appear or disappear. Finally, comprehensive experiments on the DARPA VIVID data set and KIT AIS data set demonstrate that the proposed algorithm can generate satisfactory and superior results.
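The structure of such a filter can be sketched as a standard constant-velocity Kalman filter whose predict step accepts an extra acceleration term standing in for the driver-behavior input; how the paper derives that term from an artificial potential field is not reproduced here, and all matrices and noise levels below are illustrative assumptions.

```python
import numpy as np

def kf_predict(x, P, dt, a_driver, q=1.0):
    """Predict step of a constant-velocity Kalman filter (state: px, py, vx, vy) with an
    extra acceleration input a_driver standing in for the driver-behavior term."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    B = np.array([[0.5 * dt**2, 0],
                  [0, 0.5 * dt**2],
                  [dt, 0],
                  [0, dt]], float)
    Q = q * np.eye(4)                      # assumed process noise
    return F @ x + B @ a_driver, F @ P @ F.T + Q

def kf_update(x, P, z, r=1.0):
    """Update with a position measurement z = (u, v) from the registered frame."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)
    R = r * np.eye(2)                      # assumed measurement noise
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Example: one predict/update cycle for a vehicle track
x, P = np.array([0.0, 0.0, 10.0, 0.0]), np.eye(4)
x, P = kf_predict(x, P, dt=0.1, a_driver=np.array([0.5, -0.2]))
x, P = kf_update(x, P, z=np.array([1.05, -0.02]))
```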
Evaluation of Fast-Time Wake Vortex Models using Wake Encounter Flight Test Data
NASA Technical Reports Server (NTRS)
Ahmad, Nashat N.; VanValkenburg, Randal L.; Bowles, Roland L.; Limon Duparcmeur, Fanny M.; Gloudesman, Thijs; van Lochem, Sander; Ras, Eelco
2014-01-01
This paper describes a methodology for the integration and evaluation of fast-time wake models with flight data. The National Aeronautics and Space Administration conducted detailed flight tests in 1995 and 1997 under the Aircraft Vortex Spacing System Program to characterize wake vortex decay and wake encounter dynamics. In this study, data collected during Flight 705 were used to evaluate NASA's fast-time wake transport and decay models. Deterministic and Monte-Carlo simulations were conducted to define wake hazard bounds behind the wake generator. The methodology described in this paper can be used for further validation of fast-time wake models using en-route flight data, and for determining wake turbulence constraints in the design of air traffic management concepts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goreac, Dan, E-mail: Dan.Goreac@u-pem.fr; Kobylanski, Magdalena, E-mail: Magdalena.Kobylanski@u-pem.fr; Martinez, Miguel, E-mail: Miguel.Martinez@u-pem.fr
2016-10-15
We study optimal control problems in infinite horizon when the dynamics belong to a specific class of piecewise deterministic Markov processes constrained to star-shaped networks (corresponding to a toy traffic model). We adapt the results in Soner (SIAM J Control Optim 24(6):1110–1122, 1986) to prove the regularity of the value function and the dynamic programming principle. Extending the networks and Krylov’s “shaking the coefficients” method, we prove that the value function can be seen as the solution to a linearized optimization problem set on a convenient set of probability measures. The approach relies entirely on viscosity arguments. As a by-product, the dual formulation guarantees that the value function is the pointwise supremum over regular subsolutions of the associated Hamilton–Jacobi integrodifferential system. This ensures that the value function satisfies Perron’s preconization for the (unique) candidate to viscosity solution.
Thermostatted kinetic equations as models for complex systems in physics and life sciences.
Bianca, Carlo
2012-12-01
Statistical mechanics is a powerful method for understanding equilibrium thermodynamics. An equivalent theoretical framework for nonequilibrium systems has remained elusive. The thermodynamic forces driving the system away from equilibrium introduce energy that must be dissipated if nonequilibrium steady states are to be obtained. Historically, further terms were introduced, collectively called a thermostat, whose original application was to generate constant-temperature equilibrium ensembles. This review surveys kinetic models coupled with time-reversible deterministic thermostats for the modeling of large systems composed of both inert matter particles and living entities. The introduction of deterministic thermostats makes it possible to model the onset of the nonequilibrium stationary states that are typical of most real-world complex systems. The first part of the paper is focused on a general presentation of the main physical and mathematical definitions and tools: nonequilibrium phenomena, the Gauss least constraint principle and Gaussian thermostats. The second part provides a review of a variety of thermostatted mathematical models in physics and life sciences, including the Kac, Boltzmann and Jager-Segel models and the thermostatted (continuous and discrete) kinetic models for active particles. Applications refer to semiconductor devices, nanosciences, biological phenomena, vehicular traffic, social and economic systems, and crowd and swarm dynamics.
Self-Organized Criticality and Scaling in Lifetime of Traffic Jams
NASA Astrophysics Data System (ADS)
Nagatani, Takashi
1995-01-01
The deterministic cellular automaton 184 (the one-dimensional asymmetric simple-exclusion model with parallel dynamics) is extended to take into account injection or extraction of particles. The model represents the traffic flow on a highway with inflow or outflow of cars. Introducing injection or extraction of particles into the asymmetric simple-exclusion model drives the system asymptotically into a steady state exhibiting a self-organized criticality. The typical lifetime
LETTER TO THE EDITOR: Backbones of traffic jams
NASA Astrophysics Data System (ADS)
Shikhar Gupta, Himadri; Ramaswamy, Ramakrishna
1996-11-01
We study the jam phase of the deterministic traffic model in two dimensions. Within the jam phase, there is a phase transition from a self-organized jam (formed by initial synchronization followed by jamming) to a random-jam structure. The backbone of the jam is defined and used to analyse self-organization in the jam. The fractal dimension and interparticle correlations on the backbone indicate a continuous phase transition at a critical density with an associated critical exponent, both characterized through simulations.
A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments
NASA Astrophysics Data System (ADS)
Gokhale, Sharad; Khare, Mukesh
Several deterministic air quality models evaluate and predict the frequently occurring pollutant concentrations well but, in general, are incapable of predicting the 'extreme' concentrations. In contrast, statistical distribution models overcome this limitation of the deterministic models and predict the 'extreme' concentrations. However, environmental damage is caused both by the extremes and by the sustained average concentration of pollutants. Hence, a model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one technique that estimates/predicts the 'entire range' of the distribution of pollutant concentrations by combining deterministic models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict the carbon monoxide (CO) concentration distributions at one of the traffic intersections, the Income Tax Office (ITO), in Delhi, where the traffic is heterogeneous in nature and the meteorology is 'tropical'. The traffic consists of light vehicles, heavy vehicles, three-wheelers (auto rickshaws) and two-wheelers (scooters, motorcycles, etc.). The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, the Sirifort roadway. The validation results show that the model predicts CO concentrations fairly well (d = 0.91) in the 10-95 percentile range. A regulatory compliance analysis is also developed to estimate the probability of hourly CO concentrations exceeding the National Ambient Air Quality Standards (NAAQS) of India.
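The statistical component can be illustrated with a short sketch: fit a log-logistic distribution to hourly CO concentrations and compute the exceedance probability against a threshold. The synthetic data, the 4 mg/m^3 threshold and the use of scipy's `fisk` (log-logistic) distribution are illustrative assumptions, not the paper's data, standard, or GFLSM component.

```python
import numpy as np
from scipy import stats

# Illustrative stand-in for observed hourly CO concentrations [mg/m^3]
rng = np.random.default_rng(0)
co_obs = rng.lognormal(mean=1.0, sigma=0.5, size=1000)

# Fit a log-logistic distribution (scipy's 'fisk') to the observed concentrations
c, loc, scale = stats.fisk.fit(co_obs, floc=0.0)

# Probability of exceeding a hypothetical hourly threshold of 4 mg/m^3
threshold = 4.0
p_exceed = stats.fisk.sf(threshold, c, loc=loc, scale=scale)
print(f"P(CO > {threshold} mg/m^3) = {p_exceed:.3f}")
```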
DOT National Transportation Integrated Search
1981-10-01
Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...
A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.
Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming
2015-01-01
Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computation experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity.
Conflict Detection and Resolution for Future Air Transportation Management
NASA Technical Reports Server (NTRS)
Krozel, Jimmy; Peters, Mark E.; Hunter, George
1997-01-01
With a Free Flight policy, the emphasis for air traffic control is shifting from active control to passive air traffic management with a policy of intervention by exception. Aircraft will be allowed to fly user preferred routes, as long as safety Alert Zones are not violated. If there is a potential conflict, two (or more) aircraft must be able to arrive at a solution for conflict resolution without controller intervention. Thus, decision aid tools are needed in Free Flight to detect and resolve conflicts, and several problems must be solved to develop such tools. In this report, we analyze and solve problems of proximity management, conflict detection, and conflict resolution under a Free Flight policy. For proximity management, we establish a system based on Delaunay Triangulations of aircraft at constant flight levels. Such a system provides a means for analyzing the neighbor relationships between aircraft and the nearby free space around air traffic which can be utilized later in conflict resolution. For conflict detection, we perform both 2-dimensional and 3-dimensional analyses based on the penetration of the Protected Airspace Zone. Both deterministic and non-deterministic analyses are performed. We investigate several types of conflict warnings including tactical warnings prior to penetrating the Protected Airspace Zone, methods based on the reachability overlap of both aircraft, and conflict probability maps to establish strategic Alert Zones around aircraft.
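A deterministic 2-D conflict probe of the kind described (penetration of a protected airspace zone by straight-line trajectories) reduces to a closest-point-of-approach test; the separation radius and look-ahead horizon below are illustrative values, not the report's.

```python
import numpy as np

def horizontal_conflict(p1, v1, p2, v2, sep=9.26e3, horizon=1200.0):
    """Deterministic 2-D conflict probe: do two aircraft flying straight at constant
    velocity come within the protected-zone radius `sep` (metres, roughly 5 NM here)
    within the look-ahead `horizon` (seconds)?"""
    dp = np.asarray(p2, float) - np.asarray(p1, float)
    dv = np.asarray(v2, float) - np.asarray(v1, float)
    if np.allclose(dv, 0.0):
        t_cpa = 0.0                               # constant separation
    else:
        t_cpa = float(np.clip(-dp @ dv / (dv @ dv), 0.0, horizon))
    d_cpa = float(np.linalg.norm(dp + dv * t_cpa))
    return d_cpa < sep, t_cpa, d_cpa

# Example: two aircraft on crossing tracks (positions in metres, speeds in m/s)
print(horizontal_conflict([0.0, 0.0], [230.0, 0.0], [80e3, -40e3], [0.0, 220.0]))
```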
Developing a stochastic traffic volume prediction model for public-private partnership projects
NASA Astrophysics Data System (ADS)
Phong, Nguyen Thanh; Likhitruangsilp, Veerasak; Onishi, Masamitsu
2017-11-01
Transportation projects require an enormous amount of capital investment resulting from their tremendous size, complexity, and risk. Due to the limitation of public finances, the private sector is invited to participate in transportation project development. The private sector can entirely or partially invest in transportation projects in the form of the Public-Private Partnership (PPP) scheme, which has been an attractive option for several developing countries, including Vietnam. There are many factors affecting the success of PPP projects. The accurate prediction of traffic volume is considered one of the key success factors of PPP transportation projects. However, only a few research works have investigated how to predict traffic volume over a long period of time. Moreover, conventional traffic volume forecasting methods are usually based on deterministic models which predict a single value of traffic volume but do not consider risk and uncertainty. This knowledge gap makes it difficult for concessionaires to estimate PPP transportation project revenues accurately. The objective of this paper is to develop a probabilistic traffic volume prediction model. First, traffic volumes were modeled following a Geometric Brownian Motion (GBM) process. The Monte Carlo technique is then applied to simulate different scenarios. The results show that this stochastic approach can systematically analyze variations in the traffic volume and yield more reliable estimates for PPP projects.
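A minimal sketch of such a GBM-based Monte Carlo traffic forecast follows; the initial volume, drift, volatility and horizon are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def simulate_gbm_traffic(q0, mu, sigma, years, n_paths, seed=0):
    """Monte Carlo simulation of annual traffic volume following a GBM,
    dQ = mu*Q dt + sigma*Q dW, discretized with the exact log-normal step."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n_paths, years))
    log_steps = (mu - 0.5 * sigma**2) + sigma * z      # dt = 1 year
    return q0 * np.exp(np.cumsum(log_steps, axis=1))

# Example: 25-year concession, 10,000 scenarios (all numbers illustrative)
paths = simulate_gbm_traffic(q0=20_000, mu=0.05, sigma=0.15, years=25, n_paths=10_000)
p10, p50, p90 = np.percentile(paths[:, -1], [10, 50, 90])
print(f"Year-25 daily traffic: P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")
```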
A theoretical framework for the episodic-urban air quality management plan ( e-UAQMP)
NASA Astrophysics Data System (ADS)
Gokhale, Sharad; Khare, Mukesh
The present research proposes a local urban air quality management plan which combines two different modelling approaches (a hybrid model) and possesses an improved predictive ability, including the 'probabilistic exceedances over norms' and their 'frequency of occurrences'; it is therefore termed, herein, the episodic-urban air quality management plan (e-UAQMP). The e-UAQMP deals with the consequences of 'extreme' concentrations of pollutants, mainly occurring at urban 'hotspots', e.g. traffic junctions, intersections and signalized roadways, which are also influenced by the complexities of traffic-generated 'wake' effects. The e-UAQMP (based on a probabilistic approach) also acts as an efficient preventive measure by predicting the 'probability of exceedances', so as to prepare successful policy responses for the protection of the urban environment as well as disseminating information to its sensitive 'receptors'. The e-UAQMP may be tailored to the requirements of the local area for policy implementation programmes. The importance of such a policy-making framework in the context of current air pollution 'episodes' in urban environments is discussed. The hybrid model, which is based on both deterministic and stochastic approaches and predicts the 'average' as well as the 'extreme' concentration distributions of air pollutants together in the form of probabilities, has been used at two air quality control regions (AQCRs) in Delhi, India, in formulating and executing the e-UAQMP: first, the Income Tax Office (ITO), one of the busiest signalized traffic intersections, and second, the Sirifort, one of the busiest signalized roadways.
NASA Astrophysics Data System (ADS)
Yang, Bo; Yoon, Ji Wei; Monterola, Christopher
We present a large-scale, detailed analysis of microscopic empirical data of congested traffic flow, focusing on the non-linear interactions between the components of the many-body traffic system. By implementing a systematic procedure that averages over relatively unimportant factors, we extract the effective dependence of the acceleration on the gap between the vehicles, the velocity and the relative velocity. Such a relationship is characterised not just by a few vehicles but by the traffic system as a whole. Several interesting features of the detailed vehicle-to-vehicle interactions are revealed, including the stochastic distribution of the human responses, the relative importance of the non-linear terms in different density regimes, the symmetric response to the relative velocity, and the insensitivity of the acceleration to the velocity within a certain gap and velocity range. The latter leads to a multitude of steady states without a fundamental diagram. The empirically constructed functional dependence of the acceleration on the important dynamical quantities not only gives the detailed collective driving behaviours of the traffic system, it also serves as a fundamental reference for the validation of the deterministic and stochastic microscopic traffic models in the literature.
Real-time adaptive aircraft scheduling
NASA Technical Reports Server (NTRS)
Kolitz, Stephan E.; Terrab, Mostafa
1990-01-01
One of the most important functions of any air traffic management system is the assignment of ground-holding times to flights, i.e., the determination of whether and by how much the take-off of a particular aircraft headed for a congested part of the air traffic control (ATC) system should be postponed in order to reduce the likelihood and extent of airborne delays. An analysis is presented for the fundamental case in which flights from many destinations must be scheduled for arrival at a single congested airport; the formulation is also useful in scheduling the landing of airborne flights within the extended terminal area. A set of approaches is described for addressing a deterministic and a probabilistic version of this problem. For the deterministic case, where airport capacities are known and fixed, several models were developed with associated low-order polynomial-time algorithms. For general delay cost functions, these algorithms find an optimal solution. Under a particular natural assumption regarding the delay cost function, an extremely fast (O(n ln n)) algorithm was developed. For the probabilistic case, using an estimated probability distribution of airport capacities, a model was developed with an associated low-order polynomial-time heuristic algorithm with useful properties.
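For the deterministic single-airport case, the flavor of such an algorithm can be sketched as a first-come-first-served assignment of flights to capacity-constrained arrival periods. This is a simplified illustration assuming identical, nondecreasing delay costs; the report's algorithms handle general delay cost functions and the probabilistic capacity case.

```python
def assign_ground_holds(scheduled, capacity):
    """Assign each flight the earliest arrival period at or after its scheduled period
    that still has capacity, processing flights in scheduled order (FCFS).

    scheduled: scheduled arrival period index for each flight.
    capacity:  capacity[t] = arrivals the airport can accept in period t
               (must include enough trailing periods to absorb all flights).
    Returns the ground hold, in periods, for each flight in scheduled order."""
    remaining = list(capacity)
    holds = []
    for s in sorted(scheduled):
        t = s
        while remaining[t] == 0:   # period already full: hold the flight on the ground
            t += 1
        remaining[t] -= 1
        holds.append(t - s)
    return holds

# Example: 6 flights, airport accepts 2 arrivals per period
print(assign_ground_holds([0, 0, 0, 1, 1, 2], capacity=[2, 2, 2, 2]))  # -> [0, 0, 1, 0, 1, 0]
```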
Deterministic generation of multiparticle entanglement by quantum Zeno dynamics.
Barontini, Giovanni; Hohmann, Leander; Haas, Florian; Estève, Jérôme; Reichel, Jakob
2015-09-18
Multiparticle entangled quantum states, a key resource in quantum-enhanced metrology and computing, are usually generated by coherent operations exclusively. However, unusual forms of quantum dynamics can be obtained when environment coupling is used as part of the state generation. In this work, we used quantum Zeno dynamics (QZD), based on nondestructive measurement with an optical microcavity, to deterministically generate different multiparticle entangled states in an ensemble of 36 qubit atoms in less than 5 microseconds. We characterized the resulting states by performing quantum tomography, yielding a time-resolved account of the entanglement generation. In addition, we studied the dependence of quantum states on measurement strength and quantified the depth of entanglement. Our results show that QZD is a versatile tool for fast and deterministic entanglement generation in quantum engineering applications.
Active temporal multiplexing of indistinguishable heralded single photons
Xiong, C.; Zhang, X.; Liu, Z.; Collins, M. J.; Mahendra, A.; Helt, L. G.; Steel, M. J.; Choi, D. -Y.; Chae, C. J.; Leong, P. H. W.; Eggleton, B. J.
2016-01-01
It is a fundamental challenge in quantum optics to deterministically generate indistinguishable single photons through non-deterministic nonlinear optical processes, due to the intrinsic coupling of single- and multi-photon-generation probabilities in these processes. Actively multiplexing photons generated in many temporal modes can decouple these probabilities, but key issues are to minimize resource requirements to allow scalability, and to ensure indistinguishability of the generated photons. Here we demonstrate the multiplexing of photons from four temporal modes solely using fibre-integrated optics and off-the-shelf electronic components. We show a 100% enhancement to the single-photon output probability without introducing additional multi-photon noise. Photon indistinguishability is confirmed by a fourfold Hong–Ou–Mandel quantum interference with a 91±16% visibility after subtracting multi-photon noise due to high pump power. Our demonstration paves the way for scalable multiplexing of many non-deterministic photon sources to a single near-deterministic source, which will be of benefit to future quantum photonic technologies. PMID:26996317
The fully actuated traffic control problem solved by global optimization and complementarity
NASA Astrophysics Data System (ADS)
Ribeiro, Isabel M.; de Lurdes de Oliveira Simões, Maria
2016-02-01
Global optimization and complementarity are used to determine the signal timing for fully actuated traffic control, in terms of effective green and red times on each cycle. The average values of these parameters can be used to estimate the control delay of vehicles. In this article, a two-phase queuing system for a signalized intersection is outlined, based on the principle of minimization of the total waiting time for the vehicles. The underlying model results in a linear program with linear complementarity constraints, solved by a sequential complementarity algorithm. Departure rates of vehicles during green and yellow periods were treated as deterministic, while arrival rates of vehicles were assumed to follow a Poisson distribution. Several traffic scenarios were created and solved. The numerical results reveal that it is possible to use global optimization and complementarity over a reasonable number of cycles and to efficiently determine effective green and red times for a signalized intersection.
Smart-Grid Backbone Network Real-Time Delay Reduction via Integer Programming.
Pagadrai, Sasikanth; Yilmaz, Muhittin; Valluri, Pratyush
2016-08-01
This research investigates an optimal delay-based virtual topology design using integer linear programming (ILP), which is applied to the current backbone networks such as smart-grid real-time communication systems. A network traffic matrix is applied and the corresponding virtual topology problem is solved using the ILP formulations that include a network delay-dependent objective function and lightpath routing, wavelength assignment, wavelength continuity, flow routing, and traffic loss constraints. The proposed optimization approach provides an efficient deterministic integration of intelligent sensing and decision making, and network learning features for superior smart grid operations by adaptively responding the time-varying network traffic data as well as operational constraints to maintain optimal virtual topologies. A representative optical backbone network has been utilized to demonstrate the proposed optimization framework whose simulation results indicate that superior smart-grid network performance can be achieved using commercial networks and integer programming.
Implementation of a tactical voice/data network over FDDI. [Fiber Distributed Data Interface
NASA Technical Reports Server (NTRS)
Bergman, L. A.; Halloran, F.; Martinez, J.
1988-01-01
An asynchronous high-speed fiber-optic local-area network is described that simultaneously supports packet data traffic with synchronous T1 voice traffic over a standard asynchronous FDDI (fiber distributed data interface) token-ring channel. A voice interface module was developed that parses, buffers, and resynchronizes the voice data to the packet network. The technique is general, however, and can be applied to any deterministic class of networks, including multitier backbones. In addition, the higher layer packet data protocols may operate independently of those for the voice, thereby permitting great flexibility in reconfiguring the network. Voice call setup and switching functions are performed external to the network with PABX equipment.
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques to generate synthetic radiographic images of high resolution are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore explored, with the aim of analyzing the computational time decrease over the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 compared to the MCNP stochastic approach while retaining image quality.
Sensitivity analysis in a Lassa fever deterministic mathematical model
NASA Astrophysics Data System (ADS)
Abdullahi, Mohammed Baba; Doko, Umar Chado; Mamuda, Mamman
2015-05-01
Lassa virus, which causes Lassa fever, is on the list of potential bio-weapons agents. It was recently imported into Germany, the Netherlands, the United Kingdom and the United States as a consequence of the rapid growth of international traffic. A model with five mutually exclusive compartments related to Lassa fever is presented and the basic reproduction number analyzed. A sensitivity analysis of the deterministic model is performed. This is done in order to determine the relative importance of the model parameters to the disease transmission. The result of the sensitivity analysis shows that the most sensitive parameter is human immigration, followed by the human recovery rate, and then person-to-person contact. This suggests that control strategies should target human immigration, effective drugs for treatment, and education to reduce person-to-person contact.
NASA Technical Reports Server (NTRS)
Huber, Hans
2006-01-01
Air transport forms complex networks that can be measured in order to understand their structural characteristics and functional properties. Recent models for network growth (e.g., preferential attachment) remain stochastic and do not seek to understand other network-specific mechanisms that may account for their development in a more microscopic way. Air traffic is made up of many constituent airlines that are either privately or publicly owned and that operate their own networks. They each follow more or less similar business policies. The way these airline networks organize among themselves into distinct traffic distributions reveals complex interaction among them, which in turn can be aggregated into larger (macro-) traffic distributions. Our approach allows for a more deterministic methodology that assesses the impact of airline strategies on the distinct distributions of air traffic, particularly inside Europe. One key question this paper seeks to answer is whether there are distinct patterns of preferential attachment for given classes of airline networks to distinct types of European airports. Conclusions about the advancing degree of concentration in this industry and the airline operators that accelerate this process can be drawn.
NASA Astrophysics Data System (ADS)
Huang, Jinhui; Liu, Wenxiang; Su, Yingxue; Wang, Feixue
2018-02-01
Space networks, in which connectivity is deterministic and intermittent, can be modeled by delay/disruption tolerant networks. In space delay/disruption tolerant networks, a packet is usually transmitted from the source node to the destination node indirectly via a series of relay nodes. If any one of the nodes in the path becomes congested, the packet will be dropped due to buffer overflow. One of the main reasons behind congestion is the unbalanced network traffic distribution. We propose a load balancing strategy which takes the congestion status of both the local node and relay nodes into account. The congestion status, together with the end-to-end delay, is used in the routing selection. A lookup-table enhancement is also proposed. The off-line computation and the on-line adjustment are combined together to make a more precise estimate of the end-to-end delay while at the same time reducing the onboard computation. Simulation results show that the proposed strategy helps to distribute network traffic more evenly and therefore reduces the packet drop ratio. In addition, the average delay is also decreased in most cases. The lookup-table enhancement provides a compromise between the need for better communication performance and the desire for less onboard computation.
NASA Technical Reports Server (NTRS)
Bollman, W. E.; Chadwick, C.
1982-01-01
A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of random Delta v standard deviations using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of nonzero deterministic Delta v's. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCMs consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
DOT National Transportation Integrated Search
2009-09-01
The opening of a major traffic generator in the San Antonio area provided an opportunity to develop and : implement an extensive traffic monitoring system to analyze local, area, and regional traffic impacts from the : generator. Researchers reviewed...
Symbols and warrants for major traffic generator guide signing.
DOT National Transportation Integrated Search
2009-09-01
The Texas Manual on Uniform Traffic Control Devices (TMUTCD) provides the definition of regular traffic generators based on four population types but not for major traffic generators (MTGs). MTG signs have been considered to supplement the overall si...
NASA Astrophysics Data System (ADS)
Gao, Yi
The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
NASA Astrophysics Data System (ADS)
Hanada, Masaki; Nakazato, Hidenori; Watanabe, Hitoshi
Multimedia applications such as music or video streaming, video teleconferencing and IP telephony are flourishing in packet-switched networks. Applications that generate such real-time data can have very diverse quality-of-service (QoS) requirements. In order to guarantee diverse QoS requirements, the combined use of a packet scheduling algorithm based on Generalized Processor Sharing (GPS) and a leaky bucket traffic regulator is the most successful QoS mechanism. GPS can provide a minimum guaranteed service rate for each session and tight delay bounds for leaky bucket constrained sessions. However, the delay bounds for leaky bucket constrained sessions under GPS are unnecessarily large because each session is served according to its associated constant weight until the session buffer is empty. In order to solve this problem, a scheduling policy called Output Rate-Controlled Generalized Processor Sharing (ORC-GPS) was proposed in [17]. ORC-GPS is a rate-based scheduling policy like GPS, and controls the service rate in order to lower the delay bounds for leaky bucket constrained sessions. In this paper, we propose a call admission control (CAC) algorithm for ORC-GPS for leaky-bucket constrained sessions with deterministic delay requirements. This CAC algorithm for ORC-GPS determines the optimal values of the parameters of ORC-GPS from the deterministic delay requirements of the sessions. In numerical experiments, we compare the CAC algorithm for ORC-GPS with one for GPS in terms of schedulable region and computational complexity.
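For context, the baseline single-node GPS delay bound for a leaky-bucket constrained session (the quantity a CAC procedure checks against each session's delay requirement) can be sketched as follows. The admission rule shown is a generic illustration under assumed weights and rates, not the ORC-GPS algorithm of the paper.

```python
def gps_delay_bound(sigma_i, g_i):
    """Single-node GPS delay bound for a session constrained by a (sigma_i, rho_i) leaky
    bucket and guaranteed a service rate g_i >= rho_i: D_i <= sigma_i / g_i."""
    return sigma_i / g_i

def admit_sessions(sessions, link_rate):
    """Generic CAC sketch: guarantee each session a rate proportional to its weight and
    admit the set only if every session's delay bound meets its requirement.

    sessions: list of dicts with keys 'phi' (weight), 'sigma' (burst size), 'rho' (token
              rate) and 'd_req' (deterministic delay requirement)."""
    total_phi = sum(s['phi'] for s in sessions)
    for s in sessions:
        g = s['phi'] / total_phi * link_rate
        if s['rho'] > g or gps_delay_bound(s['sigma'], g) > s['d_req']:
            return False
    return True

# Example: two sessions sharing a 10 Mbit/s link (units: bits, bits/s, seconds)
sessions = [
    {'phi': 1.0, 'sigma': 50_000, 'rho': 2e6, 'd_req': 0.020},
    {'phi': 1.0, 'sigma': 20_000, 'rho': 3e6, 'd_req': 0.010},
]
print(admit_sessions(sessions, link_rate=10e6))
```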
Effects of heterogeneous traffic with speed limit zone on the car accidents
NASA Astrophysics Data System (ADS)
Marzoug, R.; Lakouari, N.; Bentaleb, K.; Ez-Zahraouy, H.; Benyoussef, A.
2016-06-01
Using the extended Nagel-Schreckenberg (NS) model, we numerically study the impact of the heterogeneity of traffic with a speed limit zone (SLZ) on the probability of occurrence of car accidents (Pac). An SLZ in heterogeneous traffic has an important effect, particularly in the mixed-velocity case. In the deterministic case, the SLZ leads to the appearance of car accidents even at low densities; in this region Pac increases with an increasing fraction of fast vehicles (Ff). In the nondeterministic case, the SLZ decreases the effect of the braking probability Pb at low densities. Furthermore, the impact of multiple SLZs on the probability Pac is also studied. In contrast with the homogeneous case [X. Li, H. Kuang, Y. Fan and G. Zhang, Int. J. Mod. Phys. C 25 (2014) 1450036], it is found that at low densities the probability Pac without an SLZ (n = 0) is lower than Pac with multiple SLZs (n > 0). However, the existence of multiple SLZs on the road decreases the risk of collision in the congested phase.
A fiber optic tactical voice/data network based on FDDI
NASA Technical Reports Server (NTRS)
Bergman, L. A.; Hartmayer, R.; Marelid, S.; Wu, W. H.; Edgar, G.; Cassell, P.; Mancini, R.; Kiernicki, J.; Paul, L. J.; Jeng, J.
1988-01-01
An asynchronous high-speed fiber optic local area network is described that supports ordinary data packet traffic simultaneously with synchronous T1 voice traffic over a common FDDI token ring channel. A voice interface module was developed that parses, buffers, and resynchronizes the voice data to the packet network. The technique is general, however, and can be applied to any deterministic class of networks, including multi-tier backbones. A conventional single token access protocol was employed at the lowest layer, with fixed packet sizes for voice and variable sizes for data. In addition, the higher layer packet data protocols are allowed to operate independently of those for the voice, thereby permitting great flexibility in reconfiguring the network. Voice call setup and switching functions were performed external to the network with PABX equipment.
On Transform Domain Communication Systems under Spectrum Sensing Mismatch: A Deterministic Analysis.
Jin, Chuanxue; Hu, Su; Huang, Yixuan; Luo, Qu; Huang, Dan; Li, Yi; Gao, Yuan; Cheng, Shaochi
2017-07-08
Towards the era of the mobile Internet and the Internet of Things (IoT), numerous sensors and devices are being introduced and interconnected. To support such an amount of data traffic, traditional wireless communication technologies are facing challenges both in terms of the increasing shortage of spectrum resources and massive multiple access. The transform-domain communication system (TDCS) is considered as an alternative multiple access system, with a main focus on 5G and mobile IoT. However, previous studies of TDCS are under the assumption that the transceiver has global spectrum information, without consideration of spectrum sensing mismatch (SSM). In this paper, we present a deterministic analysis of TDCS systems under arbitrarily given spectrum sensing scenarios, especially the influence of the SSM pattern on the signal to noise ratio (SNR) performance. Simulation results show that an arbitrary SSM pattern can lead to inferior bit error rate (BER) performance.
Memory effects in microscopic traffic models and wide scattering in flow-density data
NASA Astrophysics Data System (ADS)
Treiber, Martin; Helbing, Dirk
2003-10-01
By means of microscopic simulations we show that noninstantaneous adaptation of the driving behavior to the traffic situation together with the conventional method to measure flow-density data provides a possible explanation for the observed inverse-λ shape and the wide scattering of flow-density data in “synchronized” congested traffic. We model a memory effect in the response of drivers to the traffic situation for a wide class of car-following models by introducing an additional dynamical variable (the “subjective level of service”) describing the adaptation of drivers to the surrounding traffic situation during the past few minutes and couple this internal state to parameters of the underlying model that are related to the driving style. For illustration, we use the intelligent-driver model (IDM) as the underlying model, characterize the level of service solely by the velocity, and couple the internal variable to the IDM parameter “time gap” to model an increase of the time gap in congested traffic (“frustration effect”), which is supported by single-vehicle data. We simulate open systems with a bottleneck and obtain flow-density data by implementing “virtual detectors.” The shape, relative size, and apparent “stochasticity” of the region of the scattered data points agree nearly quantitatively with empirical data. Wide scattering is even observed for identical vehicles, although the proposed model is a time-continuous, deterministic, single-lane car-following model with a unique fundamental diagram.
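The adaptation mechanism described here can be sketched as an internal variable that relaxes toward the current level of service and modulates the IDM time gap between a free-flow and a congested value. The functional forms and numbers below are illustrative assumptions in the spirit of the paper, not its exact specification.

```python
def update_subjective_los(lam, v, v0=33.0, dt=0.5, tau=300.0):
    """Relax the internal 'subjective level of service' lam toward the instantaneous
    level of service v/v0 on a time scale tau of a few minutes."""
    target = min(v / v0, 1.0)
    return lam + (target - lam) * dt / tau

def effective_time_gap(lam, T_free=1.1, T_jam=1.8):
    """'Frustration effect': interpolate the time-gap parameter between its congested
    and free-flow values according to the adaptation variable lam in [0, 1]."""
    return T_jam + lam * (T_free - T_jam)

# Example: a driver stuck at 5 m/s gradually adopts the larger congested time gap
lam = 1.0
for _ in range(1200):                  # 10 minutes at dt = 0.5 s
    lam = update_subjective_los(lam, v=5.0)
print(f"lambda = {lam:.2f}, effective T = {effective_time_gap(lam):.2f} s")
```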
NASA Astrophysics Data System (ADS)
Itoh, Kosuke; Nakada, Tsutomu
2013-04-01
Deterministic nonlinear dynamical processes are ubiquitous in nature. Chaotic sounds generated by such processes may appear irregular and random in waveform, but these sounds are mathematically distinguished from random stochastic sounds in that they contain deterministic short-time predictability in their temporal fine structures. We show that the human brain distinguishes deterministic chaotic sounds from spectrally matched stochastic sounds in neural processing and perception. Deterministic chaotic sounds, even without being attended to, elicited greater cerebral cortical responses than the surrogate control sounds after about 150 ms in latency after sound onset. Listeners also clearly discriminated these sounds in perception. The results support the hypothesis that the human auditory system is sensitive to the subtle short-time predictability embedded in the temporal fine structure of sounds.
Effects of traffic generation patterns on the robustness of complex networks
NASA Astrophysics Data System (ADS)
Wu, Jiajing; Zeng, Junwen; Chen, Zhenhao; Tse, Chi K.; Chen, Bokui
2018-02-01
Cascading failures in communication networks with heterogeneous node functions are studied in this paper. In such networks, the traffic dynamics are highly dependent on the traffic generation patterns which are in turn determined by the locations of the hosts. The data-packet traffic model is applied to Barabási-Albert scale-free networks to study the cascading failures in such networks and to explore the effects of traffic generation patterns on network robustness. It is found that placing the hosts at high-degree nodes in a network can make the network more robust against both intentional attacks and random failures. It is also shown that the traffic generation pattern plays an important role in network design.
MMPP Traffic Generator for the Testing of the SCAR 2 Fast Packet Switch
NASA Technical Reports Server (NTRS)
Chren, William A., Jr.
1995-01-01
A prototype MMPP Traffic Generator (TG) has been designed for testing of the COMSAT-supplied SCAR II Fast Packet Switch. By generating packets distributed according to a Markov-Modulated Poisson Process (MMPP) model, it allows the assessment of switch performance under traffic conditions that are more realistic than could be generated using the COMSAT-supplied Traffic Generator Module. The MMPP model is widely believed to model accurately real-world superimposed voice and data communications traffic. The TG was designed to be as much as possible a "drop-in" replacement for the COMSAT Traffic Generator Module. The latter fit on two Altera EPM7256EGC 192-pin CPLDs and produced traffic for one switch input port. No board changes are necessary because the TG has been partitioned to use the existing board traces. The TG, consisting of parts "TGDATPROC" and "TGRAMCTL", must merely be reprogrammed into the Altera devices of the same name. However, the 040 controller software must be modified to provide TG initialization data. This data will be given in Section II.
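A software sketch of a two-state MMPP packet arrival generator of the kind described follows; the rates and generator matrix are illustrative, and the actual TG is of course a hardware implementation rather than this Python model.

```python
import numpy as np

def mmpp_arrivals(rates, Q, t_end, seed=0):
    """Markov-Modulated Poisson Process: Poisson arrivals whose rate is selected by a
    continuous-time Markov chain with generator matrix Q (rows sum to zero)."""
    rng = np.random.default_rng(seed)
    t, state, arrivals = 0.0, 0, []
    while t < t_end:
        t_leave = t + rng.exponential(1.0 / -Q[state, state])   # sojourn in current state
        t_next = min(t_leave, t_end)
        t_arr = t + rng.exponential(1.0 / rates[state])
        while t_arr < t_next:                                   # arrivals during the sojourn
            arrivals.append(t_arr)
            t_arr += rng.exponential(1.0 / rates[state])
        p = np.array(Q[state], dtype=float)
        p[state] = 0.0
        state = rng.choice(len(rates), p=p / p.sum())           # jump to the next state
        t = t_next
    return np.array(arrivals)

# Example: a bursty source, 5 packets/s in the quiet state and 50 packets/s in bursts
Q = np.array([[-0.1, 0.1],
              [0.5, -0.5]])
times = mmpp_arrivals(rates=[5.0, 50.0], Q=Q, t_end=600.0)
print(f"{len(times)} packets in 10 minutes")
```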
DOT National Transportation Integrated Search
2009-01-01
Can a self-calibrating signal control system lead to wider adoption of adaptive traffic control systems? The focus of Next Generation of Smart Traffic Signals, an Exploratory Advanced Research (EAR) Program project, is a system that-with lit...
DOT National Transportation Integrated Search
2009-09-01
The purpose of this guide is to aid the Texas Department of Transportation (TxDOT), Metropolitan Planning Organizations (MPO), and other state and local agencies to develop an effective traffic monitoring system for new major traffic generators in th...
The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...
Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems
NASA Technical Reports Server (NTRS)
Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael
2013-01-01
The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factors studies and simulation-based techniques will fall short in the face of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies authority problems that arise when conflicting advice is received from human and automated systems.
Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.
Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar
2016-01-01
We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Thiemann, Christian; Treiber, Martin; Kesting, Arne
2008-09-01
Intervehicle communication enables vehicles to exchange messages within a limited broadcast range and thus self-organize into dynamical and geographically embedded wireless ad hoc networks. We study the longitudinal hopping mode, in which messages are transported using equipped vehicles driving in the same direction as relays. Given a finite communication range, we investigate the conditions under which messages can percolate through the network, i.e., a linked chain of relay vehicles exists between the sender and receiver. We simulate message propagation in different traffic scenarios and for different fractions of equipped vehicles. Simulations are done with both modeled and empirical traffic data. These results are used to test the limits of applicability of an analytical model assuming a Poissonian distance distribution between the relays. We find good agreement for homogeneous traffic scenarios and sufficiently low percentages of equipped vehicles. For higher percentages, the observed connectivity is higher than that of the model, while in stop-and-go traffic situations it is lower. We explain these results in terms of correlations of the distances between the relay vehicles. Finally, we introduce variable transmission ranges and find that this additional stochastic component generally increases connectivity compared to a deterministic transmission with the same mean.
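A minimal Monte Carlo sketch of the percolation question posed above, assuming (as in the analytical model mentioned) that relay positions follow a homogeneous Poisson process along the road; all numerical parameters are placeholders.

```python
import numpy as np

def connectivity_probability(road_len=5000.0, density=0.002, comm_range=250.0,
                             trials=5000, seed=2):
    """Monte Carlo estimate of message percolation between a sender at x=0 and a
    receiver at x=road_len via equipped relay vehicles whose positions follow a
    homogeneous Poisson process (illustrative parameters, not from the paper)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        n = rng.poisson(density * road_len)               # number of equipped vehicles
        relays = np.sort(rng.uniform(0.0, road_len, n))   # their positions along the road
        chain = np.concatenate(([0.0], relays, [road_len]))
        if np.all(np.diff(chain) <= comm_range):          # every hop within range?
            hits += 1
    return hits / trials

print("P(connected) ~", connectivity_probability())
```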
DOT National Transportation Integrated Search
2012-06-01
This project evaluates the physical and economic feasibility of using existing traffic infrastructure to mount wind power generators. Some possible places to mount a lightweight wind generator and solar panel hybrid system are: i) Traffic signal...
Stochastic Multi-Timescale Power System Operations With Variable Wind Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Hongyu; Krad, Ibrahim; Florita, Anthony
This paper describes a novel set of stochastic unit commitment and economic dispatch models that consider stochastic loads and variable generation at multiple operational timescales. The stochastic model includes four distinct stages: stochastic day-ahead security-constrained unit commitment (SCUC), stochastic real-time SCUC, stochastic real-time security-constrained economic dispatch (SCED), and deterministic automatic generation control (AGC). These sub-models are integrated together such that they are continually updated with decisions passed from one to another. The progressive hedging algorithm (PHA) is applied to solve the stochastic models to maintain the computational tractability of the proposed models. Comparative case studies with deterministic approaches are conducted in low wind and high wind penetration scenarios to highlight the advantages of the proposed methodology, one with perfect forecasts and the other with current state-of-the-art but imperfect deterministic forecasts. The effectiveness of the proposed method is evaluated with sensitivity tests using both economic and reliability metrics to provide a broader view of its impact.
Nonclassical point of view of the Brownian motion generation via fractional deterministic model
NASA Astrophysics Data System (ADS)
Gilardi-Velázquez, H. E.; Campos-Cantón, E.
In this paper, we present a dynamical system based on the Langevin equation without a stochastic term and using fractional derivatives that exhibits properties of Brownian motion; i.e., a deterministic model to generate Brownian motion is proposed. The stochastic process is replaced by considering an additional degree of freedom in the second-order Langevin equation. Thus, it is transformed into a system of three first-order linear differential equations; additionally, α-fractional derivatives are considered, which allow us to obtain better statistical properties. Switching surfaces are established as a part of the fluctuating acceleration. The final system of three α-order linear differential equations does not contain a stochastic term, so the system generates motion in a deterministic way. Nevertheless, from the time-series analysis, we find that the behavior of the system exhibits statistical properties of Brownian motion, such as a linear growth in time of the mean square displacement and a Gaussian distribution. Furthermore, we use detrended fluctuation analysis to confirm the Brownian character of this motion.
Traffic Generator (TrafficGen) Version 1.4.2: Users Guide
2016-06-01
events, the user has to enter them manually. We will research and implement a way to better define and organize the multicast addresses so they can be... the network with Transmission Control Protocol and User Datagram Protocol Internet Protocol traffic. Each node generating network traffic in an... [Recovered table-of-contents fragments: TrafficGen Graphical User Interface (GUI); Anatomy of the User Interface; Scenario Configuration and MGEN Files; Working with...]
Schilde, M; Doerner, K F; Hartl, R F
2014-10-01
In urban areas, logistic transportation operations often run into problems because travel speeds change, depending on the current traffic situation. If not accounted for, time-dependent and stochastic travel speeds frequently lead to missed time windows and thus poorer service. Especially in the case of passenger transportation, it often leads to excessive passenger ride times as well. Therefore, time-dependent and stochastic influences on travel speeds are relevant for finding feasible and reliable solutions. This study considers the effect of exploiting statistical information available about historical accidents, using stochastic solution approaches for the dynamic dial-a-ride problem (dynamic DARP). The authors propose two pairs of metaheuristic solution approaches, each consisting of a deterministic method (average time-dependent travel speeds for planning) and its corresponding stochastic version (exploiting stochastic information while planning). The results, using test instances with up to 762 requests based on a real-world road network, show that in certain conditions, exploiting stochastic information about travel speeds leads to significant improvements over deterministic approaches.
Small-angle scattering from 3D Sierpinski tetrahedron generated using chaos game
NASA Astrophysics Data System (ADS)
Slyamov, Azat
2017-12-01
We approximate a three dimensional version of deterministic Sierpinski gasket (SG), also known as Sierpinski tetrahedron (ST), by using the chaos game representation (CGR). Structural properties of the fractal, generated by both deterministic and CGR algorithms are determined using small-angle scattering (SAS) technique. We calculate the corresponding monodisperse structure factor of ST, using an optimized Debye formula. We show that scattering from CGR of ST recovers basic fractal properties, such as fractal dimension, iteration number, scaling factor, overall size of the system and the number of units composing the fractal.
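The chaos game representation used above can be sketched in a few lines: starting from an arbitrary point, repeatedly jump halfway toward a randomly chosen vertex of a regular tetrahedron (scaling factor 1/2, as in the deterministic construction). The vertex coordinates and point count below are illustrative choices.

```python
import numpy as np

def chaos_game_tetrahedron(n_points=100000, seed=0):
    """Approximate a Sierpinski tetrahedron by the chaos game: jump halfway
    toward a randomly chosen vertex of a regular tetrahedron at every step."""
    rng = np.random.default_rng(seed)
    vertices = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)
    points = np.empty((n_points, 3))
    p = np.zeros(3)
    for i in range(n_points):
        v = vertices[rng.integers(4)]
        p = 0.5 * (p + v)          # midpoint step of the chaos game
        points[i] = p
    return points

pts = chaos_game_tetrahedron()
print(pts.shape)
```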
The Traffic Management Advisor
NASA Technical Reports Server (NTRS)
Nedell, William; Erzberger, Heinz; Neuman, Frank
1990-01-01
The traffic management advisor (TMA) is comprised of algorithms, a graphical interface, and interactive tools for controlling the flow of air traffic into the terminal area. The primary algorithm incorporated in it is a real-time scheduler which generates efficient landing sequences and landing times for arrivals within about 200 n.m. from touchdown. A unique feature of the TMA is its graphical interface that allows the traffic manager to modify the computer-generated schedules for specific aircraft while allowing the automatic scheduler to continue generating schedules for all other aircraft. The graphical interface also provides convenient methods for monitoring the traffic flow and changing scheduling parameters during real-time operation.
Optimal Control of Hybrid Systems in Air Traffic Applications
NASA Astrophysics Data System (ADS)
Kamgarpour, Maryam
Growing concerns over the scalability of air traffic operations, air transportation fuel emissions and prices, as well as the advent of communication and sensing technologies motivate improvements to the air traffic management system. To address such improvements, in this thesis a hybrid dynamical model as an abstraction of the air traffic system is considered. Wind and hazardous weather impacts are included using a stochastic model. This thesis focuses on the design of algorithms for verification and control of hybrid and stochastic dynamical systems and the application of these algorithms to air traffic management problems. In the deterministic setting, a numerically efficient algorithm for optimal control of hybrid systems is proposed based on extensions of classical optimal control techniques. This algorithm is applied to optimize the trajectory of an Airbus 320 aircraft in the presence of wind and storms. In the stochastic setting, the verification problem of reaching a target set while avoiding obstacles (reach-avoid) is formulated as a two-player game to account for external agents' influence on system dynamics. The solution approach is applied to air traffic conflict prediction in the presence of stochastic wind. Due to the uncertainty in forecasts of the hazardous weather, and hence the unsafe regions of airspace for aircraft flight, the reach-avoid framework is extended to account for stochastic target and safe sets. This methodology is used to maximize the probability of the safety of aircraft paths through hazardous weather. Finally, the problem of modeling and optimization of arrival air traffic and runway configuration in dense airspace subject to stochastic weather data is addressed. This problem is formulated as a hybrid optimal control problem and is solved with a hierarchical approach that decouples safety and performance. As illustrated with this problem, the large scale of air traffic operations motivates future work on the efficient implementation of the proposed algorithms.
Road traffic air and noise pollution exposure assessment - A review of tools and techniques.
Khan, Jibran; Ketzel, Matthias; Kakosimos, Konstantinos; Sørensen, Mette; Jensen, Steen Solvang
2018-09-01
Road traffic induces air and noise pollution in urban environments, with negative impacts on human health. Thus, estimating exposure to road traffic air and noise pollution (hereafter, air and noise pollution) is important in order to improve the understanding of human health outcomes in epidemiological studies. The aims of this review are (i) to summarize current practices of modelling and exposure assessment techniques for road traffic air and noise pollution and (ii) to highlight the potential of existing tools and techniques for combined air and noise exposure assessment, together with associated challenges, research gaps and priorities. The study reviews literature about air and noise pollution from urban road traffic, including other relevant characteristics such as the employed dispersion models, Geographic Information System (GIS)-based tools, spatial scale of exposure assessment, study location, sample size, type of traffic data and building geometry information. Deterministic modelling is the most frequently used assessment technique for both air and noise pollution, for short-term and long-term exposure. We observed a larger variety among air pollution models compared to the applied noise models. Correlations between air and noise pollution vary significantly (0.05-0.74) and are affected by several parameters such as traffic attributes, building attributes and meteorology. Buildings act as screens for the dispersion of pollution, but the reduction effect is much larger for noise than for air pollution. Meteorology has a greater influence on air pollution levels than on noise, although it is also important for noise pollution. There is significant potential for developing a standard tool to assess combined exposure to traffic-related air and noise pollution to facilitate health-related studies. GIS, due to its geographic nature, is well established and has a significant capability to simultaneously address both exposures. Copyright © 2018 Elsevier B.V. All rights reserved.
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Self-Organization in 2D Traffic Flow Model with Jam-Avoiding Drive
NASA Astrophysics Data System (ADS)
Nagatani, Takashi
1995-04-01
A stochastic cellular automaton (CA) model is presented to investigate the traffic jam formed by self-organization in two-dimensional (2D) traffic flow. The CA model is an extended version of the 2D asymmetric exclusion model that takes jam-avoiding drive into account. Each site contains either a car moving up, a car moving to the right, or is empty. An up-moving car can shift right with probability p_ja if it is blocked ahead by other cars. It is shown that three phases (the low-density phase, the intermediate-density phase and the high-density phase) appear in the traffic flow. The intermediate-density phase is characterized by the rightward motion of up-moving cars. The jamming transition to the high-density jamming phase occurs at a higher density of cars than without jam-avoiding drive. The jamming transition point p_2c increases with the shifting probability p_ja. In the deterministic limit of p_ja=1, it is found that a new jamming transition occurs from the low-density synchronized-shifting phase to the high-density moving phase with increasing density of cars. In the synchronized-shifting phase, up-moving cars do not move up but shift to the right by synchronizing with the motion of right-moving cars. We show that the jam-avoiding drive has an important effect on the dynamical jamming transition.
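A loose toy implementation of the jam-avoiding rule described above (a BML-type lattice with right- and up-moving cars, where a blocked up-moving car shifts right with probability p_ja); the exact update ordering of the original model is not reproduced, and all parameters are illustrative.

```python
import numpy as np

def step(grid, p_ja, rng):
    """One alternating update: 0 = empty, 1 = right-moving car, 2 = up-moving car,
    periodic boundaries. Toy version of the jam-avoiding rule, not the paper's
    exact scheme."""
    L = grid.shape[0]
    # right-moving cars hop right if the target cell is empty
    new = grid.copy()
    for i in range(L):
        for j in range(L):
            if grid[i, j] == 1 and grid[i, (j + 1) % L] == 0:
                new[i, j], new[i, (j + 1) % L] = 0, 1
    grid = new
    # up-moving cars hop up; if blocked, shift right with probability p_ja
    new = grid.copy()
    for i in range(L):
        for j in range(L):
            if grid[i, j] != 2:
                continue
            up, right = ((i - 1) % L, j), (i, (j + 1) % L)
            if grid[up] == 0 and new[up] == 0:
                new[i, j], new[up] = 0, 2
            elif grid[right] == 0 and new[right] == 0 and rng.random() < p_ja:
                new[i, j], new[right] = 0, 2          # jam-avoiding right shift
    return new

rng = np.random.default_rng(3)
L, density, p_ja = 30, 0.3, 0.5
grid = np.zeros((L, L), dtype=int)
cells = rng.choice(L * L, int(density * L * L), replace=False)
grid.flat[cells] = rng.integers(1, 3, cells.size)      # random mix of right/up cars
for _ in range(100):
    grid = step(grid, p_ja, rng)
print("cars remaining:", np.count_nonzero(grid))
```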
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecast, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
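For concreteness, the sketch below implements only the normal-linear special case of the BPF mentioned above: the climatology of past observations serves as the Gaussian prior, a linear regression of historical forecasts on observations supplies the likelihood, and the resulting posterior is Gaussian. The meta-Gaussian version additionally applies a normal quantile transform to both variables first; the synthetic data and parameter names are illustrative assumptions.

```python
import numpy as np

def normal_linear_bpf(x_forecast, obs_hist, fc_hist):
    """Turn a single deterministic forecast into a Gaussian predictive distribution
    (normal-linear BPF sketch): prior = climatology of observations, likelihood =
    linear regression of historical forecasts on observations."""
    obs_hist, fc_hist = np.asarray(obs_hist, float), np.asarray(fc_hist, float)
    M, S2 = obs_hist.mean(), obs_hist.var(ddof=1)           # prior N(M, S2)
    b, a = np.polyfit(obs_hist, fc_hist, 1)                 # forecast ~ a + b*obs + eps
    resid = fc_hist - (a + b * obs_hist)
    sigma2 = resid.var(ddof=2)                              # likelihood variance
    prec = 1.0 / S2 + b * b / sigma2                        # posterior precision
    mean = (M / S2 + b * (x_forecast - a) / sigma2) / prec
    return mean, np.sqrt(1.0 / prec)                        # posterior mean, std

# toy usage with synthetic "historical" data
rng = np.random.default_rng(0)
obs = rng.normal(2.0, 5.0, 60)                  # e.g. January temperatures
fc = 0.8 * obs + 1.0 + rng.normal(0, 2.0, 60)   # imperfect deterministic forecasts
print(normal_linear_bpf(x_forecast=4.0, obs_hist=obs, fc_hist=fc))
```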
Network Traffic Generator for Low-rate Small Network Equipment Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lanzisera, Steven
2013-05-28
Application that uses the Python low-level socket interface to pass network traffic between devices on the local side of a NAT router and the WAN side of the NAT router. This application is designed to generate traffic that complies with the Energy Star Small Network Equipment Test Method.
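A minimal sketch in the same spirit (a UDP source paced to an approximate bit rate over the low-level socket interface); the addresses, rates and function name are placeholders, and this is not the Energy Star tool itself.

```python
import socket
import time

def send_udp_stream(dest_ip, dest_port, rate_bps=100_000, payload=1000, duration=10.0):
    """Minimal UDP traffic source: push fixed-size datagrams toward dest at an
    approximate target bit rate. The real tool also paces and receives traffic
    on both the LAN and WAN sides of the router under test."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = payload * 8 / rate_bps           # seconds between datagrams
    data = b"\x00" * payload
    t_end = time.time() + duration
    sent = 0
    while time.time() < t_end:
        sock.sendto(data, (dest_ip, dest_port))
        sent += 1
        time.sleep(interval)                    # crude pacing; real tools compensate for drift
    sock.close()
    return sent

# e.g. send_udp_stream("192.168.1.100", 5001, rate_bps=1_000_000)
```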
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology
Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.
2016-01-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915
Will higher traffic flow lead to more traffic conflicts? A crash surrogate metric based analysis
Kuang, Yan; Yan, Yadan
2017-01-01
In this paper, we aim to examine the relationship between traffic flow and potential conflict risks by using crash surrogate metrics. It has been widely recognized that one traffic flow corresponds to two distinct traffic states with different speeds and densities. In view of this, instead of simply aggregating traffic conditions with the same traffic volume, we represent potential conflict risks at a traffic flow fundamental diagram. Two crash surrogate metrics, namely, Aggregated Crash Index and Time to Collision, are used in this study to represent the potential conflict risks with respect to different traffic conditions. Furthermore, Beijing North Ring III and Next Generation SIMulation Interstate 80 datasets are utilized to carry out case studies. By using the proposed procedure, both datasets generate similar trends, which demonstrate the applicability of the proposed methodology and the transferability of our conclusions. PMID:28787022
Will higher traffic flow lead to more traffic conflicts? A crash surrogate metric based analysis.
Kuang, Yan; Qu, Xiaobo; Yan, Yadan
2017-01-01
In this paper, we aim to examine the relationship between traffic flow and potential conflict risks by using crash surrogate metrics. It has been widely recognized that one traffic flow corresponds to two distinct traffic states with different speeds and densities. In view of this, instead of simply aggregating traffic conditions with the same traffic volume, we represent potential conflict risks at a traffic flow fundamental diagram. Two crash surrogate metrics, namely, Aggregated Crash Index and Time to Collision, are used in this study to represent the potential conflict risks with respect to different traffic conditions. Furthermore, Beijing North Ring III and Next Generation SIMulation Interstate 80 datasets are utilized to carry out case studies. By using the proposed procedure, both datasets generate similar trends, which demonstrate the applicability of the proposed methodology and the transferability of our conclusions.
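For reference, the Time to Collision surrogate named above can be computed from its textbook definition, as in the sketch below; this is a generic illustration, not the authors' implementation.

```python
def time_to_collision(gap_m, v_follow_ms, v_lead_ms):
    """Time to Collision for a car-following pair: time until the vehicles would
    collide if both kept their current speeds. Defined only when the follower is
    faster than the leader; returns None (no conflict) otherwise."""
    closing_speed = v_follow_ms - v_lead_ms
    if closing_speed <= 0:
        return None
    return gap_m / closing_speed

print(time_to_collision(gap_m=20.0, v_follow_ms=25.0, v_lead_ms=20.0))  # 4.0 s
```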
Bastián-Monarca, Nicolás A; Suárez, Enrique; Arenas, Jorge P
2016-04-15
In many countries such as Chile, there is scarce official information for generating accurate noise maps. Therefore, specific simplification methods are becoming a real need for the acoustic community in developing countries. Thus, the main purpose of this work was to evaluate and apply simplified methods to generate a cost-effective traffic noise map of a small city in Chile. The experimental design involved the simplification of the cartographic information on buildings by clustering the households within a block, and the classification of vehicular traffic flows into categories to generate an inexpensive noise map. The streets were classified according to the official road classification of the country. Vehicles were segregated into light vehicles, heavy vehicles and motorbikes to account for traffic flow. In addition, a number of road traffic noise models were compared with noise measurements, and consequently the road traffic model RLS-90 was chosen to generate the noise map of the city using the Computer Aided Noise Abatement (CadnaA) software. A direct dependence between noise levels and traffic flow was observed for each category of street used. The methodology developed in this study appears to be convenient for developing countries to obtain accurate approximations and develop inexpensive traffic noise maps. Copyright © 2016 Elsevier B.V. All rights reserved.
Cellular automata models for diffusion of information and highway traffic flow
NASA Astrophysics Data System (ADS)
Fuks, Henryk
In the first part of this work we study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parameterized by the speed limit m and another parameter k that represents the degree of 'anticipatory driving'. We compare two driving strategies with identical maximum throughput: 'conservative' driving with a high speed limit and 'anticipatory' driving with a low speed limit. These two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of the maximum achievable throughput. Possible modifications of the model are considered. For rule 184, we present exact calculations of the order parameter in the transition from the moving phase to the jammed phase using the method of preimage counting, and use this result to construct a solution to the density classification problem. In the second part we propose a probabilistic cellular automaton model for the spread of innovations, rumors, news, etc., in a social system. We start from simple deterministic models, for which exact expressions for the density of adopters are derived. For a more realistic model, based on probabilistic cellular automata, we study the influence of the range of interaction R on the shape of the adoption curve. When the probability of adoption is proportional to the local density of adopters, and individuals can drop the innovation with some probability p, the system exhibits a second-order phase transition. The critical line separating the region of parameter space in which the asymptotic density of adopters is positive from the region where it is equal to zero converges toward the mean-field line as the range of the interaction increases. In the region between the R=1 critical line and the mean-field line, the asymptotic density of adopters depends on R, becoming zero if R is too small (smaller than some critical value). This result demonstrates the importance of connectivity in the diffusion of information. We also define a new class of automata networks which incorporates non-local interactions, and discuss its applicability in modeling the diffusion of innovations.
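Rule 184, the starting point of the family described above, is easy to reproduce; the sketch below runs it on a ring and measures the fraction of moving cars (the density and step counts are arbitrary choices).

```python
import numpy as np

def rule184_step(cells):
    """One synchronous update of CA rule 184 on a ring: a car (1) moves one cell
    to the right if and only if that cell is empty (0)."""
    right = np.roll(cells, -1)
    left = np.roll(cells, 1)
    move_out = (cells == 1) & (right == 0)       # this car advances
    move_in = (left == 1) & (cells == 0)         # a car arrives from the left
    return np.where(move_out, 0, np.where(move_in, 1, cells))

rng = np.random.default_rng(4)
road = (rng.random(100) < 0.6).astype(int)       # density above 1/2: jammed phase
for _ in range(200):
    road = rule184_step(road)
flow = np.mean((road == 1) & (np.roll(road, -1) == 0))   # fraction of cars that move
print("density", road.mean(), "flow", flow)
```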
Dispersion-relation phase spectroscopy of neuron transport
NASA Astrophysics Data System (ADS)
Wang, Ru; Wang, Zhuo; Millet, Larry; Gillette, Martha; Leigh, Joseph Robert; Sobh, Nahil; Levine, Alex; Popescu, Gabreil
2012-02-01
Molecular motors move materials along prescribed biopolymer tracks. This sort of active transport is required to rapidly move products over large distances within the cell, where passive diffusion is too slow. We examine intracellular traffic patterns using a new application of spatial light interference microscopy (SLIM) and measure the dispersion relation, i.e. decay rate vs. spatial mode, associated with mass transport in live cells. This approach applies equally well to both discrete and continuous mass distributions without the need for particle tracking. From the quadratic experimental curve specific to diffusion, we extracted the diffusion coefficient as the only fitting parameter. The linear portion of the dispersion relation reveals the deterministic component of the intracellular transport. Our data show a universal behavior where the intracellular transport is diffusive at small scales and deterministic at large scales. We further applied this method to studying transport in neurons and are able to use SLIM to map the changes in index of refraction across the neuron and its extended processes. We found that in dendrites and axons, the transport is mostly active, i.e., diffusion is subdominant.
Schilde, M.; Doerner, K.F.; Hartl, R.F.
2014-01-01
In urban areas, logistic transportation operations often run into problems because travel speeds change, depending on the current traffic situation. If not accounted for, time-dependent and stochastic travel speeds frequently lead to missed time windows and thus poorer service. Especially in the case of passenger transportation, it often leads to excessive passenger ride times as well. Therefore, time-dependent and stochastic influences on travel speeds are relevant for finding feasible and reliable solutions. This study considers the effect of exploiting statistical information available about historical accidents, using stochastic solution approaches for the dynamic dial-a-ride problem (dynamic DARP). The authors propose two pairs of metaheuristic solution approaches, each consisting of a deterministic method (average time-dependent travel speeds for planning) and its corresponding stochastic version (exploiting stochastic information while planning). The results, using test instances with up to 762 requests based on a real-world road network, show that in certain conditions, exploiting stochastic information about travel speeds leads to significant improvements over deterministic approaches. PMID:25844013
NASA Astrophysics Data System (ADS)
Guan, Fengjiao; Zhang, Guanjun; Liu, Jie; Wang, Shujing; Luo, Xu; Zhu, Feng
2017-10-01
Accurate material parameters are critical for constructing high-biofidelity finite element (FE) models. However, it is hard to obtain brain tissue parameters accurately because of the effects of irregular geometry and uncertain boundary conditions. Considering the complexity of material testing and the uncertainty of the friction coefficient, a computational inverse method for identifying the viscoelastic material parameters of brain tissue is presented based on the interval analysis method. First, intervals are used to quantify the friction coefficient in the boundary condition. The inverse problem of material parameter identification under an uncertain friction coefficient is then transformed into two types of deterministic inverse problems. Finally, an intelligent optimization algorithm is used to solve the two types of deterministic inverse problems quickly and accurately, and the range of material parameters can be easily acquired without the need for a variety of samples. The efficiency and convergence of this method are demonstrated by the material parameter identification of the thalamus. The proposed method provides a potentially effective tool for building high-biofidelity human finite element models in the study of traffic accident injury.
Efficient Trajectory Options Allocation for the Collaborative Trajectory Options Program
NASA Technical Reports Server (NTRS)
Rodionova, Olga; Arneson, Heather; Sridhar, Banavar; Evans, Antony
2017-01-01
The Collaborative Trajectory Options Program (CTOP) is a Traffic Management Initiative (TMI) intended to control the air traffic flow rates at multiple specified Flow Constrained Areas (FCAs), where demand exceeds capacity. CTOP allows flight operators to submit the desired Trajectory Options Set (TOS) for each affected flight with associated Relative Trajectory Cost (RTC) for each option. CTOP then creates a feasible schedule that complies with capacity constraints by assigning affected flights with routes and departure delays in such a way as to minimize the total cost while maintaining equity across flight operators. The current version of CTOP implements a Ration-by-Schedule (RBS) scheme, which assigns the best available options to flights based on a First-Scheduled-First-Served heuristic. In the present study, an alternative flight scheduling approach is developed based on linear optimization. Results suggest that such an approach can significantly reduce flight delays, in the deterministic case, while maintaining equity as defined using a Max-Min fairness scheme.
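As a toy illustration of scheduling by linear optimization rather than first-scheduled-first-served, the sketch below assigns flights to arrival slots so as to minimize total delay using SciPy's assignment solver; it ignores route options, RTCs and equity, so it is only a caricature of the CTOP problem, with made-up data.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def optimal_slot_assignment(etas, slots):
    """Assign each flight (with estimated time of arrival eta) to one
    capacity-limited slot so that total delay is minimized. Pairings where the
    slot is earlier than the ETA get a prohibitive cost."""
    etas, slots = np.asarray(etas, float), np.asarray(slots, float)
    delay = slots[None, :] - etas[:, None]             # delay of flight i in slot j
    cost = np.where(delay < 0, 1e6, delay)             # forbid landing before the ETA
    rows, cols = linear_sum_assignment(cost)
    return list(zip(rows, cols, cost[rows, cols]))

print(optimal_slot_assignment(etas=[0, 2, 2, 5], slots=[1, 3, 5, 7]))
```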
Detecting Pulsing Denial-of-Service Attacks with Nondeterministic Attack Intervals
NASA Astrophysics Data System (ADS)
Luo, Xiapu; Chan, Edmond W. W.; Chang, Rocky K. C.
2009-12-01
This paper addresses the important problem of detecting pulsing denial of service (PDoS) attacks which send a sequence of attack pulses to reduce TCP throughput. Unlike previous works which focused on a restricted form of attacks, we consider a very broad class of attacks. In particular, our attack model admits any attack interval between two adjacent pulses, whether deterministic or not. It also includes the traditional flooding-based attacks as a limiting case (i.e., zero attack interval). Our main contribution is Vanguard, a new anomaly-based detection scheme for this class of PDoS attacks. The Vanguard detection is based on three traffic anomalies induced by the attacks, and it detects them using a CUSUM algorithm. We have prototyped Vanguard and evaluated it on a testbed. The experiment results show that Vanguard is more effective than the previous methods that are based on other traffic anomalies (after a transformation using wavelet transform, Fourier transform, and autocorrelation) and detection algorithms (e.g., dynamic time warping).
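A generic one-sided CUSUM detector of the kind referred to above might look like the sketch below; the monitored statistic, drift and threshold are placeholders rather than Vanguard's actual design.

```python
def cusum(samples, target_mean, drift=0.5, threshold=5.0):
    """One-sided CUSUM change detector: accumulate how far the monitored statistic
    (e.g. TCP throughput) falls below its expected level and raise an alarm when
    the cumulative deviation exceeds a threshold."""
    s, alarms = 0.0, []
    for i, x in enumerate(samples):
        s = max(0.0, s + (target_mean - x) - drift)   # grows when x drops below target
        if s > threshold:
            alarms.append(i)
            s = 0.0                                   # restart after an alarm
    return alarms

normal = [10, 11, 9, 10, 12, 10]
attack = [3, 2, 4, 3, 2, 3]          # throughput collapses during attack pulses
print(cusum(normal + attack + normal, target_mean=10))
```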
Chen, Cong; Zhang, Guohui; Yang, Jinfu; Milton, John C; Alcántara, Adélamar Dely
2016-05-01
Rear-end crashes are a major type of traffic crashes in the U.S. Of practical necessity is a comprehensive examination of its mechanism that results in injuries and fatalities. Decision table (DT) and Naïve Bayes (NB) methods have both been used widely but separately for solving classification problems in multiple areas except for traffic safety research. Based on a two-year rear-end crash dataset, this paper applies a decision table/Naïve Bayes (DTNB) hybrid classifier to select the deterministic attributes and predict driver injury outcomes in rear-end crashes. The test results show that the hybrid classifier performs reasonably well, which was indicated by several performance evaluation measurements, such as accuracy, F-measure, ROC, and AUC. Fifteen significant attributes were found to be significant in predicting driver injury severities, including weather, lighting conditions, road geometry characteristics, driver behavior information, etc. The extracted decision rules demonstrate that heavy vehicle involvement, a comfortable traffic environment, inferior lighting conditions, two-lane rural roadways, vehicle disabled damage, and two-vehicle crashes would increase the likelihood of drivers sustaining fatal injuries. The research limitations on data size, data structure, and result presentation are also summarized. The applied methodology and estimation results provide insights for developing effective countermeasures to alleviate rear-end crash injury severities and improve traffic system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
A robust approach to chance constrained optimal power flow with renewable generation
Lubin, Miles; Dvorkin, Yury; Backhaus, Scott N.
2016-09-01
Optimal Power Flow (OPF) dispatches controllable generation at minimum cost subject to operational constraints on generation and transmission assets. The uncertainty and variability of intermittent renewable generation is challenging current deterministic OPF approaches. Recent formulations of OPF use chance constraints to limit the risk from renewable generation uncertainty; however, these new approaches typically assume the probability distributions which characterize the uncertainty and variability are known exactly. We formulate a robust chance constrained (RCC) OPF that accounts for uncertainty in the parameters of these probability distributions by allowing them to be within an uncertainty set. The RCC OPF is solved using a cutting-plane algorithm that scales to large power systems. We demonstrate the RCC OPF on a modified model of the Bonneville Power Administration network, which includes 2209 buses and 176 controllable generators. In conclusion, deterministic, chance constrained (CC), and RCC OPF formulations are compared using several metrics including cost of generation, area control error, ramping of controllable generators, and occurrence of transmission line overloads as well as the respective computational performance.
Kanter, Ido; Butkovski, Maria; Peleg, Yitzhak; Zigzag, Meital; Aviad, Yaara; Reidler, Igor; Rosenbluh, Michael; Kinzel, Wolfgang
2010-08-16
Random bit generators (RBGs) constitute an important tool in cryptography, stochastic simulations and secure communications. The latter in particular has some difficult requirements: a high generation rate of unpredictable bit strings and secure key-exchange protocols over public channels. Deterministic algorithms generate pseudo-random number sequences at high rates; however, their unpredictability is limited by the very nature of their deterministic origin. Recently, physical RBGs based on chaotic semiconductor lasers were shown to exceed Gbit/s rates. Whether secure synchronization of two high-rate physical RBGs is possible remains an open question. Here we propose a method whereby two fast RBGs based on mutually coupled chaotic lasers are synchronized. Using information-theoretic analysis, we demonstrate security against a powerful computational eavesdropper, capable of noiseless amplification, where all parameters are publicly known. The method is also extended to secure synchronization of a small network of three RBGs.
Deterministic Generation of All-Photonic Quantum Repeaters from Solid-State Emitters
NASA Astrophysics Data System (ADS)
Buterakos, Donovan; Barnes, Edwin; Economou, Sophia E.
2017-10-01
Quantum repeaters are nodes in a quantum communication network that allow reliable transmission of entanglement over large distances. It was recently shown that highly entangled photons in so-called graph states can be used for all-photonic quantum repeaters, which require substantially fewer resources compared to atomic-memory-based repeaters. However, standard approaches to building multiphoton entangled states through pairwise probabilistic entanglement generation severely limit the size of the state that can be created. Here, we present a protocol for the deterministic generation of large photonic repeater states using quantum emitters such as semiconductor quantum dots and defect centers in solids. We show that arbitrarily large repeater states can be generated using only one emitter coupled to a single qubit, potentially reducing the necessary number of photon sources by many orders of magnitude. Our protocol includes a built-in redundancy, which makes it resilient to photon loss.
Komarov, Ivan; D'Souza, Roshan M
2012-01-01
The Gillespie Stochastic Simulation Algorithm (GSSA) and its variants are cornerstone techniques to simulate reaction kinetics in situations where the concentration of the reactant is too low to allow deterministic techniques such as differential equations. The inherent limitations of the GSSA include the time required for executing a single run and the need for multiple runs for parameter sweep exercises due to the stochastic nature of the simulation. Even very efficient variants of GSSA are prohibitively expensive to compute and perform parameter sweeps. Here we present a novel variant of the exact GSSA that is amenable to acceleration by using graphics processing units (GPUs). We parallelize the execution of a single realization across threads in a warp (fine-grained parallelism). A warp is a collection of threads that are executed synchronously on a single multi-processor. Warps executing in parallel on different multi-processors (coarse-grained parallelism) simultaneously generate multiple trajectories. Novel data-structures and algorithms reduce memory traffic, which is the bottleneck in computing the GSSA. Our benchmarks show an 8×-120× performance gain over various state-of-the-art serial algorithms when simulating different types of models.
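For orientation, the serial direct-method SSA that the paper accelerates can be written in a few lines; the GPU warp-level parallelization and the paper's data structures are not reproduced here, and the toy reaction system is an illustrative assumption.

```python
import numpy as np

def gillespie_direct(x0, stoich, rate_fn, t_end, seed=0):
    """Gillespie direct method: draw an exponential waiting time from the total
    propensity, then pick the next reaction in proportion to its propensity."""
    rng = np.random.default_rng(seed)
    t, x, traj = 0.0, np.array(x0, float), [(0.0, list(x0))]
    while t < t_end:
        a = rate_fn(x)                       # propensities of each reaction
        a0 = a.sum()
        if a0 <= 0:
            break
        t += rng.exponential(1.0 / a0)       # time to next reaction
        j = rng.choice(len(a), p=a / a0)     # which reaction fires
        x = x + stoich[j]
        traj.append((t, list(x)))
    return traj

# toy model: A -> B with rate 0.5*A, B -> A with rate 0.3*B
stoich = np.array([[-1, +1], [+1, -1]])
rates = lambda x: np.array([0.5 * x[0], 0.3 * x[1]])
print(len(gillespie_direct([100, 0], stoich, rates, t_end=10.0)), "events recorded")
```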
Flight Departure Delay and Rerouting Under Uncertainty in En Route Convective Weather
NASA Technical Reports Server (NTRS)
Mukherjee, Avijit; Grabbe, Shon; Sridhar, Banavar
2011-01-01
Delays caused by uncertainty in weather forecasts can be reduced by improving traffic flow management decisions. This paper presents a methodology for traffic flow management under uncertainty in convective weather forecasts. An algorithm for assigning departure delays and reroutes to aircraft is presented. Departure delay and route assignment are executed at multiple stages, during which, updated weather forecasts and flight schedules are used. At each stage, weather forecasts up to a certain look-ahead time are treated as deterministic and flight scheduling is done to mitigate the impact of weather on four-dimensional flight trajectories. Uncertainty in weather forecasts during departure scheduling results in tactical airborne holding of flights. The amount of airborne holding depends on the accuracy of forecasts as well as the look-ahead time included in the departure scheduling. The weather forecast look-ahead time is varied systematically within the experiments performed in this paper to analyze its effect on flight delays. Based on the results, longer look-ahead times cause higher departure delays and additional flying time due to reroutes. However, the amount of airborne holding necessary to prevent weather incursions reduces when the forecast look-ahead times are higher. For the chosen day of traffic and weather, setting the look-ahead time to 90 minutes yields the lowest total delay cost.
Design mechanic generator under speed bumper to support electricity recourse for urban traffic light
NASA Astrophysics Data System (ADS)
Sabri, M.; Lauzuardy, Jason; Syam, Bustami
2018-03-01
The electrical energy needed for traffic lights in some cities of developing countries cannot be supplied continuously because of the limited capacity and interruptions of electricity distribution from the main power plant. This issue can lead to congestion at crossroads. To overcome the problem of street chaos due to power failure, electrical energy can be provided from other sources, for example by using a speed bumper to capture kinetic energy, which can then be converted into electrical energy. This study designed a mechanical generator mounted on a speed-bumper construction to generate electricity for traffic lights at crossroads. The mechanical generator is composed of springs, levers, sprockets, chains, a flywheel and a customized power generator. Through the rotation of the flywheel, a DC voltage of 9 V and an electrical current of 5.89 A were obtained. This output can be used to charge an accumulator that powers the traffic lights; charging an accumulator with a capacity of 6 Ah takes about 1.01 hours.
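A quick arithmetic check of the quoted figures, assuming the charging-time estimate is simply capacity divided by current (charging losses ignored):

```python
# Consistency check of the figures quoted in the abstract (losses ignored)
voltage_v = 9.0          # DC output
current_a = 5.89         # charging current
capacity_ah = 6.0        # accumulator capacity

power_w = voltage_v * current_a          # ~53 W delivered while the flywheel turns
charge_time_h = capacity_ah / current_a  # ~1.02 h, close to the 1.01 h reported
print(f"output power ~ {power_w:.1f} W, charging time ~ {charge_time_h:.2f} h")
```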
Classification of Automated Search Traffic
NASA Astrophysics Data System (ADS)
Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.
As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
Expanding Regional Airport Usage to Accommodate Increased Air Traffic Demand
NASA Technical Reports Server (NTRS)
Russell, Carl R.
2009-01-01
Small regional airports present an underutilized source of capacity in the national air transportation system. This study sought to determine whether a 50 percent increase in national operations could be achieved by limiting demand growth at large hub airports and instead growing traffic levels at the surrounding regional airports. This demand scenario for future air traffic in the United States was generated and used as input to a 24-hour simulation of the national airspace system. Results of the demand generation process and metrics predicting the simulation results are presented, in addition to the actual simulation results. The demand generation process showed that sufficient runway capacity exists at regional airports to offload a significant portion of traffic from hub airports. Predictive metrics forecast a large reduction of delays at most major airports when demand is shifted. The simulation results then show that offloading hub traffic can significantly reduce nationwide delays.
Streamlining Traffic Mitigation Fees
DOT National Transportation Integrated Search
1999-01-01
The City of Lacey rewrote the ordinance governing collection of fees to mitigate development impacts on the transportation system. Previously developers submitted traffic generation and distribution reports prepared by qualified traffic enginee...
Methodology for Generating Conflict Scenarios by Time Shifting Recorded Traffic Data
NASA Technical Reports Server (NTRS)
Paglione, Mike; Oaks, Robert; Bilimoria, Karl D.
2003-01-01
A methodology is presented for generating conflict scenarios that can be used as test cases to estimate the operational performance of a conflict probe. Recorded air traffic data is time shifted to create traffic scenarios featuring conflicts with characteristic properties similar to those encountered in typical air traffic operations. First, a reference set of conflicts is obtained from trajectories that are computed using birth points and nominal flight plans extracted from recorded traffic data. Distributions are obtained for several primary properties (e.g., encounter angle) that are most likely to affect the performance of a conflict probe. A genetic algorithm is then utilized to determine the values of time shifts for the recorded track data so that the primary properties of conflicts generated by the time shifted data match those of the reference set. This methodology is successfully demonstrated using recorded traffic data for the Memphis Air Route Traffic Control Center; a key result is that the required time shifts are less than 5 min for 99% of the tracks. It is also observed that close matching of the primary properties used in this study additionally provides a good match for some other secondary properties.
Warrants for major traffic generator guide signing.
DOT National Transportation Integrated Search
2009-09-01
Major traffic generators (MTGs) are important regional attractions, events, or facilities that attract persons or groups from beyond a local community, city, or metropolitan area. MTGs are significant because of their unique educational, cultural, hi...
A retrospective evaluation of traffic forecasting techniques.
DOT National Transportation Integrated Search
2016-08-01
Traffic forecasting techniques, such as extrapolation of previous years' traffic volumes, regional travel demand models, or local trip generation rates, help planners determine needed transportation improvements. Thus, knowing the accuracy of t...
Traffic Flow Management Using Aggregate Flow Models and the Development of Disaggregation Methods
NASA Technical Reports Server (NTRS)
Sun, Dengfeng; Sridhar, Banavar; Grabbe, Shon
2010-01-01
A linear time-varying aggregate traffic flow model can be used to develop Traffic Flow Management (TFM) strategies based on optimization algorithms. However, there are no methods available in the literature to translate these aggregate solutions into actions involving individual aircraft. This paper describes and implements a computationally efficient disaggregation algorithm, which converts an aggregate (flow-based) solution to a flight-specific control action. Numerical results generated by the optimization method and the disaggregation algorithm are presented and illustrated by applying them to generate TFM schedules for a typical day in the U.S. National Airspace System. The results show that the disaggregation algorithm generates control actions for individual flights while keeping the air traffic behavior very close to the optimal solution.
NASA Astrophysics Data System (ADS)
He, Hong-di; Lu, Wei-Zhen; Xue, Yu
2009-12-01
At urban traffic intersections, vehicles frequently stop with idling engines during the red-light period and speed up rapidly during the green-light period. The changes of driving patterns (i.e., idle, acceleration, deceleration and cruising patterns) generally produce uncertain emissions. Additionally, the movement of pedestrians and the influence of wind further result in the random dispersion of pollutants. It is, therefore, too complex to simulate the effects of such dynamics on the resulting emission using conventional deterministic causal models. For this reason, a modified semi-empirical box model for predicting PM10 concentrations on roadsides is proposed in this paper. The model consists of three parts, i.e., traffic, emission and dispersion components. The traffic component is developed using a generalized force traffic model to obtain the instantaneous velocity and acceleration when vehicles move through intersections, from which the distribution of vehicle emissions in the street canyon during the green-light period is calculated. The dispersion component is then investigated using a semi-empirical box model combining average wind speed, box height and background concentrations. With these considerations, the proposed model is applied and evaluated using measured data at a busy traffic intersection in Mong Kok, Hong Kong. To test the performance of the model, two situations, i.e., data sets within a sunny day and between two sunny days, were selected. The predicted values generally agree well with the observed data during different time slots, although several values are over- or underestimated. Moreover, two types of vehicles, i.e., buses and petrol cars, are separately taken into account in the study. Buses are verified to contribute most to the emission in street canyons, which may be useful in evaluating the impact of vehicle emissions on the ambient air quality when there is a significant change in a specific vehicular population.
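For orientation, the generic steady-state box-model balance underlying such approaches is sketched below (roadside increment = emission rate divided by the ventilating flow through the canyon cross-section); this is the textbook form only, not the authors' calibrated semi-empirical model with green/red-light phases, and the numbers are illustrative.

```python
def box_model_pm10(emission_ug_s, wind_ms, width_m, height_m, background_ug_m3):
    """Generic street-canyon box-model balance: at steady state the traffic
    emission injected into the canyon box is flushed by the wind through its
    cross-section, so the roadside increment is Q / (u * W * H) above background."""
    increment = emission_ug_s / (wind_ms * width_m * height_m)   # ug/m3
    return background_ug_m3 + increment

# illustrative numbers: 2000 ug/s of PM10 emitted into a 20 m x 25 m canyon cross-section
print(box_model_pm10(emission_ug_s=2000.0, wind_ms=1.5, width_m=20.0,
                     height_m=25.0, background_ug_m3=45.0))
```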
Near real-time traffic routing
NASA Technical Reports Server (NTRS)
Yang, Chaowei (Inventor); Xie, Jibo (Inventor); Zhou, Bin (Inventor); Cao, Ying (Inventor)
2012-01-01
A near real-time physical transportation network routing system comprising: a traffic simulation computing grid and a dynamic traffic routing service computing grid. The traffic simulator produces traffic network travel time predictions for a physical transportation network using a traffic simulation model and common input data. The physical transportation network is divided into multiple sections. Each section has a primary zone and a buffer zone. The traffic simulation computing grid includes multiple traffic simulation computing nodes. The common input data include static network characteristics, an origin-destination data table, dynamic traffic information data and historical traffic data. The dynamic traffic routing service computing grid includes multiple dynamic traffic routing computing nodes and generates traffic route(s) using the traffic network travel time predictions.
Man, road and vehicle: risk factors associated with the severity of traffic accidents.
Almeida, Rosa Lívia Freitas de; Bezerra Filho, José Gomes; Braga, José Ueleres; Magalhães, Francismeire Brasileiro; Macedo, Marinila Calderaro Munguba; Silva, Kellyanne Abreu
2013-08-01
OBJECTIVE To describe the main characteristics of victims, roads and vehicles involved in traffic accidents and the risk factors involved in accidents resulting in death. METHODS A non-concurrent cohort study of traffic accidents in Fortaleza, CE, Northeastern Brazil, in the period from January 2004 to December 2008. Data were drawn from the Fortaleza Traffic Accidents Information System, the Mortality Information System, the Hospital Information System and the State Traffic Department driving licenses and vehicle database. Deterministic and probabilistic record-linkage techniques were used to integrate the databases. First, a descriptive analysis of data relating to people, roads, vehicles and weather was carried out. In the investigation of risk factors for death by traffic accident, generalized linear models were used. The fit of the model was verified by likelihood ratio and ROC analysis. RESULTS There were 118,830 accidents recorded in the period. The most common types of accidents were crashes/collisions (78.1%), running over pedestrians (11.9%) and collisions with a fixed obstacle (3.9%); 18.1% of accidents involved motorcycles. Deaths occurred in 1.4% of accidents. The factors independently associated with death by traffic accident in the final model were bicycles (OR = 21.2, 95%CI 16.1;27.8), running over pedestrians (OR = 5.9, 95%CI 3.7;9.2), collision with a fixed obstacle (OR = 5.7, 95%CI 3.1;10.5) and accidents involving motorcyclists (OR = 3.5, 95%CI 2.6;4.6). The main contributing factors were a single person being involved (OR = 6.6, 95%CI 4.1;10.73), presence of unskilled drivers (OR = 4.1, 95%CI 2.9;5.5), a single vehicle (OR = 3.9, 95%CI 2.3;6.4), male sex (OR = 2.5, 95%CI 1.9;3.3), traffic on roads under federal jurisdiction (OR = 2.4, 95%CI 1.8;3.7), early morning hours (OR = 2.4, 95%CI 1.8;3.0), and Sundays (OR = 1.7, 95%CI 1.3;2.2), adjusted according to the log-binomial model. CONCLUSIONS Activities promoting the prevention of traffic accidents should primarily focus on accidents involving two-wheeled vehicles, which most often involve a single person, unskilled, male, at nighttime, on weekends and on roads where vehicles travel at higher speeds.
A Numerical Simulation of Traffic-Related Air Pollution Exposures in Urban Street Canyons
NASA Astrophysics Data System (ADS)
Liu, J.; Fu, X.; Tao, S.
2016-12-01
Urban street canyons are usually associated with intensive vehicle emissions. However, the tall buildings lining both sides of a street block the dispersion of traffic-generated air pollutants, which enhances human exposure and adversely affects human health. In this study, an urban-scale traffic pollution dispersion model is developed that considers street distribution, canyon geometry, background meteorology, traffic assignment, traffic emissions and air pollutant dispersion. Vehicle exhaust generated by traffic flows first disperses inside a street canyon along the micro-scale wind field (generated by a computational fluid dynamics (CFD) model) and then leaves the street canyon and disperses further over the urban area. On the basis of this model, the effects of canyon geometry on the distribution of NOx and CO from traffic emissions were studied over the center of Beijing, China. We found that an increase of building height along the streets leads to higher pollution levels inside streets and lower pollution levels outside, resulting in higher domain-averaged concentrations over the area. In addition, street canyons with equal (or highly uneven) building heights on the two sides of a street tend to lower the urban-scale air pollution concentrations at pedestrian level. Our results indicate that canyon geometry strongly influences human exposure to traffic pollutants in populated urban areas. Carefully planning street layout and canyon geometry in consideration of traffic demand as well as local weather patterns may significantly reduce the chances of unhealthy air being inhaled by urban residents.
Generation of Werner states via collective decay of coherently driven atoms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Girish S.; Kapale, Kishore T.
2006-02-15
We show deterministic generation of Werner states as a steady state of the collective decay dynamics of a pair of neutral atoms coupled to a leaky cavity and strong coherent drive. We also show how the scheme can be extended to generate a 2N-particle analogue of the bipartite Werner states.
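For readers unfamiliar with the target state, the snippet below constructs a two-qubit Werner state in one common convention; it only defines the state itself and does not model the atom-cavity decay dynamics described in the abstract.

```python
import numpy as np

# Illustrative construction of a two-qubit Werner state (one common convention):
# rho_W = p |psi-><psi-| + (1 - p)/4 * I_4, which is entangled for p > 1/3.

def werner_state(p):
    psi_minus = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # singlet (|01> - |10>)/sqrt(2)
    singlet = np.outer(psi_minus, psi_minus.conj())
    return p * singlet + (1 - p) / 4 * np.eye(4)

rho = werner_state(0.6)
print(np.trace(rho).real)          # 1.0, a valid density matrix
print(np.linalg.eigvalsh(rho))     # non-negative eigenvalues
```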
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class supercomputer.
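The weight-window coarsening idea can be pictured with the toy sketch below, which averages a fine-mesh adjoint flux over blocks of cells and derives CADIS-style weight-window lower bounds on the coarser mesh. The block size, upper/lower ratio and one-dimensional mesh are illustrative assumptions, not the algorithm actually implemented in the paper.

```python
import numpy as np

# Rough sketch of weight-window coarsening: the adjoint flux from the
# deterministic calculation is averaged over blocks of fine mesh cells, and
# weight-window lower bounds are defined on that coarser mesh, so the memory
# footprint of the map no longer tracks the deterministic mesh resolution.

def coarsen_weight_windows(adjoint_flux, block, response=1.0, ratio=5.0):
    """adjoint_flux: 1-D array on the fine mesh; block: fine cells per coarse cell."""
    n = (len(adjoint_flux) // block) * block
    coarse_flux = adjoint_flux[:n].reshape(-1, block).mean(axis=1)
    centers = response / coarse_flux                 # CADIS-style target weights
    ww_lower = 2.0 * centers / (ratio + 1.0)         # lower bounds for the chosen ratio
    return ww_lower

fine_adjoint = np.exp(-0.1 * np.arange(100))         # toy exponentially attenuated adjoint flux
print(coarsen_weight_windows(fine_adjoint, block=10)[:3])
```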
DOT National Transportation Integrated Search
2011-01-01
Truck volumes represented on this map are Annual Average Daily Traffic Volumes between major traffic generators: i.e., Highway Junctions and Cities. : Truck volumes include 6-Tire and 3 Axle single unit trucks, buses and all multiple unit trucks.
Transportation research synthesis : effects of major traffic generators on local highway systems.
DOT National Transportation Integrated Search
2010-01-01
The Minnesota Department of Transportation initiated a study focused on the effects of major traffic generators on local highway systems. Minnesota State University and SRF Consulting Group, Inc. will conduct a major research study on the topic. ...
Studies of next generation air traffic control specialists : why be an air traffic controller?
DOT National Transportation Integrated Search
2011-08-01
With phrases such as Managing Millennials (Gimbel, 2007), descriptions of generational differences are a staple in the human resources (HR) trade press and corporate training. The Federal Aviation Administration (FAA) offers a course in man...
DOT National Transportation Integrated Search
2009-03-01
To prepare for forecasted air traffic growth, the Federal Aviation Administration (FAA), including its Joint Planning and Development Office (JPDO) and Air Traffic Organization (ATO), is planning for and implementing the Next Generation...
Semiautomated Management Of Arriving Air Traffic
NASA Technical Reports Server (NTRS)
Erzberger, Heinz; Nedell, William
1992-01-01
System of computers, graphical workstations, and computer programs developed for semiautomated management of the approach and arrival of numerous aircraft at an airport. The system comprises three subsystems: the traffic-management advisor, which controls the flow of traffic into the terminal area; the descent advisor, which generates information integrated into a plan-view display of traffic on a monitor; and the final-approach-spacing tool, which merges traffic converging on the final approach path while ensuring that aircraft are properly spaced. The system is not intended to restrict the decisions of air-traffic controllers.
Analysis of Malicious Traffic in Modbus/TCP Communications
NASA Astrophysics Data System (ADS)
Kobayashi, Tiago H.; Batista, Aguinaldo B.; Medeiros, João Paulo S.; Filho, José Macedo F.; Brito, Agostinho M.; Pires, Paulo S. Motta
This paper presents the results of our analysis of the influence of Information Technology (IT) malicious traffic on an IP-based automation environment. We utilized a traffic generator, called MACE (Malicious trAffic Composition Environment), to inject malicious traffic into a Modbus/TCP communication system, and a sniffer to capture and analyze the network traffic. The tests performed show that malicious traffic represents a serious risk to critical information infrastructures. We show that this kind of traffic can increase the latency of Modbus/TCP communication and, in some cases, can put Modbus/TCP devices out of communication.
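A minimal probe of the latency effect described above might look like the following, which times a Modbus/TCP "Read Holding Registers" round trip over a raw socket. The host, port and register range are placeholders; this is not the MACE tool or the authors' test harness, and such probes should only be run against equipment you own.

```python
import socket, struct, time

# Hedged sketch: time a Modbus/TCP Read Holding Registers (function 0x03)
# round trip to a device while other traffic is present on the network.

def modbus_read_latency(host="192.0.2.10", port=502, unit=1, start=0, count=10):
    # PDU: function code, starting address, quantity of registers
    pdu = struct.pack(">BHH", 0x03, start, count)
    # MBAP header: transaction id, protocol id (0), remaining length, unit id
    frame = struct.pack(">HHHB", 1, 0, len(pdu) + 1, unit) + pdu
    with socket.create_connection((host, port), timeout=2.0) as s:
        t0 = time.perf_counter()
        s.sendall(frame)
        s.recv(260)                         # 260 bytes is the maximum Modbus/TCP ADU size
        return time.perf_counter() - t0

# Uncomment when pointed at a real (owned) Modbus/TCP device:
# print(f"round trip: {modbus_read_latency() * 1e3:.2f} ms")
```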
The Trajectory Synthesizer Generalized Profile Interface
NASA Technical Reports Server (NTRS)
Lee, Alan G.; Bouyssounouse, Xavier; Murphy, James R.
2010-01-01
The Trajectory Synthesizer is a software program that generates aircraft predictions for Air Traffic Management decision support tools. The Trajectory Synthesizer used by researchers at NASA Ames Research Center was restricted in the number of trajectory types it could generate, a limitation that could not support the rapidly changing Air Traffic Management research requirements. The Generalized Profile Interface was developed to address this issue. It provides a flexible approach to describing the constraints applied to trajectory generation and may provide a method for interoperability between trajectory generators. It also supports the request and generation of new types of trajectory profiles not possible with the previous interface to the Trajectory Synthesizer. Other enhancements allow the Trajectory Synthesizer to meet the current and future needs of Air Traffic Management research.
DOT National Transportation Integrated Search
2009-03-01
"To prepare for forecasted air traffic growth, the Federal Aviation Administration (FAA), including its Joint Planning and Development Office (JPDO) and Air Traffic Organization (ATO), is planning for and implementing the Next Generation Air Transpor...
Automated variance reduction for MCNP using deterministic methods.
Sweezy, J; Brown, F; Booth, T; Chiaramonte, J; Preeg, B
2005-01-01
In order to reduce the user's time and the computer time needed to solve deep penetration problems, an automated variance reduction capability has been developed for the MCNP Monte Carlo transport code. This new variance reduction capability developed for MCNP5 employs the PARTISN multigroup discrete ordinates code to generate mesh-based weight windows. The technique of using deterministic methods to generate importance maps has been widely used to increase the efficiency of deep penetration Monte Carlo calculations. The application of this method in MCNP uses the existing mesh-based weight window feature to translate the MCNP geometry into geometry suitable for PARTISN. The adjoint flux, which is calculated with PARTISN, is used to generate mesh-based weight windows for MCNP. Additionally, the MCNP source energy spectrum can be biased based on the adjoint energy spectrum at the source location. This method can also use angle-dependent weight windows.
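The core CADIS relations behind such adjoint-driven weight windows can be sketched as follows; notation and normalization vary between references, and the toy mesh, source and weight-window ratio below are illustrative assumptions.

```python
import numpy as np

# Sketch of the CADIS relations used when turning a deterministic adjoint flux
# into mesh-based weight windows: a particle of weight w in cell i contributes
# ~ w * phi_adj[i] to the response R, so weight-window centers are set to
# R / phi_adj[i] and the source is biased proportionally to q[i] * phi_adj[i].

def cadis_weight_windows(q, phi_adj, ww_ratio=5.0):
    """q: source strength per cell; phi_adj: adjoint flux per cell."""
    response = np.sum(q * phi_adj)                 # estimated detector response
    centers = response / phi_adj                   # target (center) weights
    lower = 2.0 * centers / (ww_ratio + 1.0)       # lower bounds for the given upper/lower ratio
    biased_source = q * phi_adj / response         # normalized biased source distribution
    return lower, biased_source

q = np.array([1.0, 0.0, 0.0, 0.0])                 # source in the first cell only
phi_adj = np.array([1e-4, 1e-3, 1e-2, 1e-1])       # toy adjoint importance toward the detector
print(cadis_weight_windows(q, phi_adj))
```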
Deterministic entanglement generation from driving through quantum phase transitions.
Luo, Xin-Yu; Zou, Yi-Quan; Wu, Ling-Na; Liu, Qi; Han, Ming-Fei; Tey, Meng Khoon; You, Li
2017-02-10
Many-body entanglement is often created through the system evolution, aided by nonlinear interactions between the constituting particles. These very dynamics, however, can also lead to fluctuations and degradation of the entanglement if the interactions cannot be controlled. Here, we demonstrate near-deterministic generation of an entangled twin-Fock condensate of ~11,000 atoms by driving a rubidium-87 Bose-Einstein condensate undergoing spin mixing through two consecutive quantum phase transitions (QPTs). We directly observe number squeezing of 10.7 ± 0.6 decibels and a normalized collective spin length of 0.99 ± 0.01. Together, these observations allow us to infer an entanglement-enhanced phase sensitivity of ~6 decibels beyond the standard quantum limit and an entanglement breadth of ~910 atoms. Our work highlights the power of generating large-scale useful entanglement by taking advantage of the different entanglement landscapes separated by QPTs. Copyright © 2017, American Association for the Advancement of Science.
Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakur, Gautam S; Helmy, Ahmed; Hui, Pan
2015-01-01
Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmarking process and are also critical in realizing realistic traffic modeling situations. In this paper, we model and characterize the traffic density distribution of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. Using goodness-of-fit tests, it is found that the traffic density distributions follow heavy-tailed models such as Log-gamma, Log-logistic, and Weibull in over 90% of the analyzed locations. Moreover, a heavy tail gives rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis based on seven different Hurst estimators strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 < H < 1.0). We believe this is an important finding that will influence the design and development of the next generation of traffic simulation techniques and also aid in accurately modeling the traffic engineering of urban systems. In addition, it should provide a much-needed input for the development of smart cities.
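One of the classical Hurst estimators alluded to above, the aggregated-variance method, can be sketched in a few lines; the paper itself uses seven different estimators, so this is only an illustration of the general idea.

```python
import numpy as np

# Aggregated-variance estimate of the Hurst exponent H: for a self-similar
# series the variance of block means scales as m**(2H - 2), so H is read off
# the slope of a log-log regression over block sizes m.

def hurst_aggregated_variance(x, block_sizes=(4, 8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    variances = []
    for m in block_sizes:
        n_blocks = len(x) // m
        block_means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        variances.append(block_means.var())
    slope, _ = np.polyfit(np.log(block_sizes), np.log(variances), 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(0)
print(hurst_aggregated_variance(rng.normal(size=100_000)))   # ~0.5 for uncorrelated noise
```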
The Aeronautical Data Link: Taxonomy, Architectural Analysis, and Optimization
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Goode, Plesent W.
2002-01-01
The future Communication, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) system will rely on global satellite navigation and on ground-based and satellite-based communications via multi-protocol networks (e.g., a combined Aeronautical Telecommunications Network (ATN)/Internet Protocol (IP)) to bring about needed improvements in the efficiency and safety of operations to meet increasing levels of air traffic. This paper discusses the development of an approach that completely describes optimal data link architecture configuration and behavior to meet the multiple conflicting objectives of concurrent and different operations functions. The practical application of the approach enables the design and assessment of configurations relative to airspace operations phases. The approach includes a formal taxonomic classification, an architectural analysis methodology, and optimization techniques. The formal taxonomic classification provides a multidimensional correlation of data link performance with data link service, information protocol, spectrum, and technology mode, and with flight operations phase and environment. The architectural analysis methodology assesses the impact of a specific architecture configuration and behavior on local ATM system performance. Deterministic and stochastic optimization techniques maximize architectural design effectiveness while addressing operational, technology, and policy constraints.
Estimation of number of fatalities caused by toxic gases due to fire in road tunnels.
Qu, Xiaobo; Meng, Qiang; Liu, Zhiyuan
2013-01-01
The quantitative risk assessment (QRA) is one of the explicit requirements under the European Union (EU) Directive (2004/54/EC). As part of this, it is essential to be able to estimate the number of fatalities in different accident scenarios. In this paper, a tangible methodology is developed to estimate the number of fatalities caused by toxic gases due to fire in road tunnels by incorporating traffic flow and the spread of fire in tunnels. First, a deterministic queuing model is proposed to calculate the number of people at risk, by taking into account tunnel geometry, traffic flow patterns, and incident response plans for road tunnels. Second, the Fire Dynamics Simulator (FDS) is used to obtain the temperature and concentrations of CO, CO(2), and O(2). By taking advantage of the additivity of the fractional effective dose (FED) method, fatality rates for different locations in given time periods can be estimated. An illustrative case study is carried out to demonstrate the applicability of the proposed methodology. Copyright © 2012 Elsevier Ltd. All rights reserved.
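The additivity of the FED that the methodology exploits can be illustrated with the toy accumulation below; the linear Haber-type dose term and the lethal C·t value are placeholder assumptions, whereas the study evaluates established dose relations (for CO, CO2, O2 depletion and heat) from FDS output.

```python
# Hedged sketch of fractional-effective-dose (FED) bookkeeping: the FED is
# additive over time steps, and a fatality is assumed once the accumulated
# fraction reaches 1. The dose function here is a simple placeholder.

def accumulate_fed(concentration_series, dt, lethal_ct_product):
    """concentration_series: toxicant concentration per time step (e.g. ppm of CO);
    dt: time-step length in minutes; lethal_ct_product: assumed lethal C*t dose (ppm*min)."""
    fed = 0.0
    for c in concentration_series:
        fed += c * dt / lethal_ct_product      # additive fractional dose (Haber-type form)
        if fed >= 1.0:
            return fed, True                   # occupant assumed to be a fatality
    return fed, False

# Example: CO rising from 500 to 5000 ppm over 10 minutes, 1-minute steps
co_ppm = [500 * (i + 1) for i in range(10)]
print(accumulate_fed(co_ppm, dt=1.0, lethal_ct_product=35_000.0))   # lethal dose value assumed
```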
Fiber optic voice/data network
NASA Technical Reports Server (NTRS)
Bergman, Larry A. (Inventor)
1989-01-01
An asynchronous, high-speed, fiber optic local area network originally developed for tactical environments, with additional benefits for other environments such as spacecraft and the like. The network supports ordinary data packet traffic simultaneously with synchronous T1 voice traffic over a common token ring channel; however, the techniques and apparatus of this invention can be applied to any deterministic class of packet data networks, including multitier backbones, that must transport stream data (e.g., video, SAR, sensors) as well as ordinary data. A voice interface module parses, buffers, and resynchronizes the voice data to the packet network, employing elastic buffers on both the sending and receiving ends. Voice call setup and switching functions are performed external to the network with ordinary PABX equipment. Clock information is passed across network boundaries in a token passing ring by preceding the token with an idle period of non-transmission, which allows the token to be used to re-establish a clock synchronized to the data. Provision is made to monitor and compensate the elastic receiving buffers so as to prevent them from overflowing or going empty.
Impact of noise and air pollution on pregnancy outcomes.
Gehring, Ulrike; Tamburic, Lillian; Sbihi, Hind; Davies, Hugh W; Brauer, Michael
2014-05-01
Motorized traffic is an important source of both air pollution and community noise. While there is growing evidence for an adverse effect of ambient air pollution on reproductive health, little is known about the association between traffic noise and pregnancy outcomes. We evaluated the impact of residential noise exposure on small size for gestational age, preterm birth, term birth weight, and low birth weight at term in a population-based cohort study, for which we previously reported associations between air pollution and pregnancy outcomes. We also evaluated potential confounding of air pollution effects by noise and vice versa. Linked administrative health data sets were used to identify 68,238 singleton births (1999-2002) in Vancouver, British Columbia, Canada, with complete covariate data (sex, ethnicity, parity, birth month and year, income, and education) and maternal residential history. We estimated exposure to noise with a deterministic model (CadnaA) and exposure to air pollution using temporally adjusted land-use regression models and inverse distance weighting of stationary monitors for the entire pregnancy. Noise exposure was negatively associated with term birth weight (mean difference = -19 [95% confidence interval = -23 to -15] g per 6 dB(A)). In joint air pollution-noise models, associations between noise and term birth weight remained largely unchanged, whereas associations decreased for all air pollutants. Traffic may affect birth weight through exposure to both air pollution and noise.
An Enhanced Convective Forecast (ECF) for the New York TRACON Area
NASA Technical Reports Server (NTRS)
Wheeler, Mark; Stobie, James; Gillen, Robert; Jedlovec, Gary; Sims, Danny
2008-01-01
In an effort to relieve summer-time congestion in the NY Terminal Radar Approach Control (TRACON) area, the FAA is testing an enhanced convective forecast (ECF) product. The test began in June 2008 and is scheduled to run through early September. The ECF is updated every two hours, right before the Air Traffic Control System Command Center (ATCSCC) national planning telcon. It is intended to be used by traffic managers throughout the National Airspace System (NAS) and by airline dispatchers to supplement information from the Collaborative Convective Forecast Product (CCFP) and the Corridor Integrated Weather System (CIWS). The ECF begins where the current CIWS forecast ends at 2 hours and extends out to 12 hours. Unlike the CCFP, it is a detailed deterministic forecast with no areal coverage limits. It is created by an ENSCO forecaster using a variety of guidance products, including the Weather Research and Forecasting (WRF) model. This is the same version of the WRF that ENSCO runs over the Florida peninsula in support of launch operations at the Kennedy Space Center. For this project, the WRF model domain has been shifted to the Northeastern US. Several products from the NASA SPoRT group are also used by the ENSCO forecaster. In this paper we provide examples of the ECF products and discuss individual cases of traffic management actions taken using ECF guidance.
Stochastic simulations on a model of circadian rhythm generation.
Miura, Shigehiro; Shimokawa, Tetsuya; Nomura, Taishin
2008-01-01
Biological phenomena are often modeled by differential equations, where the states of a model system are described by continuous real values. When we consider concentrations of molecules as dynamical variables for a set of biochemical reactions, we implicitly assume that the numbers of molecules are large enough that their changes can be regarded as continuous and described deterministically. However, for a system with small numbers of molecules, the changes in their numbers are distinctly discrete and molecular noise becomes significant. In such cases, models with deterministic differential equations may be inappropriate, and the reactions must be described by stochastic equations. In this study, we focus on clock gene expression for circadian rhythm generation, which is known to involve small numbers of molecules. It is therefore appropriate for the system to be modeled by stochastic equations and analyzed by stochastic simulation methodologies. The interlocked feedback model proposed by Ueda et al. as a set of deterministic ordinary differential equations provides the basis of our analyses. We apply two stochastic simulation methods, namely Gillespie's direct method and the stochastic differential equation method, also by Gillespie, to the interlocked feedback model. To this end, we first reformulated the original differential equations back into elementary chemical reactions. With those reactions, we simulate and analyze the dynamics of the model using the two methods in order to compare them with the dynamics obtained from the original deterministic model and to characterize how the dynamics depend on the simulation methodologies.
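A minimal Gillespie direct-method loop, here for a toy birth-death (production/degradation) reaction pair, illustrates the kind of stochastic simulation applied to the interlocked feedback model; the actual model involves many more species and reactions.

```python
import numpy as np

# Gillespie direct method for a single species with constant production
# (rate k_prod) and first-order degradation (rate k_deg * x).

def gillespie_birth_death(k_prod=2.0, k_deg=0.1, x0=0, t_end=200.0, seed=0):
    rng = np.random.default_rng(seed)
    t, x = 0.0, x0
    times, counts = [t], [x]
    while t < t_end:
        a1, a2 = k_prod, k_deg * x          # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)      # exponentially distributed time to next reaction
        x += 1 if rng.random() < a1 / a0 else -1
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
print(counts[-10:])                          # fluctuates around k_prod / k_deg = 20
```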
Gulf Coast megaregion evacuation traffic simulation modeling and analysis.
DOT National Transportation Integrated Search
2015-12-01
This paper describes a project to develop a micro-level traffic simulation for a megaregion. To accomplish this, a mass evacuation event was modeled using a traffic demand generation process that created a spatial and temporal distribution of dep...
Dynamic traffic assignment based trailblazing guide signing for major traffic generator.
DOT National Transportation Integrated Search
2009-11-01
The placement of guide signs and the display of dynamic message signs greatly affect drivers' understanding of the network and therefore their route choices. Most existing dynamic traffic assignment models assume that drivers heading to a Major...
Generation of Conflict Resolution Maneuvers for Air Traffic Management
DOT National Transportation Integrated Search
1997-01-01
We explore the use of distributed on-line motion planning algorithms for multiple mobile agents in Air Traffic Management Systems (ATMS). The work is motivated by current trends in ATMS to move towards decentralized air traffic management, in which ...
Use of mobile data for weather-responsive traffic management models.
DOT National Transportation Integrated Search
2012-10-01
The evolution of telecommunications and wireless technologies has brought in new sources of traffic data (particularly mobile data generated by vehicle probes), which could offer a breakthrough in the quality and extent of traffic data. This study re...
Maritime dynamic traffic generator. Volume 3 : density data on world maps
DOT National Transportation Integrated Search
1975-06-01
The 18,000 vessels whose weekly movements are tracked by the maritime traffic generator represent 106 different countries. There are 4915 vessels five or less years old. The record for the week of January 26, 1972 includes 11,789 arrivals, 10,896 dep...
Improving ground-penetrating radar data in sedimentary rocks using deterministic deconvolution
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.; Byrnes, A.P.
2003-01-01
Resolution is key to confidently identifying unique geologic features using ground-penetrating radar (GPR) data. Source wavelet "ringing" (related to bandwidth) in a GPR section limits resolution because of wavelet interference, and can smear reflections in time and/or space. The resultant potential for misinterpretation limits the usefulness of GPR. Deconvolution offers the ability to compress the source wavelet and improve temporal resolution. Unlike statistical deconvolution, deterministic deconvolution is mathematically simple and stable while providing the highest possible resolution because it uses the source wavelet unique to the specific radar equipment. Source wavelets generated in, transmitted through and acquired from air allow successful application of deterministic approaches to wavelet suppression. We demonstrate the validity of using a source wavelet acquired in air as the operator for deterministic deconvolution in a field application using "400-MHz" antennas at a quarry site characterized by interbedded carbonates with shale partings. We collected GPR data on a bench adjacent to cleanly exposed quarry faces in which we placed conductive rods to provide conclusive groundtruth for this approach to deconvolution. The best deconvolution results, which are confirmed by the conductive rods for the 400-MHz antenna tests, were observed for wavelets acquired when the transmitter and receiver were separated by 0.3 m. Applying deterministic deconvolution to GPR data collected in sedimentary strata at our study site resulted in an improvement in resolution (50%) and improved spatial location (0.10-0.15 m) of geologic features compared to the same data processed without deterministic deconvolution. The effectiveness of deterministic deconvolution for increased resolution and spatial accuracy of specific geologic features is further demonstrated by comparing results of deconvolved data with nondeconvolved data acquired along a 30-m transect immediately adjacent to a fresh quarry face. The results at this site support using deterministic deconvolution, which incorporates the GPR instrument's unique source wavelet, as a standard part of routine GPR data processing. ?? 2003 Elsevier B.V. All rights reserved.
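A bare-bones version of deterministic deconvolution with an air-acquired source wavelet can be written as a stabilized spectral division; the water-level stabilizer and the synthetic example below are assumptions for illustration, not the authors' processing parameters.

```python
import numpy as np

# Frequency-domain deterministic deconvolution: divide the trace spectrum by
# the spectrum of the recorded source wavelet, clamping small spectral
# amplitudes (water level) so the division stays stable.

def deterministic_deconvolution(trace, wavelet, water_level=0.01):
    n = len(trace)
    W = np.fft.rfft(wavelet, n)
    T = np.fft.rfft(trace, n)
    floor = water_level * np.abs(W).max()
    W_stab = np.where(np.abs(W) < floor, floor * np.exp(1j * np.angle(W)), W)
    return np.fft.irfft(T / W_stab, n)

# Toy example: a "ringy" wavelet convolved with two reflections
wavelet = np.sin(2 * np.pi * 0.4 * np.arange(30)) * np.exp(-0.15 * np.arange(30))
reflectivity = np.zeros(200)
reflectivity[[60, 75]] = [1.0, -0.6]
trace = np.convolve(reflectivity, wavelet)[:200]
print(np.argmax(np.abs(deterministic_deconvolution(trace, wavelet))))   # ~60, the first reflector
```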
DOT National Transportation Integrated Search
2011-01-01
Inductive loops are widely used nationwide for traffic monitoring as a data source for a variety of needs in generating traffic information for operation and planning analysis, validations of travel demand models, freight studies, pavement design...
Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow
NASA Astrophysics Data System (ADS)
Gupta, Atma Ram; Kumar, Ashwani
2017-12-01
Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and on temporal factors such as the season of the year, the day of the week or the time of day. In deterministic radial distribution load flow studies the load is taken as constant, but load varies continually and with a high degree of uncertainty, so there is a need to model probable realistic load. Monte Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load and by solving a deterministic radial load flow with each set of values. The probabilistic solution is reconstructed from the deterministic data obtained for each simulation. The main contributions of the work are: (i) finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; (ii) finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and (iii) comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
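The Monte Carlo wrapper around a deterministic radial load flow can be outlined as below; `radial_load_flow` is a stand-in for any backward/forward-sweep solver, and the toy linearized solver and load statistics are illustrative assumptions only.

```python
import numpy as np

# Sample active/reactive loads from their means and standard deviations, run a
# deterministic radial load flow for each sample, and build the probabilistic
# solution from the collected deterministic results.

def probabilistic_load_flow(radial_load_flow, p_mean, p_std, q_mean, q_std,
                            n_samples=1000, seed=0):
    rng = np.random.default_rng(seed)
    voltages = []
    for _ in range(n_samples):
        p = rng.normal(p_mean, p_std)        # sampled active power per bus (kW)
        q = rng.normal(q_mean, q_std)        # sampled reactive power per bus (kvar)
        voltages.append(radial_load_flow(p, q))
    voltages = np.array(voltages)
    return voltages.mean(axis=0), voltages.std(axis=0)

# Toy deterministic "solver": a linearized voltage-drop stand-in, not a real sweep
toy_solver = lambda p, q: 1.0 - 1e-4 * np.cumsum(p + 0.5 * q)
p_mean = np.full(5, 100.0)
q_mean = np.full(5, 40.0)
mean_v, std_v = probabilistic_load_flow(toy_solver, p_mean, 10.0, q_mean, 5.0)
print(mean_v.round(4), std_v.round(4))
```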
NASA Technical Reports Server (NTRS)
Hathaway, Michael D.
1986-01-01
Measurements of the unsteady velocity field within the stator row of a transonic axial-flow fan were acquired using a laser anemometer. Measurements were obtained on axisymmetric surfaces located at 10 and 50 percent span from the shroud, with the fan operating at maximum efficiency at design speed. The ensemble-average and variance of the measured velocities are used to identify rotor-wake-generated (deterministic) unsteadiness and turbulence, respectively. Correlations of both deterministic and turbulent velocity fluctuations provide information on the characteristics of unsteady interactions within the stator row. These correlations are derived from the Navier-Stokes equation in a manner similar to deriving the Reynolds stress terms, whereby various averaging operators are used to average the aperiodic, deterministic, and turbulent velocity fluctuations which are known to be present in multistage turbomachines. The correlations of deterministic and turbulent velocity fluctuations throughout the axial fan stator row are presented. In particular, amplification and attenuation of both types of unsteadiness are shown to occur within the stator blade passage.
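The phase-locked ensemble averaging implied above can be sketched as follows: samples are binned by rotor phase, the bin means give the deterministic (rotor-locked) unsteadiness and the bin variances give the turbulence. The data here are synthetic, not the laser-anemometer measurements of the study.

```python
import numpy as np

# Decompose point velocity samples into a rotor-phase-locked (deterministic)
# component and a within-bin variance (turbulence) component.

def decompose_by_rotor_phase(samples, phases, n_bins=50):
    """samples: velocity measurements; phases: rotor phase of each sample in [0, 1)."""
    bins = np.minimum((phases * n_bins).astype(int), n_bins - 1)
    deterministic = np.array([samples[bins == b].mean() for b in range(n_bins)])
    turbulence = np.array([samples[bins == b].var() for b in range(n_bins)])
    return deterministic, turbulence

rng = np.random.default_rng(1)
phases = rng.random(20_000)
samples = 200.0 + 15.0 * np.sin(2 * np.pi * phases) + rng.normal(0.0, 5.0, phases.size)
det, turb = decompose_by_rotor_phase(samples, phases)
print(det[:3].round(1), turb[:3].round(1))   # wake-like mean variation, ~25 (m/s)^2 variance
```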
DOT National Transportation Integrated Search
2013-03-01
The Federal Aviation Administration (FAA) faces two significant organizational challenges in the 21st century: (1) transformation of the current NAS into the Next Generation Air Transportation System (NextGen); and (2) recruitment, selection, a...
Effect of Traffic Position Accuracy for Conducting Safe Airport Surface Operations
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Barnes, James R.
2014-01-01
The Next Generation Air Transportation System (NextGen) concept proposes many revolutionary operational concepts and technologies, such as display of traffic information and movements, airport moving maps (AMM), and proactive alerts of runway incursions and surface traffic conflicts, to deliver an overall increase in system capacity and safety. A piloted simulation study was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center to evaluate the ability to conduct safe and efficient airport surface operations while utilizing an AMM displaying traffic of various position accuracies, as well as the effect of traffic position accuracy on airport conflict detection and resolution (CD&R) capability. Nominal scenarios and off-nominal conflict scenarios were conducted with 12 airline crews operating in a simulated Memphis International Airport terminal environment. The data suggest that all traffic should be shown on the airport moving map, whether qualified or unqualified, and that conflict detection and resolution technologies provide significant safety benefits. Despite the presence of traffic information on the map, collisions or near collisions still occurred; when indications or alerts were generated in these same scenarios, the incidents were averted.
Phase diagram of congested traffic flow: An empirical study
Lee; Lee; Kim
2000-10-01
We analyze traffic data from a highway section containing one effective on-ramp. Based on two criteria, local velocity variation patterns and the expansion (or nonexpansion) of congested regions, three distinct congested traffic states are identified. These states appear at different levels of the upstream flux and the on-ramp flux, thereby generating a phase diagram of the congested traffic flow. The observed traffic states are compared with recent theoretical analyses and both agreeing and disagreeing features are found.
Nishimoto, Ryu; Tani, Jun
2004-09-01
This study shows how sensory-action sequences of imitating finite state machines (FSMs) can be learned by utilizing the deterministic dynamics of recurrent neural networks (RNNs). Our experiments indicated that each possible combinatorial sequence can be recalled by specifying its respective initial state value and also that fractal structures appear in this initial state mapping after the learning converges. We also observed that the sequences of mimicking FSMs are encoded utilizing the transient regions rather than the invariant sets of the evolved dynamical systems of the RNNs.
Deterministic filtering of breakdown flashing at telecom wavelengths
NASA Astrophysics Data System (ADS)
Marini, Loris; Camphausen, Robin; Eggleton, Benjamin J.; Palomba, Stefano
2017-11-01
Breakdown flashes are undesired photo-emissions from the active area of single-photon avalanche photo-diodes. They arise from radiative recombinations of hot carriers generated during an avalanche and can induce crosstalk, compromise the measurement of optical quantum states, and hinder the security of quantum communications. Although the spectrum of this emission extends over hundreds of nanometers, active quenching may lead to a smaller uncertainty in the time of emission, thus enabling deterministic filtering. Our results pave the way to broadband interference mitigation in time-correlated single-photon applications.
State of the practice for traffic data quality : traffic data quality workshop : white paper.
DOT National Transportation Integrated Search
2002-12-31
This White Paper documents the current state of the practice in the quality of traffic data generated by Intelligent Transportation Systems (ITS). The current state of the practice is viewed from the perspectives of both Operations and Planning perso...
Modeling DNP3 Traffic Characteristics of Field Devices in SCADA Systems of the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Huan; Cheng, Liang; Chuah, Mooi Choo
In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): For instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
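One plausible way to express such a parameterized slave-PLC traffic model, purely as an illustration of the idea (the paper extracts its parameters from real PLC configurations and captured traces), is to quantize response delays to the PLC scan cycle:

```python
import numpy as np

# Assumed model: a request from the DNP3 master arriving mid-scan is serviced
# only at the end of the current PLC scan cycle, so the response delay is the
# residual scan time plus protocol processing jitter. All parameters are assumed.

def simulate_response_delays(n_requests, scan_cycle_ms=10.0,
                             processing_mean_ms=1.5, processing_jitter_ms=0.3,
                             seed=0):
    rng = np.random.default_rng(seed)
    arrival_offsets = rng.uniform(0.0, scan_cycle_ms, n_requests)   # arrival time within a scan
    residual_scan = scan_cycle_ms - arrival_offsets
    processing = rng.normal(processing_mean_ms, processing_jitter_ms, n_requests)
    return residual_scan + np.clip(processing, 0.0, None)

delays = simulate_response_delays(10_000)
print(f"mean {delays.mean():.2f} ms, 95th percentile {np.percentile(delays, 95):.2f} ms")
```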
Mirzaei, Ramazan; Hafezi-Nejad, Nima; Sadegh Sabagh, Mohammad; Ansari Moghaddam, Alireza; Eslami, Vahid; Rakhshani, Fatemeh; Rahimi-Movaghar, Vafa
2014-05-01
To evaluate the relationship between Iranian drivers' knowledge, attitude, and practice (KAP) regarding traffic regulations and their deterministic effect on road traffic crashes (RTCs). The study was conducted in two cities, Tehran and Zahedan, Iran, as a cross-sectional study. Using a simplified cluster sampling design, 2200 motor vehicle drivers were selected, 1200 in Tehran and 1000 in Zahedan. Sixty locations in Tehran and 50 in Zahedan were chosen, and in each pre-identified location 20 adult drivers were approached consecutively. A questionnaire developed by the researchers was completed by each participant. The questionnaire had four sections assessing the demographics, knowledge, attitude and practice of drivers toward traffic regulations. Logistic regression analysis was used to evaluate the relationship between RTCs and the KAP variables. The study sample consisted of 619 (28.1%) occupational and 1580 (71.8%) private drivers. Among them, 86.4% were male and the median age was 33.6 ± 10.83 years. Drivers in Tehran and Zahedan showed no significant differences in their mean scores on the KAP items of the questionnaire. Higher knowledge, safer attitude, and safer practice were associated with a decreased number of RTCs. After adjusting for possible confounders, an increase of one standard deviation in attitude and practice scores (but not knowledge) resulted in 26.4% and 18.5% decreases in RTCs, respectively. Finally, when the knowledge, attitude and practice of drivers were considered in one model to assess their mutual effect, only attitude was significantly associated with a decrease in RTCs (OR=0.76, P=0.007). Increases in attitude and practice were accompanied by a decreased number of RTCs in Iranian drivers; specifically, drivers' attitude had the crucial effect. It is not knowledge and standard traffic education, but rather how such education is registered as an attitude, that translates what is being learned into action. Without a safer attitude, even safer self-reported practice will not result in fewer RTCs. Copyright © 2014 Elsevier Ltd. All rights reserved.
Large-scale multi-agent transportation simulations
NASA Astrophysics Data System (ADS)
Cetin, Nurhan; Nagel, Kai; Raney, Bryan; Voellmy, Andreas
2002-08-01
It is now possible to microsimulate the traffic of whole metropolitan areas with 10 million travelers or more, "micro" meaning that each traveler is resolved individually as a particle. In contrast to physics or chemistry, these particles have internal intelligence; for example, they know where they are going. This means that a transportation simulation project will have, besides the traffic microsimulation, modules which model this intelligent behavior. The most important modules are for route generation and for demand generation. Demand is generated by each individual in the simulation making a plan of activities such as sleeping, eating, working, shopping, etc. If activities are planned at different locations, they obviously generate demand for transportation. This however is not enough since those plans are influenced by congestion which initially is not known. This is solved via a relaxation method, which means iterating back and forth between the activities/routes generation and the traffic simulation.
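The relaxation between plan/route generation and congested travel times can be illustrated with a runnable toy: two routes, a BPR-style congestion function and the method of successive averages. The demand, free-flow times and capacities are assumed values, and this is not the authors' simulation framework.

```python
import numpy as np

# Toy relaxation: travelers choose between two routes, travel times respond to
# the resulting flows, and iterating route choice against the congested times
# converges toward a mutually consistent (approximately equilibrated) assignment.

def bpr_time(flow, free_flow_time, capacity):
    return free_flow_time * (1.0 + 0.15 * (flow / capacity) ** 4)

def relax_assignment(demand=10_000, n_iter=50):
    flows = np.array([demand, 0.0])                # start: everyone on route 0
    t0 = np.array([20.0, 25.0])                    # free-flow times (min), assumed
    cap = np.array([4_000.0, 6_000.0])             # capacities (veh/h), assumed
    for k in range(1, n_iter + 1):
        times = bpr_time(flows, t0, cap)
        target = np.zeros(2)
        target[np.argmin(times)] = demand          # all-or-nothing to the currently faster route
        flows += (target - flows) / k              # method of successive averages
    return flows, bpr_time(flows, t0, cap)

flows, times = relax_assignment()
print(flows.round(0), times.round(1))              # route travel times roughly equalize
```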
Super Ensemble-based Aviation Turbulence Guidance (SEATG) for Air Traffic Management (ATM)
NASA Astrophysics Data System (ADS)
Kim, Jung-Hoon; Chan, William; Sridhar, Banavar; Sharman, Robert
2014-05-01
Super Ensemble (an ensemble of ten turbulence metrics from time-lagged ensemble members of weather forecast data)-based Aviation Turbulence Guidance (SEATG) is developed using the Weather Research and Forecasting (WRF) model and in-situ eddy dissipation rate (EDR) observations from equipped commercial aircraft over the contiguous United States. SEATG is a sequence of five procedures: weather modeling, calculating turbulence metrics, mapping to the EDR scale, evaluating metrics, and producing the final SEATG forecast. This follows a methodology similar to the operational Graphical Turbulence Guidance (GTG) with three major improvements. First, SEATG uses a higher-resolution (3-km) WRF model to capture cloud-resolving-scale phenomena. Second, SEATG computes turbulence metrics for multiple forecasts that are combined at the same valid time, resulting in a time-lagged ensemble of multiple turbulence metrics. Third, SEATG provides both deterministic and probabilistic turbulence forecasts to take into account weather uncertainties and user demands. It is found that the SEATG forecasts match well with observed radar reflectivity along a surface front as well as with convectively induced turbulence outside the clouds on 7-8 Sep 2012. Overall, the performance skill of deterministic SEATG against the observed EDR data during this period is superior to that of any single turbulence metric. Finally, probabilistic SEATG is used as an example application of turbulence forecasting for air-traffic management. In this study, a simple Wind-Optimal Route (WOR) passing through the potential areas of probabilistic SEATG and a Lateral Turbulence Avoidance Route (LTAR) taking the SEATG into account are calculated at z = 35,000 ft (z = 12 km) from Los Angeles to John F. Kennedy international airports. As a result, the WOR takes a total of 239 minutes, including 16 minutes in SEATG areas with 40% moderate-turbulence potential, while the LTAR takes a total of 252 minutes of travel time, so about 5% more fuel would be consumed to entirely avoid the moderate SEATG regions.
NASA Astrophysics Data System (ADS)
Radev, Dimitar; Lokshina, Izabella
2010-11-01
The paper examines the self-similar (or fractal) properties of real communication network traffic data over a wide range of time scales. These self-similar properties are very different from the properties of traditional models based on Poisson and Markov-modulated Poisson processes. Advanced fractal models of sequential generators and fixed-length sequence generators, together with efficient algorithms used to simulate the self-similar behavior of IP network traffic data, are developed and applied. Numerical examples are provided, and simulation results are obtained and analyzed.
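A classical way to generate approximately self-similar traffic, aggregating heavy-tailed ON/OFF sources, is sketched below as an illustration; it is not the specific sequential or fixed-length sequence generators developed in the paper.

```python
import numpy as np

# Aggregate many independent ON/OFF sources whose ON and OFF durations are
# heavy-tailed (Pareto with 1 < alpha < 2); the superposition exhibits
# long-range dependence in the resulting packet-count series.

def pareto_on_off_traffic(n_sources=200, n_slots=10_000, alpha=1.4,
                          min_duration=1.0, rate_per_source=1.0, seed=0):
    rng = np.random.default_rng(seed)
    load = np.zeros(n_slots)
    for _ in range(n_sources):
        t, on = 0, rng.random() < 0.5
        while t < n_slots:
            # Pareto-distributed state duration: x_m * U**(-1/alpha)
            duration = int(np.ceil(min_duration * (1.0 - rng.random()) ** (-1.0 / alpha)))
            if on:
                load[t: t + duration] += rate_per_source
            t += duration
            on = not on
    return load

traffic = pareto_on_off_traffic()
print(traffic.mean(), traffic.std())    # aggregate rate fluctuates over many time scales
```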
Realistic Simulation for Body Area and Body-To-Body Networks
Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D’Errico, Raffaele
2016-01-01
In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices. PMID:27104537
Demonstration of alternative traffic information collection and management technologies
NASA Astrophysics Data System (ADS)
Knee, Helmut E.; Smith, Cy; Black, George; Petrolino, Joe
2004-03-01
Many of the components associated with the deployment of Intelligent Transportation Systems (ITS) to support a traffic management center (TMC), such as remote-control cameras, traffic speed detectors, and variable message signs, have been available for many years. Their deployment, however, has been expensive, applied primarily to freeways and interstates, and concentrated in major US metropolitan areas rather than smaller cities. The Knoxville (Tennessee) Transportation Planning Organization is sponsoring a project that will test the integration of several technologies to estimate near-real-time traffic data and information that could eventually be used by travelers to make better and more informed decisions related to their travel needs. The uniqueness of this demonstration is that it will seek to predict traffic conditions based on cellular phone signals already being collected by cellular communications companies. Information about the average speed on various portions of local arterials and incident identification (incident location) will be collected and compared to similar data generated by "probe vehicles". Successful validation of the speed information generated from cell phone data will allow traffic data to be generated much more economically, using technologies that are minimally infrastructure-invasive. Furthermore, once validated, traffic information could be provided to the traveling public, allowing them to make better decisions about trips. More efficient trip planning and execution can reduce congestion and associated vehicle emissions. This paper discusses the technologies, the demonstration project, the project details, and future directions.
Field-free deterministic ultrafast creation of magnetic skyrmions by spin-orbit torques
NASA Astrophysics Data System (ADS)
Büttner, Felix; Lemesh, Ivan; Schneider, Michael; Pfau, Bastian; Günther, Christian M.; Hessing, Piet; Geilhufe, Jan; Caretta, Lucas; Engel, Dieter; Krüger, Benjamin; Viefhaus, Jens; Eisebitt, Stefan; Beach, Geoffrey S. D.
2017-11-01
Magnetic skyrmions are stabilized by a combination of external magnetic fields, stray field energies, higher-order exchange interactions and the Dzyaloshinskii-Moriya interaction (DMI). The last favours homochiral skyrmions, whose motion is driven by spin-orbit torques and is deterministic, which makes systems with a large DMI relevant for applications. Asymmetric multilayers of non-magnetic heavy metals with strong spin-orbit interactions and transition-metal ferromagnetic layers provide a large and tunable DMI. Also, the non-magnetic heavy metal layer can inject a vertical spin current with transverse spin polarization into the ferromagnetic layer via the spin Hall effect. This leads to torques that can be used to switch the magnetization completely in out-of-plane magnetized ferromagnetic elements, but the switching is deterministic only in the presence of a symmetry-breaking in-plane field. Although spin-orbit torques led to domain nucleation in continuous films and to stochastic nucleation of skyrmions in magnetic tracks, no practical means to create individual skyrmions controllably in an integrated device design at a selected position has been reported yet. Here we demonstrate that sub-nanosecond spin-orbit torque pulses can generate single skyrmions at custom-defined positions in a magnetic racetrack deterministically using the same current path as used for the shifting operation. The effect of the DMI implies that no external in-plane magnetic fields are needed for this aim. This implementation exploits a defect, such as a constriction in the magnetic track, that can serve as a skyrmion generator. The concept is applicable to any track geometry, including three-dimensional designs.
DOT National Transportation Integrated Search
2008-08-01
Report Abstract: The purpose of this guide is to aid the Texas Department of Transportation (TxDOT), Metropolitan Planning Organizations (MPO), and other state and local agencies to develop an effective traffic monitoring system for new major traff...
Wu, Jun; Ren, Cizao; Delfino, Ralph J; Chung, Judith; Wilhelm, Michelle; Ritz, Beate
2009-11-01
Preeclampsia is a major complication of pregnancy that can lead to substantial maternal and perinatal morbidity, mortality, and preterm birth. Increasing evidence suggests that air pollution adversely affects pregnancy outcomes. Yet few studies have examined how local traffic-generated emissions affect preeclampsia in addition to preterm birth. We examined effects of residential exposure to local traffic-generated air pollution on preeclampsia and preterm delivery (PTD). We identified 81,186 singleton birth records from four hospitals (1997-2006) in Los Angeles and Orange Counties, California (USA). We used a line-source dispersion model (CALINE4) to estimate individual exposure to local traffic-generated nitrogen oxides (NO(x)) and particulate matter < 2.5 μm in aerodynamic diameter (PM(2.5)) across the entire pregnancy. We used logistic regression to estimate effects of air pollution exposures on preeclampsia, PTD (gestational age < 37 weeks), moderate PTD (MPTD; gestational age < 35 weeks), and very PTD (VPTD; gestational age < 30 weeks). We observed elevated risks for preeclampsia and preterm birth from maternal exposure to local traffic-generated NO(x) and PM(2.5). The risk of preeclampsia increased 33% [odds ratio (OR) = 1.33; 95% confidence interval (CI), 1.18-1.49] and 42% (OR = 1.42; 95% CI, 1.26-1.59) for the highest NO(x) and PM(2.5) exposure quartiles, respectively. The risk of VPTD increased 128% (OR = 2.28; 95% CI, 2.15-2.42) and 81% (OR = 1.81; 95% CI, 1.71-1.92) for women in the highest NO(x) and PM(2.5) exposure quartiles, respectively. Exposure to local traffic-generated air pollution during pregnancy increases the risk of preeclampsia and preterm birth in Southern California women. These results provide further evidence that air pollution is associated with adverse reproductive outcomes.
Wei, Yu-Jia; He, Yu-Ming; Chen, Ming-Cheng; Hu, Yi-Nan; He, Yu; Wu, Dian; Schneider, Christian; Kamp, Martin; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei
2014-11-12
Single photons are attractive candidates of quantum bits (qubits) for quantum computation and are the best messengers in quantum networks. Future scalable, fault-tolerant photonic quantum technologies demand both stringently high levels of photon indistinguishability and generation efficiency. Here, we demonstrate deterministic and robust generation of pulsed resonance fluorescence single photons from a single semiconductor quantum dot using adiabatic rapid passage, a method robust against fluctuation of driving pulse area and dipole moments of solid-state emitters. The emitted photons are background-free, have a vanishing two-photon emission probability of 0.3% and a raw (corrected) two-photon Hong-Ou-Mandel interference visibility of 97.9% (99.5%), reaching a precision that places single photons at the threshold for fault-tolerant surface-code quantum computing. This single-photon source can be readily scaled up to multiphoton entanglement and used for quantum metrology, boson sampling, and linear optical quantum computing.
Hybrid deterministic-stochastic modeling of x-ray beam bowtie filter scatter on a CT system.
Liu, Xin; Hsieh, Jiang
2015-01-01
Knowledge of the scatter generated by the bowtie filter (i.e., x-ray beam compensator) is crucial for providing artifact-free images on CT scanners. Our approach is to use a hybrid deterministic-stochastic simulation to estimate the scatter level generated by a bowtie filter made of a material with a low atomic number. First, the major components of the CT system, such as the source, flat filter, bowtie filter, and body phantom, are built into a 3D model. The scattered photon fluence and the primary transmitted photon fluence are simulated by MCNP, a Monte Carlo simulation toolkit. The rejection of scattered photons by the post-patient collimator (anti-scatter grid) is simulated with an analytical formula. The biased sinogram is created by superimposing the scatter signal generated by the simulation onto the primary x-ray beam signal. Finally, images with artifacts are reconstructed from the biased signal. The effect of anti-scatter grid height on scatter rejection is also discussed and demonstrated.
Deterministic blade row interactions in a centrifugal compressor stage
NASA Technical Reports Server (NTRS)
Kirtley, K. R.; Beach, T. A.
1991-01-01
The three-dimensional viscous flow in a low speed centrifugal compressor stage is simulated using an average passage Navier-Stokes analysis. The impeller discharge flow is of the jet/wake type with low momentum fluid in the shroud-pressure side corner coincident with the tip leakage vortex. This nonuniformity introduces periodic unsteadiness in the vane frame of reference. The effect of such deterministic unsteadiness on the time-mean is included in the analysis through the average passage stress, which allows the analysis of blade row interactions. The magnitude of the divergence of the deterministic unsteady stress is of the order of the divergence of the Reynolds stress over most of the span, from the impeller trailing edge to the vane throat. Although the potential effects on the blade trailing edge from the diffuser vane are small, strong secondary flows generated by the impeller degrade the performance of the diffuser vanes.
Zhang, Binbin; Chen, Jun; Jin, Long; Deng, Weili; Zhang, Lei; Zhang, Haitao; Zhu, Minhao; Yang, Weiqing; Wang, Zhong Lin
2016-06-28
Wireless traffic volume detectors play a critical role in measuring traffic flow in real time for current Intelligent Traffic Systems. However, for battery-operated electronic devices, regularly replacing batteries remains a great challenge, especially for detectors that are widely distributed or installed in remote areas. Here, we report a self-powered active wireless traffic volume sensor that uses a rotating-disk-based hybridized nanogenerator, combining a triboelectric nanogenerator and an electromagnetic generator, as the sustainable power source. Operated at a rotating rate of 1000 rpm, the device delivered an output power of 17.5 mW, corresponding to a volume power density of 55.7 W/m(3) (Pd = P/V, see Supporting Information for detailed calculation) at a loading resistance of 700 Ω. The hybridized nanogenerator was demonstrated to effectively harvest energy from the wind generated by a vehicle moving through a tunnel. The delivered power is capable of triggering a counter via a wireless transmitter for real-time monitoring of the traffic volume in the tunnel. This study further expands the applications of triboelectric nanogenerators for high-performance ambient mechanical energy harvesting and as sustainable power sources for driving wireless traffic volume sensors.
DOT National Transportation Integrated Search
2010-04-21
To prepare for future air traffic growth, the Federal Aviation Administration (FAA), including its Joint Planning and Development Office (JPDO) and Air Traffic Organization, is planning and implementing the Next Generation Air Transportation System (...
Analysis of Air Traffic Track Data with the AutoBayes Synthesis System
NASA Technical Reports Server (NTRS)
Schumann, Johann Martin Philip; Cate, Karen; Lee, Alan G.
2010-01-01
The Next Generation Air Traffic System (NGATS) is aiming to provide substantial computer support for air traffic controllers. Algorithms for the accurate prediction of aircraft movements are of central importance for such software systems, but trajectory prediction has to work reliably in the presence of unknown parameters and uncertainties. We are using the AutoBayes program synthesis system to generate customized data analysis algorithms that process large sets of aircraft radar track data in order to estimate parameters and uncertainties. In this paper, we present how the tasks of finding structure in track data, estimating important parameters in climb trajectories, and detecting continuous descent approaches can be accomplished with compact task-specific AutoBayes specifications. We present an overview of the AutoBayes architecture and describe how its schema-based approach generates customized analysis algorithms, documented C/C++ code, and detailed mathematical derivations. Results of experiments with actual air traffic control data are discussed.
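The following is only a hedged stand-in for the kind of "finding structure in track data" task mentioned above, not AutoBayes output: a Gaussian mixture fit to synthetic track features with scikit-learn.

    # Hedged stand-in: cluster synthetic (vertical rate, ground speed) samples into
    # flight regimes with a Gaussian mixture. Feature values are invented.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    level   = np.column_stack([rng.normal(0,     100, 400), rng.normal(450, 20, 400)])
    climb   = np.column_stack([rng.normal(2000,  300, 300), rng.normal(300, 25, 300)])
    descent = np.column_stack([rng.normal(-1800, 300, 300), rng.normal(280, 25, 300)])
    X = np.vstack([level, climb, descent])      # ft/min, knots (synthetic)

    gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(X)
    labels = gmm.predict(X)                     # regime label per radar track sample
    print(gmm.means_)                           # estimated mode centers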
Högnäs, G; Tuomi, S; Veltel, S; Mattila, E; Murumägi, A; Edgren, H; Kallioniemi, O; Ivaska, J
2012-01-01
Aneuploidy is frequently detected in solid tumors but the mechanisms regulating the generation of aneuploidy and their relevance in cancer initiation remain under debate and are incompletely characterized. Spatial and temporal regulation of integrin traffic is critical for cell migration and cytokinesis. Impaired integrin endocytosis, because of the loss of Rab21 small GTPase or mutations in the integrin β-subunit cytoplasmic tail, induces failure of cytokinesis in vitro. Here, we describe that repeatedly failed cytokinesis, because of impaired traffic, is sufficient to trigger the generation of aneuploid cells, which display characteristics of oncogenic transformation in vitro and are tumorigenic in vivo. Furthermore, in an in vivo mouse xenograft model, non-transformed cells with impaired integrin traffic formed tumors with a long latency. More detailed investigation of these tumors revealed that the tumor cells were aneuploid. Therefore, abnormal integrin traffic was linked with generation of aneuploidy and cell transformation also in vivo. In human prostate and ovarian cancer samples, downregulation of Rab21 correlates with increased malignancy. Loss-of-function experiments demonstrate that long-term depletion of Rab21 is sufficient to induce chromosome number aberrations in normal human epithelial cells. These data are the first to demonstrate that impaired integrin traffic is sufficient to induce conversion of non-transformed cells to tumorigenic cells in vitro and in vivo. PMID:22120710
Evaluation of Deployment Challenges of Wireless Sensor Networks at Signalized Intersections
Azpilicueta, Leyre; López-Iturri, Peio; Aguirre, Erik; Martínez, Carlos; Astrain, José Javier; Villadangos, Jesús; Falcone, Francisco
2016-01-01
With the growing demand for Intelligent Transportation Systems (ITS) for safer and more efficient transportation, research on and development of vehicular communication systems have increased considerably in recent years. The use of wireless networks in vehicular environments has grown exponentially. However, it is highly important to analyze radio propagation prior to the deployment of a wireless sensor network in such complex scenarios. In this work, the radio wave characterization for ISM 2.4 GHz and 5 GHz Wireless Sensor Networks (WSNs), deployed by taking advantage of existing traffic light infrastructure, has been assessed. By means of an in-house developed 3D ray launching algorithm, the impact of the topology as well as the urban morphology of the environment has been analyzed, emulating realistic operation in the framework of the scenario. The complexity of the scenario, an urban intersection with traffic lights, vehicles, people, buildings, and vegetation, makes channel characterization with accurate models necessary before the deployment of wireless networks. A measurement campaign has been conducted emulating the operation of the system in the vicinity of pedestrians as well as nearby vehicles. A real-time interactive application has been developed and tested in order to visualize and monitor traffic as well as pedestrian user location and behavior. Results show that the use of deterministic tools in WSN deployment can aid in providing optimal layouts in terms of coverage, capacity and energy efficiency of the network. PMID:27455270
Elliptical quantum dots as on-demand single photons sources with deterministic polarization states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teng, Chu-Hsiang; Demory, Brandon; Ku, Pei-Cheng, E-mail: peicheng@umich.edu
In quantum information, control of the single photon's polarization is essential. Here, we demonstrate single photon generation in a pre-programmed and deterministic polarization state, on a chip-scale platform, utilizing site-controlled elliptical quantum dots (QDs) synthesized by a top-down approach. The polarization from the QD emission is found to be linear with a high degree of linear polarization and parallel to the long axis of the ellipse. Single photon emission with orthogonal polarizations is achieved, and the dependence of the degree of linear polarization on the QD geometry is analyzed.
Crash characteristics at work zones
DOT National Transportation Integrated Search
2001-05-01
Work zones tend to cause hazardous conditions for vehicle drivers and construction workers since they generate conflicts between construction activities and the traffic, and therefore aggravate the existing traffic conditions.
Distributed Traffic Complexity Management by Preserving Trajectory Flexibility
NASA Technical Reports Server (NTRS)
Idris, Husni; Vivona, Robert A.; Garcia-Chico, Jose-Luis; Wing, David J.
2007-01-01
In order to handle the expected increase in air traffic volume, the next generation air transportation system is moving towards a distributed control architecture, in which ground-based service providers such as controllers and traffic managers and air-based users such as pilots share responsibility for aircraft trajectory generation and management. This paper presents preliminary research investigating a distributed trajectory-oriented approach to manage traffic complexity, based on preserving trajectory flexibility. The underlying hypotheses are that preserving trajectory flexibility autonomously by aircraft naturally achieves the aggregate objective of avoiding excessive traffic complexity, and that trajectory flexibility is increased by collaboratively minimizing trajectory constraints without jeopardizing the intended air traffic management objectives. This paper presents an analytical framework in which flexibility is defined in terms of robustness and adaptability to disturbances, and preliminary metrics are proposed that can be used to preserve trajectory flexibility. The hypothesized impacts are illustrated through analyzing a trajectory solution space in a simple scenario with only speed as a degree of freedom, and in constraint situations involving meeting multiple times of arrival and resolving conflicts.
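A toy, assumption-laden sketch of a robustness-style flexibility measure in the spirit described above (the paper's own metrics differ): count how many candidate required times of arrival remain reachable using speed control alone.

    # Illustration only, not the paper's metric: number of candidate required
    # times of arrival (RTAs) at a fix that remain feasible with speed alone,
    # given distance-to-go and a speed envelope.
    def feasible_rta_count(dist_nm, v_min_kt, v_max_kt, candidate_rtas_min):
        t_fast = 60.0 * dist_nm / v_max_kt      # earliest achievable arrival (min)
        t_slow = 60.0 * dist_nm / v_min_kt      # latest achievable arrival (min)
        return sum(t_fast <= t <= t_slow for t in candidate_rtas_min)

    # 120 nm to go, speed envelope 380-470 kt, one-minute RTA slots from now.
    print(feasible_rta_count(120, 380, 470, candidate_rtas_min=range(10, 30)))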
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.
2004-01-01
We successfully applied deterministic deconvolution to real ground-penetrating radar (GPR) data by using the source wavelet that was generated in and transmitted through air as the operator. The GPR data were collected with 400-MHz antennas on a bench adjacent to a cleanly exposed quarry face. The quarry site is characterized by horizontally bedded carbonate strata with shale partings. In order to provide ground truth for this deconvolution approach, 23 conductive rods were drilled into the quarry face at key locations. The steel rods provided critical information for: (1) correlation between reflections on GPR data and geologic features exposed in the quarry face, (2) GPR resolution limits, (3) accuracy of velocities calculated from common midpoint data and (4) identification of any multiples. Comparing the results of deconvolved data with non-deconvolved data demonstrates the effectiveness of deterministic deconvolution in low dielectric-loss media for increased accuracy of velocity models (improved at least 10-15% in our study after deterministic deconvolution), increased vertical and horizontal resolution of specific geologic features and more accurate representation of geologic features as confirmed from detailed study of the adjacent quarry wall. © 2004 Elsevier B.V. All rights reserved.
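A minimal sketch of deterministic deconvolution with a known source wavelet, implemented as water-level-stabilized spectral division; the wavelet and parameter values are illustrative, not those used in the study.

    # Frequency-domain deterministic deconvolution: divide the trace spectrum by
    # the (measured) source-wavelet spectrum, stabilized with a water level.
    import numpy as np

    def deterministic_deconvolve(trace, wavelet, water_level=0.01):
        n = len(trace)
        T = np.fft.rfft(trace, n)
        W = np.fft.rfft(wavelet, n)
        denom = np.maximum(np.abs(W), water_level * np.abs(W).max())
        return np.fft.irfft(T * np.conj(W) / denom**2, n)   # reflectivity estimate

    # Synthetic check: two reflectors convolved with a short wavelet are recovered.
    n = 512
    wavelet = np.array([1.0, -0.5, 0.2, -0.1])              # stand-in source pulse
    reflectivity = np.zeros(n); reflectivity[[100, 180]] = [1.0, -0.6]
    trace = np.fft.irfft(np.fft.rfft(reflectivity, n) * np.fft.rfft(wavelet, n), n)
    est = deterministic_deconvolve(trace, wavelet)
    print(round(est[100], 3), round(est[180], 3))           # ~1.0 and ~-0.6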
Vehicle Modeling for Future Generation Transportation Simulation
DOT National Transportation Integrated Search
2009-05-10
Recent developments of inter-vehicular wireless communication technologies have motivated many innovative applications aiming at significantly increasing traffic throughput and improving highway safety. Powerful traffic simulation is an indispensable ...
Automated Announcements of Approaching Emergency Vehicles
NASA Technical Reports Server (NTRS)
Bachelder, Aaron; Foster, Conrad
2006-01-01
Street intersections that are equipped with traffic lights would also be equipped with means for generating audible announcements of approaching emergency vehicles, according to a proposal. The means to generate the announcements would be implemented in the intersection-based subsystems of emergency traffic-light-preemption systems like those described in the two immediately preceding articles and in "Systems Would Preempt Traffic Lights for Emergency Vehicles" (NPO-30573), NASA Tech Briefs, Vol. 28, No. 10 (October 2004), page 36. Preempting traffic lights is not, by itself, sufficient to warn pedestrians at affected intersections that emergency vehicles are approaching. Automated visual displays that warn of approaching emergency vehicles can be helpful as a supplement to preemption of traffic lights, but experience teaches that, for a variety of reasons, pedestrians often do not see such displays. Moreover, in noisy and crowded urban settings, the lights and sirens on emergency vehicles are often not noticed until a few seconds before the vehicles arrive. According to the proposal, the traffic-light preemption subsystem at each intersection would generate an audible announcement (for example, "Emergency vehicle approaching, please clear intersection") whenever a preemption was triggered. The subsystem would estimate the time of arrival of an approaching emergency vehicle by use of vehicle identity, position, and time data from one or more sources that could include units connected to traffic loops and/or transponders connected to diagnostic and navigation systems in participating emergency vehicles. The intersection-based subsystem would then start the announcement far enough in advance to enable pedestrians to leave the roadway before any emergency vehicles arrive.
Global phenomena from local rules: Peer-to-peer networks and crystal steps
NASA Astrophysics Data System (ADS)
Finkbiner, Amy
Even simple, deterministic rules can generate interesting behavior in dynamical systems. This dissertation examines some real-world systems for which fairly simple, locally defined rules yield useful or interesting properties in the system as a whole. In particular, we study routing in peer-to-peer networks and the motion of crystal steps. Peers can vary by three orders of magnitude in their capacities to process network traffic. This heterogeneity inspires our use of "proportionate load balancing," where each peer provides resources in proportion to its individual capacity. We provide an implementation that employs small, local adjustments to bring the entire network into a global balance. Analytically and through simulations, we demonstrate the effectiveness of proportionate load balancing on two routing methods for de Bruijn graphs, introducing a new "reversed" routing method which performs better than standard forward routing in some cases. The prevalence of peer-to-peer applications prompts companies to locate the hosts participating in these networks. We explore the use of supervised machine learning to identify peer-to-peer hosts, without using application-specific information. We introduce a model for "triples," which exploits information about nearly contemporaneous flows to give a statistical picture of a host's activities. We find that triples, together with measurements of inbound vs. outbound traffic, can capture most of the behavior of peer-to-peer hosts. An understanding of crystal surface evolution is important for the development of modern nanoscale electronic devices. The most commonly studied surface features are steps, which form at low temperatures when the crystal is cut close to a plane of symmetry. Step bunching, when steps arrange into widely separated clusters of tightly packed steps, is one important step phenomenon. We analyze a discrete model for crystal steps, in which the motion of each step depends on the two steps on either side of it. We find a time-dependence term for the motion that does not appear in continuum models, and we determine an explicit dependence on step number.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-31
... claims that MSS networks provide the only means to create a next generation air traffic management (ATM) ... articulated their plans to offer high-speed data services, especially in connection with terrestrial networks ...
NASA Astrophysics Data System (ADS)
Fu, Xiangwen; Liu, Junfeng; Ban-Weiss, George A.; Zhang, Jiachen; Huang, Xin; Ouyang, Bin; Popoola, Olalekan; Tao, Shu
2017-09-01
Street canyons are ubiquitous in urban areas. Traffic-related air pollutants in street canyons can adversely affect human health. In this study, an urban-scale traffic pollution dispersion model is developed considering street distribution, canyon geometry, background meteorology, traffic assignment, traffic emissions and air pollutant dispersion. In the model, vehicle exhausts generated from traffic flows first disperse inside street canyons along the micro-scale wind field generated by a computational fluid dynamics (CFD) model. Then, pollutants leave the street canyon and further disperse over the urban area. On the basis of this model, the effects of canyon geometry on the distribution of NOx and CO from traffic emissions were studied over the center of Beijing. We found that an increase in building height leads to heavier pollution inside canyons and lower pollution outside canyons at pedestrian level, resulting in higher domain-averaged concentrations over the area. In addition, canyons with highly even or highly uneven building heights on each side of the street tend to lower the urban-scale air pollution concentrations at pedestrian level. Further, increasing street widths tends to lead to lower pollutant concentrations by reducing emissions and enhancing ventilation simultaneously. Our results indicate that canyon geometry strongly influences human exposure to traffic pollutants in the populated urban area. Carefully planning street layout and canyon geometry while considering traffic demand as well as local weather patterns may significantly reduce inhalation of unhealthy air by urban residents.
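As a hedged illustration of the urban-scale dispersion step only (not the CFD canyon model or the paper's scheme), the sketch below evaluates a textbook Gaussian plume with invented emission and dispersion parameters.

    # Ground-reflected Gaussian plume for pollutant mass leaving a street canyon,
    # treated as an elevated source; all parameter values are hypothetical.
    import numpy as np

    def gaussian_plume(q_g_per_s, u_m_per_s, y, z, sigma_y, sigma_z, h_eff=15.0):
        """Concentration in g/m^3 at crosswind offset y and height z."""
        lateral = np.exp(-y**2 / (2 * sigma_y**2))
        vertical = (np.exp(-(z - h_eff)**2 / (2 * sigma_z**2)) +
                    np.exp(-(z + h_eff)**2 / (2 * sigma_z**2)))
        return q_g_per_s / (2 * np.pi * u_m_per_s * sigma_y * sigma_z) * lateral * vertical

    # Receptor at pedestrian height (1.5 m) on the plume axis; sigma_y and sigma_z
    # would normally come from stability-class curves at the downwind distance,
    # but fixed values are used here for brevity.
    c = gaussian_plume(q_g_per_s=0.5, u_m_per_s=3.0, y=0.0, z=1.5,
                       sigma_y=20.0, sigma_z=10.0)
    print(c * 1e6, "µg/m^3")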
Automatic 3D high-fidelity traffic interchange modeling using 2D road GIS data
NASA Astrophysics Data System (ADS)
Wang, Jie; Shen, Yuzhong
2011-03-01
3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models were generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially for those existing in the real world. A real road network contains various elements such as road segments, road intersections and traffic interchanges. Among them, traffic interchanges present the most challenges to model due to their complexity and the lack of height information (vertical position) of traffic interchanges in existing road GIS data. This paper proposes a novel approach that can automatically produce 3D high-fidelity road network models, including traffic interchange models, from real 2D road GIS data that mainly contain road centerline information. The proposed method consists of several steps. The raw road GIS data are first preprocessed to extract road network topology, merge redundant links, and classify road types. Then overlapped points in the interchanges are detected and their elevations are determined based on a set of level estimation rules. Parametric representations of the road centerlines are then generated through link segmentation and fitting, and they have the advantages of arbitrary levels of detail with reduced memory usage. Finally a set of civil engineering rules for road design (e.g., cross slope, superelevation) are selected and used to generate realistic road surfaces. In addition to traffic interchange modeling, the proposed method also applies to other more general road elements. Preliminary results show that the proposed method is highly effective and useful in many applications.
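A small sketch of the parametric-centerline step only, using a SciPy smoothing spline on made-up centerline vertices; this is not the paper's implementation.

    # Fit a parametric smoothing spline to a polyline of centerline vertices so
    # the road can be resampled at any level of detail. Coordinates are invented.
    import numpy as np
    from scipy.interpolate import splprep, splev

    x = np.array([0.0, 50.0, 110.0, 180.0, 260.0, 330.0])    # easting (m)
    y = np.array([0.0, 20.0,  60.0, 130.0, 170.0, 180.0])    # northing (m)

    tck, u = splprep([x, y], s=10.0)              # parametric smoothing spline
    dense = np.linspace(0.0, 1.0, 200)            # resample at a chosen level of detail
    xs, ys = splev(dense, tck)

    # Arc length of the resampled centerline, e.g. for tabulating superelevation.
    print(np.sum(np.hypot(np.diff(xs), np.diff(ys))))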
Automatic drawing for traffic marking with MMS LIDAR intensity
NASA Astrophysics Data System (ADS)
Takahashi, G.; Takeda, H.; Shimano, Y.
2014-05-01
Upgrading the database of CYBER JAPAN has been strategically promoted because the "Basic Act on Promotion of Utilization of Geographical Information" was enacted in May 2007. In particular, there is a high demand for road information that comprises a framework in this database. Therefore, road inventory mapping work has to be accurate and eliminate variation caused by individual human operators. Further, the large number of traffic markings that are periodically maintained and possibly changed requires an efficient method for updating spatial data. Currently, we apply manual photogrammetry drawing for mapping traffic markings. However, this method is not sufficiently efficient in terms of the required productivity, and data variation can arise from individual operators. In contrast, Mobile Mapping Systems (MMS) and high-density Laser Imaging Detection and Ranging (LIDAR) scanners are rapidly gaining popularity. The aim in this study is to build an efficient method for automatically drawing traffic markings using MMS LIDAR data. The key idea in this method is extracting lines using a Hough transform strategically focused on changes in local reflection intensity along scan lines; note that this method processes every traffic marking. In this paper, we discuss a highly accurate and non-human-operator-dependent method that applies the following steps: (1) Binarizing LIDAR points by intensity and extracting higher intensity points; (2) Generating a Triangulated Irregular Network (TIN) from the higher intensity points; (3) Deleting arcs by length and generating outline polygons on the TIN; (4) Generating buffers from the outline polygons; (5) Extracting points from the buffers using the original LIDAR points; (6) Extracting local-intensity-changing points along scan lines using the extracted points; (7) Extracting lines from the intensity-changing points through a Hough transform; and (8) Connecting lines to generate automated traffic marking mapping data.
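A minimal, point-based Hough transform illustrating step (7) above; it is a simplified stand-in, not the production mapping workflow.

    # Vote each intensity-change point into (theta, rho) bins and return the
    # strongest line x*cos(theta) + y*sin(theta) = rho. Data below are synthetic.
    import numpy as np

    def hough_strongest_line(points, n_theta=180, rho_res=0.1):
        thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
        rho = points[:, :1] * np.cos(thetas) + points[:, 1:] * np.sin(thetas)
        idx = np.round(rho / rho_res).astype(int)
        offset = idx.min()
        acc = np.zeros((idx.max() - offset + 1, n_theta), dtype=int)
        cols = np.broadcast_to(np.arange(n_theta), idx.shape)
        np.add.at(acc, (idx - offset, cols), 1)                 # accumulate votes
        r, t = np.unravel_index(acc.argmax(), acc.shape)
        return thetas[t], (r + offset) * rho_res

    # Noisy points along a marking edge y = 0.5*x + 2 (coordinates in metres).
    rng = np.random.default_rng(4)
    xs = rng.uniform(0, 20, 200)
    pts = np.column_stack([xs, 0.5 * xs + 2 + rng.normal(0, 0.05, 200)])
    print(hough_strongest_line(pts))        # recovered (theta, rho) of the edge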
Modeling the coevolution of topology and traffic on weighted technological networks
NASA Astrophysics Data System (ADS)
Xie, Yan-Bo; Wang, Wen-Xu; Wang, Bing-Hong
2007-02-01
For many technological networks, the network structures and the traffic taking place on them mutually interact. The demands of traffic increment spur the evolution and growth of the networks to maintain their normal and efficient functioning. In parallel, a change of the network structure leads to redistribution of the traffic. In this paper, we perform an extensive numerical and analytical study, extending results of Wang [Phys. Rev. Lett. 94, 188702 (2005)]. By introducing a general strength-coupling interaction driven by the traffic increment between any pair of vertices, our model generates networks of scale-free distributions of strength, weight, and degree. In particular, the obtained nonlinear correlation between vertex strength and degree, and the disassortative property demonstrate that the model is capable of characterizing weighted technological networks. Moreover, the generated graphs possess both dense clustering structures and an anticorrelation between vertex clustering and degree, which are widely observed in real-world networks. The corresponding theoretical predictions are well consistent with simulation results.
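A hedged sketch of a strength-driven growth rule in the same spirit (a BBV-like model; the paper's exact dynamics may differ): new vertices attach preferentially by strength, and each attachment triggers a local traffic/weight increment delta redistributed over the target's existing edges.

    # Strength-coupled growth sketch; n_nodes, m, w0 and delta are illustrative.
    import numpy as np
    from collections import defaultdict

    def grow(n_nodes, m=2, w0=1.0, delta=1.0, seed=5):
        rng = np.random.default_rng(seed)
        w = defaultdict(float)                 # undirected edge weights, key (i, j), i < j
        s = np.zeros(n_nodes)                  # vertex strengths
        for i in range(m + 1):                 # small fully connected seed graph
            for j in range(i + 1, m + 1):
                w[(i, j)] = w0; s[i] += w0; s[j] += w0
        for new in range(m + 1, n_nodes):
            p = s[:new] / s[:new].sum()
            targets = rng.choice(new, size=m, replace=False, p=p)  # strength-preferential
            for t in targets:
                t, st = int(t), s[int(t)]
                for e in [e for e in w if t in e]:                 # redistribute delta
                    inc = delta * w[e] / st                        # over t's old edges
                    w[e] += inc; s[e[0]] += inc; s[e[1]] += inc
                w[(min(new, t), max(new, t))] = w0
                s[new] += w0; s[t] += w0
        return s

    strengths = grow(1000)
    print(strengths.max() / strengths.mean())  # broad, heavy-tailed strengths expected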
Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto
2005-08-01
Consider a medium characterized by N points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary d-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule to go to the nearest point which has not been visited in the preceding $\mu$ steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial nonperiodic part of $t$ steps (transient) and a final periodic part of $p$ steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function $I_d = I_{1/4}[1/2, (d+1)/2]$. The joint distribution $S_N^{(\mu,d)}(t,p)$ is relevant, and the marginal distributions previously studied are particular cases. We show that, for the memory-less deterministic tourist walk in the Euclidean space, this distribution is $S_\infty^{(1,d)}(t,p) = \Gamma(1 + I_d^{-1})\,(t + I_d^{-1})/\Gamma(t + p + I_d^{-1})\,\delta_{p,2}$, where $t = 0, 1, 2, \ldots$, $\Gamma(z)$ is the gamma function and $\delta_{i,j}$ is the Kronecker delta. The mean-field models are the random link models, which correspond to $d \to \infty$, and the random map model which, even for $\mu = 0$, presents a nontrivial cycle distribution [$S_N^{(0,\mathrm{rm})}(p) \propto p^{-1}$]: $S_N^{(0,\mathrm{rm})}(t,p) = \Gamma(N)/\{\Gamma[N + 1 - (t+p)]\,N^{t+p}\}$. The fundamental quantities are the number of explored points $n_e = t + p$ and $I_d$. Although the obtained distributions are simple, they do not follow straightforwardly and they have been validated by numerical experiments.
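A direct simulation makes the (t, p) statistics above concrete; the memory convention used here (a tabu window over the last mu visited points) is one common choice and is stated as an assumption.

    # Deterministic tourist walk: from each start, jump to the nearest point not
    # visited in the previous mu steps; record transient length t and period p.
    import numpy as np

    def tourist_walk(d, start, mu=1):
        path, seen = [start], {}
        while True:
            state = tuple(path[-max(mu, 1):])       # this window determines the future
            if state in seen:
                return seen[state], len(path) - 1 - seen[state]   # (t, p)
            seen[state] = len(path) - 1
            tabu = set(path[-mu:]) if mu > 0 else set()
            here = path[-1]
            path.append(min((j for j in range(len(d)) if j not in tabu),
                            key=lambda j: d[here, j]))

    rng = np.random.default_rng(6)
    pts = rng.random((500, 2))                      # N points on the unit square, d = 2
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    np.fill_diagonal(dist, np.inf)                  # never "jump" to the current point
    print([tourist_walk(dist, s, mu=1) for s in range(5)])   # (t, p) pairs; p = 2 here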
NASA Astrophysics Data System (ADS)
Sinner, K.; Teasley, R. L.
2016-12-01
Groundwater models serve as integral tools for understanding flow processes and informing stakeholders and policy makers in management decisions. Historically, these models tended towards a deterministic nature, relying on historical data to predict and inform future decisions based on model outputs. This research works towards developing a stochastic method of modeling recharge inputs from pipe main break predictions in an existing groundwater model, which subsequently generates desired outputs incorporating future uncertainty rather than relying on deterministic data. The case study for this research is the Barton Springs segment of the Edwards Aquifer near Austin, Texas. Researchers and water resource professionals have modeled the Edwards Aquifer for decades due to its high water quality, fragile ecosystem, and stakeholder interest. The original case study and model that this research is built upon was developed as a co-design problem with regional stakeholders, and the model outcomes are generated specifically for communication with policy makers and managers. Recently, research in the Barton Springs segment demonstrated a significant contribution of urban, or anthropogenic, recharge to the aquifer, particularly during dry periods, using deterministic data sets. Due to the social and ecological importance of urban water loss to recharge, this study develops an evaluation method to help predict pipe breaks and their related recharge contribution within the Barton Springs segment of the Edwards Aquifer. To benefit groundwater management decision processes, the performance measures captured in the model results, such as springflow, head levels, storage, and others, were determined by previous work in elicitation of problem framing to determine stakeholder interests and concerns. The results of the previous deterministic model and the stochastic model are compared to determine gains to stakeholder knowledge through the additional modeling.
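A hedged sketch of the stochastic-input idea; the Poisson break rate, loss per break, and recharge fraction below are hypothetical placeholders, not the study's calibrated values.

    # Ensemble of stochastic pipe-break recharge series that could be fed to an
    # existing groundwater model in place of a single deterministic series.
    import numpy as np

    rng = np.random.default_rng(7)
    n_realizations, n_years = 1000, 20
    breaks_per_year = 120                         # hypothetical mean break count
    loss_per_break_af = 2.5                       # hypothetical acre-feet lost per break
    recharge_fraction = 0.3                       # hypothetical share reaching the aquifer

    breaks = rng.poisson(breaks_per_year, size=(n_realizations, n_years))
    recharge_af = breaks * loss_per_break_af * recharge_fraction

    # Ensemble statistics of the stochastic recharge input.
    print(recharge_af.mean(), np.percentile(recharge_af.sum(axis=1), [5, 50, 95]))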
A preliminary estimate of future communications traffic for the electric power system
NASA Technical Reports Server (NTRS)
Barnett, R. M.
1981-01-01
Diverse new generator technologies that use renewable energy and measures to improve operational efficiency throughout existing electric power systems are presented. A model utility is described, and the information transfer requirements imposed by the incorporation of dispersed storage and generation technologies and by the implementation of more extensive energy management are estimated. An example of possible traffic for an assumed system is provided, along with an approach that can be applied to other systems, control configurations, or dispersed storage and generation penetrations.
NASA Technical Reports Server (NTRS)
1973-01-01
The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models: the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) satellite location criteria, including the resulting distribution of the satellite population; (3) geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).
NASA Astrophysics Data System (ADS)
Naseri Kouzehgarani, Asal
2009-12-01
Most models of aircraft trajectories are non-linear and stochastic in nature, and their internal parameters are often poorly defined. The ability to model, simulate and analyze realistic air traffic management conflict detection scenarios in a scalable, composable, multi-aircraft fashion is an extremely difficult endeavor. Accurate techniques for aircraft mode detection are critical in order to enable the precise projection of aircraft conflicts, and for the enactment of altitude separation resolution strategies. Conflict detection is an inherently probabilistic endeavor; our ability to detect conflicts in a timely and accurate manner over a fixed time horizon is traded off against the increased human workload created by false alarms, that is, situations that would not develop into an actual conflict or would resolve naturally in the appropriate time horizon, thereby introducing a measure of probabilistic uncertainty in any decision aid fashioned to assist air traffic controllers. The interaction of the continuous dynamics of the aircraft, used for prediction purposes, with the discrete conflict detection logic gives rise to the hybrid nature of the overall system. The introduction of the probabilistic element, common to decision alerting and aiding devices, places the conflict detection and resolution problem in the domain of probabilistic hybrid phenomena. A hidden Markov model (HMM) has two stochastic components: a finite-state Markov chain and a finite set of output probability distributions; in other words, it is an unobservable (hidden) stochastic process that can be observed only through another set of stochastic processes that generate the sequence of observations. The problem of self separation in distributed air traffic management reduces to the ability of aircraft to communicate state information to neighboring aircraft, as well as to model the evolution of aircraft trajectories between communications, in the presence of probabilistic uncertain dynamics as well as partially observable and uncertain data. We introduce the Hybrid Hidden Markov Modeling (HHMM) formalism to enable the prediction of the stochastic aircraft states (and thus, potential conflicts), by combining elements of the probabilistic timed input output automaton and the partially observable Markov decision process frameworks, along with the novel addition of a Markovian scheduler to remove the non-deterministic elements arising from the enabling of several actions simultaneously. Comparisons of aircraft in level, climbing/descending and turning flight are performed, and unknown flight track data are evaluated probabilistically against the tuned model in order to assess the effectiveness of the model in detecting the switch between multiple flight modes for a given aircraft. This also allows for the generation of a probabilistic distribution over the execution traces of the hybrid hidden Markov model, which then enables the prediction of the states of aircraft based on partially observable and uncertain data. Based on the composition properties of the HHMM, we study a decentralized air traffic system where aircraft move along streams and can perform cruise, accelerate, climb and turn maneuvers. We develop a common decentralized policy for conflict avoidance with spatially distributed agents (aircraft in the sky) and assure its safety properties via correctness proofs.
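A minimal discrete-observation HMM forward filter illustrates the basic machinery referred to above; the modes, transition matrix, and observation model are invented, and the dissertation's HHMM adds hybrid continuous dynamics and a Markovian scheduler on top of this.

    # Forward filter for flight-mode estimation (level / climb / descend) over
    # quantized vertical-rate observations. All probabilities are placeholders.
    import numpy as np

    A = np.array([[0.90, 0.05, 0.05],        # P(next mode | current mode)
                  [0.10, 0.85, 0.05],
                  [0.10, 0.05, 0.85]])
    B = np.array([[0.80, 0.10, 0.10],        # P(symbol | mode); symbols:
                  [0.15, 0.80, 0.05],        # 0 ~ near-zero, 1 ~ positive,
                  [0.15, 0.05, 0.80]])       # 2 ~ negative vertical rate
    pi = np.array([0.6, 0.2, 0.2])

    def forward_filter(obs):
        alpha = pi * B[:, obs[0]]
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]    # predict, then correct
            alpha /= alpha.sum()
        return alpha                         # P(mode at final step | all observations)

    print(forward_filter([0, 0, 1, 1, 1, 1, 1]))   # mass concentrates on the climb mode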
Throughput analysis of the IEEE 802.4 token bus standard under heavy load
NASA Technical Reports Server (NTRS)
Pang, Joseph; Tobagi, Fouad
1987-01-01
It has become clear in the last few years that there is a trend towards integrated digital services. Parallel to the development of public Integrated Services Digital Network (ISDN) is service integration in the local area (e.g., a campus, a building, an aircraft). The types of services to be integrated depend very much on the specific local environment. However, applications tend to generate data traffic belonging to one of two classes. According to IEEE 802.4 terminology, the first major class of traffic is termed synchronous, such as packetized voice and data generated from other applications with real-time constraints, and the second class is called asynchronous which includes most computer data traffic such as file transfer or facsimile. The IEEE 802.4 token bus protocol which was designed to support both synchronous and asynchronous traffic is examined. The protocol is basically a timer-controlled token bus access scheme. By a suitable choice of the design parameters, it can be shown that access delay is bounded for synchronous traffic. As well, the bandwidth allocated to asynchronous traffic can be controlled. A throughput analysis of the protocol under heavy load with constant channel occupation of synchronous traffic and constant token-passing times is presented.
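A toy illustration of the timer-controlled access idea (not a full IEEE 802.4 implementation): synchronous traffic receives a fixed token-hold time per visit, while asynchronous traffic is throttled by a target rotation time, so its bandwidth is the controllable leftover.

    # Token-bus access sketch; timer values and station counts are illustrative.
    TARGET_ROTATION = 20.0      # ms, design parameter bounding rotation time
    SYNC_HOLD = 2.0             # ms of synchronous transmission per station per visit

    def simulate(n_stations=5, async_backlog_ms=100.0, rotations=50):
        last_visit = [0.0] * n_stations
        clock, async_sent = 0.0, 0.0
        for _ in range(rotations):
            for s in range(n_stations):
                rotation_time = clock - last_visit[s]      # time since token last arrived at s
                last_visit[s] = clock
                clock += SYNC_HOLD                         # bounded synchronous access
                spare = TARGET_ROTATION - rotation_time    # credit for asynchronous data
                if spare > 0 and async_sent < async_backlog_ms:
                    send = min(spare, async_backlog_ms - async_sent)
                    clock += send
                    async_sent += send
        return clock, async_sent

    total_ms, async_ms = simulate()
    print(total_ms, async_ms)   # synchronous access stays bounded; async gets the leftover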
De Los Ríos, F. A.; Paluszny, M.
2015-01-01
We consider some methods to extract information about the rotator cuff based on magnetic resonance images; the study aims to define an alternative method of display that might facilitate the detection of partial tears in the supraspinatus tendon. Specifically, we use families of ellipsoidal triangular patches to cover the humerus head near the affected area. These patches are textured and displayed with the information of the magnetic resonance images using the trilinear interpolation technique. For the generation of points to texture each patch, we propose a new random statistical method that guarantees a uniform distribution of the points. Its computational cost, defined as the average computing time to generate a fixed number of points, is significantly lower than that of deterministic and other standard statistical techniques. PMID:25650281
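One standard way to draw uniformly distributed points on a triangular patch is shown below as a generic illustration; the paper's own sampling scheme may differ.

    # Uniform sampling in a triangle: fold the two barycentric parameters back
    # into the unit simplex so the density stays uniform.
    import numpy as np

    def sample_triangle(v0, v1, v2, n, rng):
        r1, r2 = rng.random(n), rng.random(n)
        flip = r1 + r2 > 1.0                      # reflect points that fall outside
        r1[flip], r2[flip] = 1.0 - r1[flip], 1.0 - r2[flip]
        return v0 + np.outer(r1, v1 - v0) + np.outer(r2, v2 - v0)

    rng = np.random.default_rng(8)
    pts = sample_triangle(np.array([0.0, 0.0, 0.0]),
                          np.array([1.0, 0.0, 0.0]),
                          np.array([0.0, 1.0, 0.0]), 10000, rng)
    print(pts.mean(axis=0))                       # ~ triangle centroid (1/3, 1/3, 0)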
Intelligent Manufacturing of Commercial Optics Final Report CRADA No. TC-0313-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, J. S.; Pollicove, H.
The project combined the research and development efforts of LLNL and the University of Rochester Center for Manufacturing Optics (COM), to develop a new generation of flexible computer controlled optics grinding machines. COM's principal near term development effort is to commercialize the OPTICAM-SM, a new prototype spherical grinding machine. A crucial requirement for commercializing the OPTICAM-SM is the development of a predictable and repeatable material removal process (deterministic micro-grinding) that yields high quality surfaces that minimize non-deterministic polishing. OPTICAM machine tools and the fabrication process development studies are part of COM's response to the DOD (ARPA) request to implement a modernization strategy for revitalizing the U.S. optics manufacturing base. This project was entered into in order to develop a new generation of flexible, computer-controlled optics grinding machines.
Deterministic binary vectors for efficient automated indexing of MEDLINE/PubMed abstracts.
Wahle, Manuel; Widdows, Dominic; Herskovic, Jorge R; Bernstam, Elmer V; Cohen, Trevor
2012-01-01
The need to maintain accessibility of the biomedical literature has led to development of methods to assist human indexers by recommending index terms for newly encountered articles. Given the rapid expansion of this literature, it is essential that these methods be scalable. Document vector representations are commonly used for automated indexing, and Random Indexing (RI) provides the means to generate them efficiently. However, RI is difficult to implement in real-world indexing systems, as (1) efficient nearest-neighbor search requires retaining all document vectors in RAM, and (2) it is necessary to maintain a store of randomly generated term vectors to index future documents. Motivated by these concerns, this paper documents the development and evaluation of a deterministic binary variant of RI. The increased capacity demonstrated by binary vectors has implications for information retrieval, and the elimination of the need to retain term vectors facilitates distributed implementations, enhancing the scalability of RI.
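The following hedged sketch conveys the flavor of deterministic binary indexing rather than the paper's exact algorithm: term vectors are regenerated on demand from a hash-seeded PRNG, so no term-vector store is required, and document signatures are thresholded superpositions compared by Hamming similarity.

    # Deterministic, storage-free term vectors and binary document signatures.
    # Dimensions, sparsity and the thresholding rule are all assumptions.
    import hashlib
    import numpy as np

    DIM, NONZERO = 1024, 16

    def term_vector(term):
        seed = int.from_bytes(hashlib.sha256(term.encode()).digest()[:8], "big")
        rng = np.random.default_rng(seed)
        v = np.zeros(DIM, dtype=np.int8)
        idx = rng.choice(DIM, size=NONZERO, replace=False)
        v[idx] = rng.choice([-1, 1], size=NONZERO)
        return v                                   # same term -> same vector, always

    def doc_vector(terms):
        total = np.sum([term_vector(t) for t in terms], axis=0)
        return (total > 0).astype(np.uint8)        # binary document signature

    def hamming_sim(a, b):
        return 1.0 - np.count_nonzero(a != b) / DIM

    d1 = doc_vector("myocardial infarction troponin elevation".split())
    d2 = doc_vector("myocardial infarction chest pain".split())
    d3 = doc_vector("random indexing binary vectors scalability".split())
    print(hamming_sim(d1, d2), hamming_sim(d1, d3))   # shared terms -> higher similarity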
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters, and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
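A minimal sketch of the SWM recipe, with a toy linear-reservoir model standing in for a real deterministic watershed model and with invented stochastic inputs:

    # Deterministic watershed model + stochastic forcing, parameters and errors
    # -> an ensemble of streamflow traces. All numbers are illustrative.
    import numpy as np

    rng = np.random.default_rng(9)
    n_traces, n_days = 100, 365

    def watershed(rain, k, area=50.0):
        """Toy deterministic model: storage gains rain, drains at rate k."""
        s, flow = 0.0, np.empty(len(rain))
        for i, r in enumerate(rain):
            s += r - k * s
            flow[i] = k * s * area
        return flow

    ensemble = np.empty((n_traces, n_days))
    for j in range(n_traces):
        rain = rng.gamma(0.3, 8.0, n_days)             # stochastic meteorology (mm/day)
        k = rng.normal(0.08, 0.01)                     # stochastic model parameter
        err = np.exp(rng.normal(0.0, 0.15, n_days))    # multiplicative model error
        ensemble[j] = watershed(rain, k) * err

    # Flow-volume quantiles across the ensemble, e.g. for risk-based planning.
    print(np.percentile(ensemble.sum(axis=1), [10, 50, 90]))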
Adaptation to Temporally Fluctuating Environments by the Evolution of Maternal Effects.
Dey, Snigdhadip; Proulx, Stephen R; Teotónio, Henrique
2016-02-01
All organisms live in temporally fluctuating environments. Theory predicts that the evolution of deterministic maternal effects (i.e., anticipatory maternal effects or transgenerational phenotypic plasticity) underlies adaptation to environments that fluctuate in a predictably alternating fashion over maternal-offspring generations. In contrast, randomizing maternal effects (i.e., diversifying and conservative bet-hedging) are expected to evolve in response to unpredictably fluctuating environments. Although maternal effects are common, evidence for their adaptive significance is equivocal since they can easily evolve as a correlated response to maternal selection and may or may not increase the future fitness of offspring. Using the hermaphroditic nematode Caenorhabditis elegans, we here show that the experimental evolution of maternal glycogen provisioning underlies adaptation to a fluctuating normoxia-anoxia hatching environment by increasing embryo survival under anoxia. In strictly alternating environments, we found that hermaphrodites evolved the ability to increase embryo glycogen provisioning when they experienced normoxia and to decrease embryo glycogen provisioning when they experienced anoxia. At odds with existing theory, however, populations facing irregularly fluctuating normoxia-anoxia hatching environments failed to evolve randomizing maternal effects. Instead, adaptation in these populations may have occurred through the evolution of fitness effects that percolate over multiple generations, as they maintained considerably high expected growth rates during experimental evolution despite evolving reduced fecundity and reduced embryo survival under one or two generations of anoxia. We develop theoretical models that explain why adaptation to a wide range of patterns of environmental fluctuations hinges on the existence of deterministic maternal effects, and that such deterministic maternal effects are more likely to contribute to adaptation than randomizing maternal effects.
NASA Technical Reports Server (NTRS)
Roske-Hofstrand, Renate J.
1990-01-01
The man-machine interface and its influence on the characteristics of computer displays in automated air traffic are discussed. The graphical presentation of spatial relationships, the problems it poses for air traffic control, and the solution of such problems are addressed. Psychological factors involved in the man-machine interface are stressed.
Khawaja, Sajid Gul; Mushtaq, Mian Hamza; Khan, Shoab A; Akram, M Usman; Jamal, Habib Ullah
2015-01-01
With the increase in transistor density, the popularity of System on Chip (SoC) designs has increased exponentially. As a communication module for SoC, the Network on Chip (NoC) framework has been adopted as its backbone. In this paper, we propose a methodology for designing area-optimized application-specific NoCs while providing hard Quality of Service (QoS) guarantees for real-time flows. The novelty of the proposed system lies in the derivation of a Mixed Integer Linear Programming model which is then used to generate a resource-optimal Network on Chip (NoC) topology and architecture while considering traffic and QoS requirements. We also present the micro-architectural design features used for enabling traffic and latency guarantees and discuss how the solution adapts to dynamic variations in the application traffic. The paper highlights the effectiveness of the proposed method by generating resource-efficient NoC solutions for both industrial and benchmark applications. The area-optimized results are generated in a few seconds by the proposed technique, without resorting to heuristics, even for an application with 48 traffic flows.
Byzantine-fault tolerant self-stabilizing protocol for distributed clock synchronization systems
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R. (Inventor)
2010-01-01
A rapid Byzantine self-stabilizing clock synchronization protocol is presented that self-stabilizes from any state, tolerates bursts of transient failures, and deterministically converges within a linear convergence time with respect to the self-stabilization period. Upon self-stabilization, all good clocks proceed synchronously. The Byzantine self-stabilizing clock synchronization protocol does not rely on any assumptions about the initial state of the clocks. Furthermore, there is neither a central clock nor an externally generated pulse system. The protocol converges deterministically, is scalable, and self-stabilizes in a short amount of time. The convergence time is linear with respect to the self-stabilization period.
Deterministic nonlinear phase gates induced by a single qubit
NASA Astrophysics Data System (ADS)
Park, Kimin; Marek, Petr; Filip, Radim
2018-05-01
We propose deterministic realizations of nonlinear phase gates by repeating a finite sequence of non-commuting Rabi interactions between a harmonic oscillator and only a single two-level ancillary qubit. We show explicitly that the key nonclassical features of the ideal cubic phase gate and the quartic phase gate are generated in the harmonic oscillator faithfully by our method. We numerically analyzed the performance of our scheme under realistic imperfections of the oscillator and the two-level system. The methodology is extended further to higher-order nonlinear phase gates. This theoretical proposal completes the set of operations required for continuous-variable quantum computation.
Detecting determinism from point processes.
Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas
2014-12-01
The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.
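A generic sketch in the same spirit (not the authors' exact score): predict each inter-event interval from its nearest neighbour in a delay embedding and rank the prediction error against interval-shuffled surrogates; deterministic structure yields a low rank.

    # Rank-style determinism test on inter-event intervals; embedding dimension,
    # surrogate count and test signals are illustrative choices.
    import numpy as np

    def nn_prediction_error(intervals, dim=3):
        x = np.array([intervals[i:i + dim] for i in range(len(intervals) - dim)])
        y = intervals[dim:]
        err = 0.0
        for i in range(len(x)):
            d = np.linalg.norm(x - x[i], axis=1)
            d[i] = np.inf                          # exclude the self-match
            err += abs(y[np.argmin(d)] - y[i])
        return err / len(x)

    def determinism_rank(intervals, n_surrogates=19, seed=10):
        rng = np.random.default_rng(seed)
        e0 = nn_prediction_error(intervals)
        surr = [nn_prediction_error(rng.permutation(intervals)) for _ in range(n_surrogates)]
        return int(np.sum(np.array(surr) <= e0))   # 0 = more predictable than all surrogates

    # Deterministic test intervals from the logistic map vs. i.i.d. random intervals.
    z, iv = 0.4, []
    for _ in range(200):
        z = 3.9 * z * (1 - z)
        iv.append(0.1 + z)
    print(determinism_rank(np.array(iv)),
          determinism_rank(np.random.default_rng(11).exponential(1.0, 200)))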
Airborne Four-Dimensional Flight Management in a Time-based Air Traffic Control Environment
NASA Technical Reports Server (NTRS)
Williams, David H.; Green, Steven M.
1991-01-01
Advanced Air Traffic Control (ATC) systems are being developed which contain time-based (4D) trajectory predictions of aircraft. Airborne flight management systems (FMS) exist or are being developed with similar 4D trajectory generation capabilities. Differences between the ATC generated profiles and those generated by the airborne 4D FMS may introduce system problems. A simulation experiment was conducted to explore integration of a 4D equipped aircraft into a 4D ATC system. The NASA Langley Transport Systems Research Vehicle cockpit simulator was linked in real time to the NASA Ames Descent Advisor ATC simulation for this effort. Candidate procedures for handling 4D equipped aircraft were devised and traffic scenarios established which required time delays absorbed through speed control alone or in combination with path stretching. Dissimilarities in 4D speed strategies between airborne and ATC generated trajectories were tested in these scenarios. The 4D procedures and FMS operation were well received by airline pilot test subjects, who achieved an arrival accuracy at the metering fix of 2.9 seconds standard deviation time error. The amount and nature of the information transmitted during a time clearance were found to be somewhat of a problem using the voice radio communication channel. Dissimilarities between airborne and ATC-generated speed strategies were found to be a problem when the traffic remained on established routes. It was more efficient for 4D equipped aircraft to fly trajectories with similar, though less fuel efficient, speeds which conform to the ATC strategy. Heavy traffic conditions, where time delays forced off-route path stretching, were found to produce a potential operational benefit of the airborne 4D FMS.
Carpenter, Peter W; Kudar, Karen L; Ali, Reza; Sen, Pradeep K; Davies, Christopher
2007-10-15
We present a relatively simple, deterministic, theoretical model for the sublayer streaks in a turbulent boundary layer based on an analogy with Klebanoff modes. Our approach is to generate the streamwise vortices found in the buffer layer by means of a vorticity source in the form of a fictitious body force. It is found that the strongest streaks correspond to a spanwise wavelength that lies within the range of the experimentally observed values for the statistical mean streak spacing. We also present results showing the effect of streamwise pressure gradient, Reynolds number and wall compliance on the sublayer streaks. The theoretical predictions for the effects of wall compliance on the streak characteristics agree well with experimental data. Our proposed theoretical model for the quasi-periodic bursting cycle is also described, which places the streak modelling in context. The proposed bursting process is as follows: (i) streamwise vortices generate sublayer streaks and other vortical elements generate propagating plane waves, (ii) when the streaks reach a sufficient amplitude, they interact nonlinearly with the plane waves to produce oblique waves that exhibit transient growth, and (iii) the oblique waves interact nonlinearly with the plane wave to generate streamwise vortices; these in turn generate the sublayer streaks and so the cycle is renewed.
Doulamis, A D; Doulamis, N D; Kollias, S D
2003-01-01
Multimedia services and especially digital video is expected to be the major traffic component transmitted over communication networks [such as internet protocol (IP)-based networks]. For this reason, traffic characterization and modeling of such services are required for an efficient network operation. The generated models can be used as traffic rate predictors, during the network operation phase (online traffic modeling), or as video generators for estimating the network resources, during the network design phase (offline traffic modeling). In this paper, an adaptable neural-network architecture is proposed covering both cases. The scheme is based on an efficient recursive weight estimation algorithm, which adapts the network response to current conditions. In particular, the algorithm updates the network weights so that 1) the network output, after the adaptation, is approximately equal to current bit rates (current traffic statistics) and 2) a minimal degradation over the obtained network knowledge is provided. It can be shown that the proposed adaptable neural-network architecture simulates a recursive nonlinear autoregressive model (RNAR) similar to the notation used in the linear case. The algorithm presents low computational complexity and high efficiency in tracking traffic rates in contrast to conventional retraining schemes. Furthermore, for the problem of offline traffic modeling, a novel correlation mechanism is proposed for capturing the burstness of the actual MPEG video traffic. The performance of the model is evaluated using several real-life MPEG coded video sources of long duration and compared with other linear/nonlinear techniques used for both cases. The results indicate that the proposed adaptable neural-network architecture presents better performance than other examined techniques.
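As a hedged stand-in for the online case (a recursive least-squares update of a simple autoregressive rate predictor, not the paper's neural architecture), each new frame's bit rate refines the weights so current traffic statistics are tracked without full retraining.

    # Online adaptive bit-rate predictor; model order, forgetting factor and the
    # synthetic VBR-like trace are illustrative assumptions.
    import numpy as np

    class OnlineRatePredictor:
        def __init__(self, order=4, lam=0.99):
            self.w = np.zeros(order)
            self.P = np.eye(order) * 1e3           # inverse-correlation estimate
            self.lam = lam                         # forgetting factor

        def update(self, recent_rates, observed_next):
            x = np.asarray(recent_rates, dtype=float)
            y_hat = self.w @ x                     # prediction made before updating
            k = self.P @ x / (self.lam + x @ self.P @ x)
            self.w += k * (observed_next - y_hat)
            self.P = (self.P - np.outer(k, x @ self.P)) / self.lam
            return y_hat

    rng = np.random.default_rng(12)
    rates = 300 + 150 * (np.arange(2000) % 12 == 0) + rng.normal(0, 20, 2000)

    model, errs = OnlineRatePredictor(), []
    for t in range(4, len(rates)):
        errs.append(abs(model.update(rates[t - 4:t], rates[t]) - rates[t]))
    print(np.mean(errs[:200]), np.mean(errs[-200:]))   # error shrinks as weights adapt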
Empirical Data Fusion for Convective Weather Hazard Nowcasting
NASA Astrophysics Data System (ADS)
Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.
2009-09-01
This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
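A schematic of the random-forest fusion step with synthetic predictors; the feature definitions and the "truth" model below are invented, whereas the real system uses radar-, satellite-, lightning- and NWP-derived fields.

    # Train a random forest to output a probabilistic convective-hazard nowcast
    # from a handful of synthetic predictor fields.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(13)
    n = 20000
    X = np.column_stack([
        rng.normal(20, 10, n),      # e.g. a radar-derived intensity feature
        rng.normal(-50, 15, n),     # e.g. IR cloud-top temperature
        rng.poisson(2, n),          # e.g. lightning flash count
        rng.normal(1500, 700, n),   # e.g. model CAPE
    ])
    # Synthetic "truth": hazard more likely with strong radar/lightning/CAPE, cold tops.
    score = 0.06*X[:, 0] - 0.03*X[:, 1] + 0.4*X[:, 2] + 0.001*X[:, 3] - 6.0
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-score))

    rf = RandomForestClassifier(n_estimators=200, min_samples_leaf=50, n_jobs=-1,
                                random_state=0).fit(X[:15000], y[:15000])
    prob = rf.predict_proba(X[15000:])[:, 1]       # probabilistic hazard nowcast
    print(rf.feature_importances_, prob[:5])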
NASA Technical Reports Server (NTRS)
Lax, F. M.
1975-01-01
A time-controlled navigation system applicable to the descent phase of flight for airline transport aircraft was developed and simulated. The design incorporates the linear discrete-time sampled-data version of the linearized continuous-time system describing the aircraft's aerodynamics. Using optimal linear quadratic control techniques, an optimal deterministic control regulator which is implementable on an airborne computer is designed. The navigation controller assists the pilot in complying with assigned times of arrival along a four-dimensional flight path in the presence of wind disturbances. The strategic air traffic control concept is also described, followed by the design of a strategic control descent path. A strategy for determining possible times of arrival at specified waypoints along the descent path and for generating the corresponding route-time profiles that are within the performance capabilities of the aircraft is presented. Using a mathematical model of the Boeing 707-320B aircraft along with a Boeing 707 cockpit simulator interfaced with an Adage AGT-30 digital computer, a real-time simulation of the complete aircraft aerodynamics was achieved. The strategic four-dimensional navigation controller for longitudinal dynamics was tested on the nonlinear aircraft model in the presence of 15, 30, and 45 knot head-winds. The results indicate that the controller preserved the desired accuracy and precision of a time-controlled aircraft navigation system.
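A minimal sketch of the discrete-time linear quadratic regulator machinery the abstract refers to, using a generic sampled double-integrator plant and hypothetical weighting matrices rather than the linearized Boeing 707-320B dynamics:

```python
import numpy as np

# Generic discrete-time LQR gain via backward Riccati recursion.
# A, B, Q, R are placeholders, not the linearized 707-320B aerodynamics.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # sampled double-integrator plant
B = np.array([[0.5], [1.0]])
Q = np.diag([1.0, 0.1])                  # penalty on state deviation
R = np.array([[0.05]])                   # penalty on control effort

P = Q.copy()
for _ in range(500):                     # iterate to the steady-state Riccati solution
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# The regulator u_k = -K x_k drives deviations from the reference
# route-time profile toward zero.
x = np.array([5.0, 0.0])                 # e.g., an initial 5-unit along-track error
for _ in range(10):
    u = -K @ x
    x = A @ x + B @ u
print("state deviation after 10 steps:", np.round(x, 3))
```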
Sakai, Kenshi; Upadhyaya, Shrinivasa K; Andrade-Sanchez, Pedro; Sviridova, Nina V
2017-03-01
Real-world processes are often combinations of deterministic and stochastic processes. Soil failure observed during farm tillage is one example of this phenomenon. In this paper, we investigated the nonlinear features of soil failure patterns in a farm tillage process. We demonstrate emerging determinism in soil failure patterns from stochastic processes under specific soil conditions. We normalized the deterministic nonlinear prediction considering autocorrelation and propose it as a robust way of extracting a nonlinear dynamical system from noise-contaminated motion. Soil is a typical granular material. The results obtained here are expected to be applicable to granular materials in general. From the global scale to the nanoscale, granular materials feature in seismology, geotechnology, soil mechanics, and particle technology. The results and discussions presented here are applicable across these broad research areas. The proposed method and our findings are useful for applying nonlinear dynamics to the investigation of complex motions generated by granular materials.
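The "deterministic nonlinear prediction" referred to here is commonly computed with a nearest-neighbor (method-of-analogues) predictor in a delay embedding; the sketch below shows that idea on synthetic signals, with the embedding dimension and neighbor count as illustrative choices (the paper's autocorrelation-based normalization is not reproduced):

```python
import numpy as np

def nonlinear_prediction_skill(x, dim=3, horizon=1, n_neighbors=4):
    """Nearest-neighbor (method-of-analogues) prediction in a delay embedding.
    Returns the correlation between predicted and observed values; values near 1
    indicate deterministic structure, values near 0 indicate noise."""
    x = np.asarray(x, dtype=float)
    emb = np.column_stack([x[i:len(x) - dim + i + 1 - horizon] for i in range(dim)])
    target = x[dim - 1 + horizon:]
    preds = []
    for i in range(len(emb)):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        nn = np.argsort(d)[:n_neighbors]
        preds.append(target[nn].mean())
    return np.corrcoef(preds, target)[0, 1]

# Deterministic (logistic map) versus randomized (shuffled) test signals.
rng = np.random.default_rng(0)
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
print("chaotic signal skill: ", round(nonlinear_prediction_skill(x), 2))
print("shuffled signal skill:", round(nonlinear_prediction_skill(rng.permutation(x)), 2))
```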
NASA Astrophysics Data System (ADS)
Sohn, Hyunmin; Liang, Cheng-yen; Nowakowski, Mark E.; Hwang, Yongha; Han, Seungoh; Bokor, Jeffrey; Carman, Gregory P.; Candler, Robert N.
2017-10-01
We demonstrate deterministic multi-step rotation of a magnetic single-domain (SD) state in Nickel nanodisks using the multiferroic magnetoelastic effect. Ferromagnetic Nickel nanodisks are fabricated on a piezoelectric Lead Zirconate Titanate (PZT) substrate, surrounded by patterned electrodes. With the application of a voltage between opposing electrode pairs, we generate anisotropic in-plane strains that reshape the magnetic energy landscape of the Nickel disks, reorienting magnetization toward a new easy axis. By applying a series of voltages sequentially to adjacent electrode pairs, circulating in-plane anisotropic strains are applied to the Nickel disks, deterministically rotating a SD state in the Nickel disks by increments of 45°. The rotation of the SD state is numerically predicted by a fully-coupled micromagnetic/elastodynamic finite element analysis (FEA) model, and the predictions are experimentally verified with magnetic force microscopy (MFM). This experimental result will provide a new pathway to develop energy efficient magnetic manipulation techniques at the nanoscale.
NASA Astrophysics Data System (ADS)
Cai, Kaiming; Yang, Meiyin; Ju, Hailang; Wang, Sumei; Ji, Yang; Li, Baohe; Edmonds, Kevin William; Sheng, Yu; Zhang, Bao; Zhang, Nan; Liu, Shuai; Zheng, Houzhi; Wang, Kaiyou
2017-07-01
All-electrical and programmable manipulation of ferromagnetic bits is highly sought for achieving high integration and low energy consumption in modern information technology. Methods based on spin-orbit torque switching in heavy metal/ferromagnet structures have been proposed with the assistance of a magnetic field, and are heading toward deterministic switching without an external magnetic field. Here we demonstrate that an in-plane effective magnetic field can be induced by an electric field without breaking the symmetry of the thin-film structure, and realize deterministic magnetization switching in a hybrid ferromagnetic/ferroelectric structure with Pt/Co/Ni/Co/Pt layers on a PMN-PT substrate. The effective magnetic field can be reversed by changing the direction of the electric field applied to the PMN-PT substrate, which fully replaces the control function of the external magnetic field. The electric field is found to generate an additional spin-orbit torque on the CoNiCo magnets, which is confirmed by macrospin calculations and micromagnetic simulations.
NASA Astrophysics Data System (ADS)
Fischer, P.; Jardani, A.; Lecoq, N.
2018-02-01
In this paper, we present a novel inverse modeling method called Discrete Network Deterministic Inversion (DNDI) for mapping the geometry and properties of the discrete network of conduits and fractures in karstified aquifers. The DNDI algorithm is based on a coupled discrete-continuum concept to simulate water flows in a model numerically and a deterministic optimization algorithm to invert a set of observed piezometric data recorded during multiple pumping tests. In this method, the model is partitioned into subspaces piloted by a set of parameters (matrix transmissivity, and geometry and equivalent transmissivity of the conduits) that are treated as unknowns. In this way, the deterministic optimization process can iteratively correct the geometry of the network and the values of the properties, until it converges to a global network geometry in a solution model able to reproduce the set of data. An uncertainty analysis of this result can be performed from the maps of posterior uncertainties on the network geometry or on the property values. This method has been successfully tested on three different theoretical, simplified study cases with hydraulic response data generated from hypothetical karstic models of increasing complexity in the network geometry and the matrix heterogeneity.
JFK airport ground control recommendations.
DOT National Transportation Integrated Search
1971-11-01
The object of this effort was to generate a detailed recommendation on what to do about the JFK Airport Ground Traffic Control Problem, including a review of STRACS, a Surface Traffic Control System. Problem areas were identified by direct observatio...
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
Planning Inmarsat's second generation of spacecraft
NASA Astrophysics Data System (ADS)
Williams, W. P.
1982-09-01
The studies planned for the next generation of the Inmarsat service are outlined, such as traffic forecasting studies, communications capacity estimates, space segment design, cost estimates, and financial analysis. Traffic forecasting will require future demand estimates, and a computer model has been developed which estimates demand over the Atlantic, Pacific, and Indian ocean regions. Communications estimates are based on traffic estimates, as a model converts traffic demand into a required capacity figure for a given area. The Erlang formula is used, requiring additional data such as peak-hour ratios and distribution estimates. Basic space segment technical requirements are outlined (communications payload, transponder arrangements, etc.), and further design studies involve such areas as space segment configuration, launcher and spacecraft studies, transmission planning, and earth segment configurations. Cost estimates of the proposed design parameters will be performed, but options must be reduced to make construction feasible. Finally, a financial analysis will be carried out in order to calculate financial returns.
NASA Astrophysics Data System (ADS)
Garcia, Elena
The demand for air travel is expanding beyond the capacity of the existing National Airspace System. Excess traffic results in delays and compromised safety. Thus, a number of initiatives to improve airspace capacity have been proposed. To assess the impact of these technologies on air traffic, one must move beyond the vehicle to a system-of-systems point of view. This top-level perspective must include consideration of the aircraft, airports, air traffic control and airlines that make up the airspace system. In addition to these components and their interactions, economics, safety, and government regulations must also be considered. Furthermore, the air transportation system is inherently variable with changes in everything from fuel prices to the weather. The development of a modeling environment that enables a comprehensive probabilistic evaluation of technological impacts was the subject of this thesis. The final modeling environment developed used economics as the thread to tie the airspace components together. Airport capacities and delays were calculated explicitly with due consideration to the impacts of air traffic control. The delay costs were then calculated for an entire fleet, and an airline economic analysis, considering the impact of these costs, was carried out. Airline return on investment was considered the metric of choice since it brings together all costs and revenues, including the cost of delays, landing fees for airport use and aircraft financing costs. Safety was found to require a level of detail unsuitable for a system-of-systems approach and was relegated to future airspace studies. Environmental concerns were considered to be incorporated into airport regulations and procedures and were not explicitly modeled. A deterministic case study was developed to test this modeling environment. The Atlanta airport operations for the year 2000 were used for validation purposes. A 2005 baseline was used as a basis for comparing the four technologies considered: a very large aircraft, Terminal Area Productivity air traffic control technologies, smoothing of an airline schedule, and the addition of a runway. A case including all four technologies simultaneously was also considered. Unfortunately, the complexity of the system prevented full exploration of the probabilistic aspects of the National Airspace System.
CHAOS AND STOCHASTICITY IN DETERMINISTICALLY GENERATED MULTIFRACTAL MEASURES. (R824780)
The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...
Marzocchi, W.; Vilardo, G.; Hill, D.P.; Ricciardi, G.P.; Ricco, C.
2001-01-01
We analyzed and compared the seismic activity that has occurred in the last two to three decades in three distinct volcanic areas: Phlegraean Fields, Italy; Vesuvius, Italy; and Long Valley, California. Our main goal is to identify and discuss common features and peculiarities in the temporal evolution of earthquake sequences that may reflect similarities and differences in the generating processes between these volcanic systems. In particular, we tried to characterize the time series of the number of events and of the seismic energy release in terms of stochastic, deterministic, and chaotic components. The time sequences from each area consist of thousands of earthquakes that allow a detailed quantitative analysis and comparison. The results obtained showed no evidence for either deterministic or chaotic components in the earthquake sequences in Long Valley caldera, which appears to be dominated by stochastic behavior. In contrast, earthquake sequences at Phlegraean Fields and Mount Vesuvius show a deterministic signal mainly consisting of a 24-hour periodicity. Our analysis suggests that the modulation in seismicity is in some way related to thermal diurnal processes, rather than luni-solar tidal effects. Independently of the process that generates these periodicities in the seismicity, it is suggested that the lack (or presence) of diurnal cycles in seismic swarms of volcanic areas could be closely linked to the presence (or lack) of magma motion.
NASA Astrophysics Data System (ADS)
Liu, Gang; He, Jing; Luo, Zhiyong; Yang, Wunian; Zhang, Xiping
2015-05-01
It is important to study the effects of pedestrian crossing behaviors on traffic flow in order to address the urban traffic jam problem. Based on the Nagel-Schreckenberg (NaSch) traffic cellular automata (TCA) model, a new one-dimensional TCA model is proposed that considers the uncertain conflict behaviors between pedestrians and vehicles at unsignalized mid-block crosswalks and defines parallel updating rules for the motion states of pedestrians and vehicles. The traffic flow is simulated for different vehicle densities and behavior trigger probabilities. The fundamental diagrams show that, regardless of the values of the vehicle braking probability, pedestrian acceleration crossing probability, pedestrian backing probability, and pedestrian generation probability, the system flow follows an "increasing-saturating-decreasing" trend as the vehicle density increases; when the vehicle braking probability is low, emergency braking occurs easily and causes large fluctuations in the saturated flow; the saturated flow decreases slightly as the pedestrian acceleration crossing probability increases; when the pedestrian backing probability lies between 0.4 and 0.6, the saturated flow is unstable, reflecting pedestrians' hesitation when deciding whether to back up; and the maximum flow is sensitive to the pedestrian generation probability, decreasing rapidly as this probability increases and becoming approximately zero when the probability exceeds 0.5. The simulations show that frequent crossing behavior has a strong influence on vehicle flow: as the pedestrian generation probability increases, the vehicle flow decreases and rapidly enters a heavily congested state.
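For reference, a minimal sketch of the underlying Nagel-Schreckenberg (NaSch) parallel update rules on a single-lane ring follows; the braking probability is the standard NaSch randomization, and the paper's additional pedestrian-vehicle conflict rules and trigger probabilities are not reproduced:

```python
import numpy as np

def nasch_step(pos, vel, L, rng, vmax=5, p_brake=0.3):
    """One parallel Nagel-Schreckenberg update on a single-lane ring of L cells."""
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    gaps = (np.roll(pos, -1) - pos - 1) % L            # empty cells to the car ahead
    vel = np.minimum(vel + 1, vmax)                    # 1) acceleration
    vel = np.minimum(vel, gaps)                        # 2) braking (keep a safe gap)
    slow = rng.random(len(vel)) < p_brake              # 3) random slowdown
    vel = np.where(slow, np.maximum(vel - 1, 0), vel)
    pos = (pos + vel) % L                              # 4) parallel movement
    return pos, vel

rng = np.random.default_rng(1)
L, n_cars = 200, 50                                    # density = 0.25 vehicles/cell
pos = np.sort(rng.choice(L, n_cars, replace=False))
vel = np.zeros(n_cars, dtype=int)
for _ in range(1000):
    pos, vel = nasch_step(pos, vel, L, rng)
print("mean speed (cells/step):", vel.mean())
```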
A Simple Two Aircraft Conflict Resolution Algorithm
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.
1999-01-01
Conflict detection and resolution methods are crucial for distributed air-ground traffic management in which the crew in the cockpit, dispatchers in operation control centers and air traffic controllers in the ground-based air traffic management facilities share information and participate in the traffic flow and traffic control functions. This paper describes a conflict detection and a conflict resolution method. The conflict detection method predicts the minimum separation and the time-to-go to the closest point of approach by assuming that both the aircraft will continue to fly at their current speeds along their current headings. The conflict resolution method described here is motivated by the proportional navigation algorithm. It generates speed and heading commands to rotate the line-of-sight either clockwise or counter-clockwise for conflict resolution. Once the aircraft achieve a positive range-rate and no further conflict is predicted, the algorithm generates heading commands to turn back the aircraft to their nominal trajectories. The speed commands are set to the optimal pre-resolution speeds. Six numerical examples are presented to demonstrate the conflict detection and resolution method.
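A minimal sketch (planar kinematics, hypothetical 5 nmi separation threshold) of the constant-velocity detection step, predicting the minimum separation and the time-to-go to the closest point of approach:

```python
import numpy as np

def closest_point_of_approach(p1, v1, p2, v2):
    """Minimum separation and time-to-go, assuming both aircraft hold their
    current speed and heading (positions in nmi, velocities in nmi/min)."""
    dp = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    dv = np.asarray(v2, dtype=float) - np.asarray(v1, dtype=float)
    t_cpa = 0.0 if dv @ dv == 0 else max(0.0, -(dp @ dv) / (dv @ dv))
    min_sep = np.linalg.norm(dp + dv * t_cpa)
    return min_sep, t_cpa

# Two converging aircraft; flag a conflict if the predicted separation falls
# below a hypothetical 5 nmi threshold.
sep, tgo = closest_point_of_approach([0, 0], [8, 0], [30, 10], [0, -2])
print(f"min separation {sep:.1f} nmi in {tgo:.1f} min; conflict predicted: {sep < 5.0}")
```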
Transforming the NAS: The Next Generation Air Traffic Control System
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
2004-01-01
The next-generation air traffic control system must be designed to safely and efficiently accommodate the large growth of traffic expected in the near future. It should be sufficiently scalable to contend with the factor of 2 or more increase in demand expected by the year 2020. Analysis has shown that the current method of controlling air traffic cannot be scaled up to provide such levels of capacity. Therefore, to achieve a large increase in capacity while also giving pilots increased freedom to optimize their flight trajectories requires a fundamental change in the way air traffic is controlled. The key to achieving a factor of 2 or more increase in airspace capacity is to automate separation monitoring and control and to use an air-ground data link to send trajectories and clearances directly between ground-based and airborne systems. In addition to increasing capacity and offering greater flexibility in the selection of trajectories, this approach also has the potential to increase safety by reducing controller and pilot errors that occur in routine monitoring and voice communication tasks.
Cluster-type entangled coherent states: Generation and application
DOE Office of Scientific and Technical Information (OSTI.GOV)
An, Nguyen Ba; Kim, Jaewan; Korea Institute for Advanced Study, 207-43 Cheongryangni 2-dong, Dongdaemun-gu, Seoul 130-722
2009-10-15
We consider a type of (M+N)-mode entangled coherent states and propose a simple deterministic scheme to generate these states that can fly freely in space. We then exploit such free-flying states to teleport certain kinds of superpositions of multimode coherent states. We also address the issue of manipulating size and type of entangled coherent states by means of linear optics elements only.
Cluster-type entangled coherent states: Generation and application
NASA Astrophysics Data System (ADS)
An, Nguyen Ba; Kim, Jaewan
2009-10-01
We consider a type of (M+N)-mode entangled coherent states and propose a simple deterministic scheme to generate these states that can fly freely in space. We then exploit such free-flying states to teleport certain kinds of superpositions of multimode coherent states. We also address the issue of manipulating size and type of entangled coherent states by means of linear optics elements only.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Crump, Alex R.; Resch, Charles T.
2017-03-28
Subsurface zones of groundwater and surface water mixing (hyporheic zones) are regions of enhanced rates of biogeochemical cycling, yet ecological processes governing hyporheic microbiome composition and function through space and time remain unknown. We sampled attached and planktonic microbiomes in the Columbia River hyporheic zone across seasonal hydrologic change, and employed statistical null models to infer mechanisms generating temporal changes in microbiomes within three hydrologically-connected, physicochemically-distinct geographic zones (inland, nearshore, river). We reveal that microbiomes remain dissimilar through time across all zones and habitat types (attached vs. planktonic) and that deterministic assembly processes regulate microbiome composition in all data subsets. The consistent presence of heterotrophic taxa and members of the Planctomycetes-Verrucomicrobia-Chlamydiae (PVC) superphylum nonetheless suggests common selective pressures for physiologies represented in these groups. Further, co-occurrence networks were used to provide insight into taxa most affected by deterministic assembly processes. We identified network clusters to represent groups of organisms that correlated with seasonal and physicochemical change. Extended network analyses identified keystone taxa within each cluster that we propose are central in microbiome composition and function. Finally, the abundance of one network cluster of nearshore organisms exhibited a seasonal shift from heterotrophic to autotrophic metabolisms and correlated with microbial metabolism, possibly indicating an ecological role for these organisms as foundational species in driving biogeochemical reactions within the hyporheic zone. Taken together, our research demonstrates a predominant role for deterministic assembly across highly-connected environments and provides insight into niche dynamics associated with seasonal changes in hyporheic microbiome composition and metabolism.
Next generation traffic management centers.
DOT National Transportation Integrated Search
2013-05-01
Traffic management centers (TMCs) are critical to providing mobility to millions of people travelling on high-volume roadways. In Virginia, as with most regions of the United States, TMCs were aggressively deployed in the late 1990s and early 2000s. ...
Research implementation of the SMART SIGNAL system on Trunk Highway (TH) 13.
DOT National Transportation Integrated Search
2013-02-01
In our previous research, the SMART-SIGNAL (Systematic Monitoring of Arterial Road Traffic and Signals) system that can collect event-based traffic data and generate comprehensive performance measures has been successfully developed by the Univer...
Guidelines for traffic signal energy back-up systems : final report.
DOT National Transportation Integrated Search
2009-07-01
Power outages affect signalized traffic intersections, leading to potentially serious problems. Current practices of responding to power failures are very basic, ranging from doing nothing to installing portable generators. The purpose of this res...
Al-Shargabi, Mohammed A; Shaikh, Asadullah; Ismail, Abdulsamad S
2016-01-01
Optical burst switching (OBS) networks have been attracting much consideration as a promising approach to building the next-generation optical Internet. A solution for enhancing the Quality of Service (QoS) of high-priority real-time traffic over OBS while maintaining fairness among the traffic types is absent in current OBS QoS schemes. In this paper we present a novel Real Time Quality of Service with Fairness Ratio (RT-QoSFR) scheme that can adapt the burst assembly parameters according to the traffic QoS needs in order to meet the real-time traffic QoS requirements and to ensure fairness for other traffic. The results show that the RT-QoSFR scheme is able to fulfill the real-time traffic requirements (end-to-end delay and loss rate) while ensuring fairness for other traffic under various conditions such as the type of real-time traffic and the traffic load. RT-QoSFR can guarantee that the delay of real-time traffic packets does not exceed the maximum packet transfer delay value. Furthermore, it can reduce real-time traffic packet loss and at the same time guarantee fairness for non-real-time traffic packets by setting the ratio of real-time traffic inside the burst to 50-60%, 30-40%, and 10-20% for high, normal, and low traffic loads, respectively.
2013-06-01
Components of the ATCIS in the NetSPIN and their main functions: the Terminal functions as the terminal that generates traffic; the MFE (Multi-Function accessing Equipment) transforms SST messages into TCP/IP packets; other listed elements include PPP functions, the DMT, the DLP, the computer shelter, the operation center, the operation battalion, and the command posts of a corps and brigade communication.
Emerging Definition of Next-Generation of Aeronautical Communications
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.
2006-01-01
Aviation continues to experience rapid growth. In regions such as the United States and Europe air traffic congestion is constraining operations, leading to major new efforts to develop methodologies and infrastructures to enable continued aviation growth through transformational air traffic management systems. Such a transformation requires better communications linking airborne and ground-based elements. Technologies for next-generation communications, the required capacities, frequency spectrum of operation, network interconnectivity, and global interoperability are now receiving increased attention. A number of major planning and development efforts have taken place or are in process now to define the transformed airspace of the future. These activities include government and industry led efforts in the United States and Europe, and by international organizations. This paper will review the features, approaches, and activities of several representative planning and development efforts, and identify the emerging global consensus on requirements of next generation aeronautical communications systems for air traffic control.
A traffic analyzer for multiple SpaceWire links
NASA Astrophysics Data System (ADS)
Liu, Scige J.; Giusi, Giovanni; Di Giorgio, Anna M.; Vertolli, Nello; Galli, Emanuele; Biondi, David; Farina, Maria; Pezzuto, Stefano; Spinoglio, Luigi
2014-07-01
Modern space missions are becoming increasingly complex: the interconnection of the units in a satellite is now a network of terminals linked together through routers, where devices with different levels of automation and intelligence share the same data network. The traceability of the network transactions is performed mostly at the terminal level through log analysis, and hence it is difficult to verify in real time the reliability of the interconnections and the interchange protocols. To improve and ease the traffic analysis in a SpaceWire network we implemented a low-level link analyzer, with the specific goal of simplifying the integration and test phases in the development of space instrumentation. The traffic analyzer collects signals coming from pod probes connected in series on the links of interest between two SpaceWire terminals. With respect to standard traffic analyzers, the design of this new tool includes the possibility of internally reshaping the LVDS signal. This improvement increases the robustness of the analyzer against environmental noise effects and guarantees a deterministic delay on all analyzed signals. The analyzer core is implemented on a Xilinx FPGA, programmed to decode the bidirectional LVDS signals at the Link and Network levels. Successively, the core packetizes protocol characters into homogeneous sets of time-ordered events. The analyzer provides time-tagging functionality for each character set, with a precision down to the FPGA clock, i.e., about 20 ns in the adopted HW environment. The use of a common time reference for each character stream allows synchronous performance measurements. The collected information is then routed to an external computer for quick analysis via a high-speed USB 2.0 connection. With this analyzer it is possible to verify the link performance in terms of induced delays in the transmitted signals. A case study focused on the analysis of Time-Code synchronization in the presence of a SpaceWire router is shown in this paper as well.
Air-Traffic Controllers Evaluate The Descent Advisor
NASA Technical Reports Server (NTRS)
Tobias, Leonard; Volckers, Uwe; Erzberger, Heinz
1992-01-01
Report describes a study of the Descent Advisor algorithm: a software automation aid intended to assist air-traffic controllers in spacing traffic and meeting specified times of arrival. Based partly on mathematical models of weather conditions and aircraft performance, it generates suggested clearances, including top-of-descent points and speed-profile data, to attain objectives. The study focused on operational characteristics, with specific attention to how the aid can be used for prediction, spacing, and metering.
NASA Astrophysics Data System (ADS)
Marinas, Javier; Salgado, Luis; Arróspide, Jon; Camplani, Massimo
2012-01-01
In this paper we propose an innovative method for the automatic detection and tracking of road traffic signs using an onboard stereo camera. It involves a combination of monocular and stereo analysis strategies to increase the reliability of the detections such that it can boost the performance of any traffic sign recognition scheme. Firstly, an adaptive color and appearance based detection is applied at single camera level to generate a set of traffic sign hypotheses. In turn, stereo information allows for sparse 3D reconstruction of potential traffic signs through a SURF-based matching strategy. Namely, the plane that best fits the cloud of 3D points traced back from feature matches is estimated using a RANSAC based approach to improve robustness to outliers. Temporal consistency of the 3D information is ensured through a Kalman-based tracking stage. This also allows for the generation of a predicted 3D traffic sign model, which is in turn used to enhance the previously mentioned color-based detector through a feedback loop, thus improving detection accuracy. The proposed solution has been tested with real sequences under several illumination conditions and in both urban areas and highways, achieving very high detection rates in challenging environments, including rapid motion and significant perspective distortion.
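A minimal sketch of the RANSAC plane-fitting step used to reject outliers among the triangulated 3D sign points; the synthetic point cloud, iteration count, and inlier tolerance are illustrative only:

```python
import numpy as np

def ransac_plane(points, n_iter=200, tol=0.05, seed=0):
    """Fit a plane (unit normal n, offset d with n.x + d = 0) to 3D points,
    keeping the candidate with the most inliers within distance 'tol'."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-9:
            continue                                  # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol        # point-to-plane distance test
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers

# Synthetic sign-like points on the plane z = 10 m, plus a few outliers
# standing in for mismatched features.
rng = np.random.default_rng(1)
plane_pts = np.column_stack([rng.uniform(-0.5, 0.5, 80),
                             rng.uniform(-0.5, 0.5, 80),
                             10.0 + rng.normal(0.0, 0.01, 80)])
outliers = rng.uniform(-5.0, 15.0, size=(10, 3))
(model, inliers) = ransac_plane(np.vstack([plane_pts, outliers]))
print("plane normal:", np.round(model[0], 3), "| inliers:", int(inliers.sum()))
```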
Piloted simulation of a ground-based time-control concept for air traffic control
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Green, Steven M.
1989-01-01
A concept for aiding air traffic controllers in efficiently spacing traffic and meeting scheduled arrival times at a metering fix was developed and tested in a real-time simulation. The automation aid, referred to as the ground-based 4-D descent advisor (DA), is based on accurate models of aircraft performance and weather conditions. The DA generates suggested clearances, including both top-of-descent-point and speed-profile data, for one or more aircraft in order to achieve specific time or distance separation objectives. The DA algorithm is used by the air traffic controller to resolve conflicts and issue advisories to arrival aircraft. A joint simulation was conducted using a piloted simulator and an advanced concept air traffic control simulation to study the acceptability and accuracy of the DA automation aid from both the pilot's and the air traffic controller's perspectives. The results of the piloted simulation are examined. In the piloted simulation, airline crews executed controller-issued descent advisories along standard curved-path arrival routes, and were able to achieve an arrival time precision of ±20 sec at the metering fix. An analysis of errors generated in turns resulted in further enhancements of the algorithm to improve the predictive accuracy. Evaluations by pilots indicate general support for the concept and provide specific recommendations for improvement.
Measuring the effects of aborted takeoffs and landings on traffic flow at JFK
DOT National Transportation Integrated Search
2012-10-14
The FAA Office of Accident Investigation and Prevention (AVP) supports research, analysis and demonstration of quantitative air traffic analyses to estimate safety performance and benefits of the Next Generation Air Transportation System (NextGen). T...
Supplementary Computer Generated Cueing to Enhance Air Traffic Controller Efficiency
2013-03-01
Excerpt referencing prior work on assessing the complexity of air traffic control (Mogford, Guttman, Morrow, & Kopardekar, 1995; Laudeman, Shelden, Branstrom, & Brasil, 1998).
Realistic Data-Driven Traffic Flow Animation Using Texture Synthesis.
Chao, Qianwen; Deng, Zhigang; Ren, Jiaping; Ye, Qianqian; Jin, Xiaogang
2018-02-01
We present a novel data-driven approach to populate virtual road networks with realistic traffic flows. Specifically, given a limited set of vehicle trajectories as the input samples, our approach first synthesizes a large set of vehicle trajectories. By taking the spatio-temporal information of traffic flows as a 2D texture, the generation of new traffic flows can be formulated as a texture synthesis process, which is solved by minimizing a newly developed traffic texture energy. The synthesized output captures the spatio-temporal dynamics of the input traffic flows, and the vehicle interactions in it strictly follow traffic rules. After that, we position the synthesized vehicle trajectory data to virtual road networks using a cage-based registration scheme, where a few traffic-specific constraints are enforced to maintain each vehicle's original spatial location and synchronize its motion in concert with its neighboring vehicles. Our approach is intuitive to control and scalable to the complexity of virtual road networks. We validated our approach through many experiments and paired comparison user studies.
Relationship between microscopic dynamics in traffic flow and complexity in networks.
Li, Xin-Gang; Gao, Zi-You; Li, Ke-Ping; Zhao, Xiao-Mei
2007-07-01
Complex networks are constructed in the evolution process of traffic flow, and the states of traffic flow are represented by nodes in the network. The traffic dynamics can then be studied by investigating the statistical properties of those networks. According to Kerner's three-phase theory, there are two different phases in congested traffic, synchronized flow and wide moving jam. In the framework of this theory, we study different properties of synchronized flow and moving jam in relation to complex networks. A scale-free network is constructed in stop-and-go traffic, i.e., a sequence of moving jams [Chin. Phys. Lett. 10, 2711 (2005)]. In this work, the networks generated in synchronized flow are investigated in detail. Simulation results show that the degree distribution of the networks constructed in synchronized flow has two power-law regions, so the distinction in topological structure can really reflect the different dynamics in traffic flow. Furthermore, real traffic data are investigated by this method, and the results are consistent with the simulations.
DOT National Transportation Integrated Search
2004-11-01
The motivation behind the Transportation Infrastructure and Traffic Management Analysis of Cross Border Bottlenecks study was generated by the U.S.-Mexico Border Partnership Action Plan (Action item #2 of the 22-Point Smart Border Action Plan: De...
A measurement of disorder in binary sequences
NASA Astrophysics Data System (ADS)
Gong, Longyan; Wang, Haihong; Cheng, Weiwen; Zhao, Shengmei
2015-03-01
We propose a complex quantity, AL, to characterize the degree of disorder of L-length binary symbolic sequences. As examples, we apply it to typical random and deterministic sequences. One kind of random sequence is generated from a periodic binary sequence and the other from the logistic map. The deterministic sequences are the Fibonacci and Thue-Morse sequences. In the analyzed sequences, we find that the modulus of AL, denoted by |AL|, is a (statistically) equivalent quantity to the Boltzmann entropy, the metric entropy, the conditional block entropy, and/or other quantities, so it is a useful quantitative measure of disorder. It can serve as a fruitful index for discerning which sequence is more disordered. Moreover, there is one and only one value of |AL| characterizing the overall disorder. It requires extremely low computational cost and can easily be realized experimentally. For all these reasons, we believe that the proposed measure of disorder is a valuable complement to existing ones for symbolic sequences.
Characterization, adaptive traffic shaping, and multiplexing of real-time MPEG II video
NASA Astrophysics Data System (ADS)
Agrawal, Sanjay; Barry, Charles F.; Binnai, Vinay; Kazovsky, Leonid G.
1997-01-01
We obtain a network traffic model for real-time MPEG-II encoded digital video by analyzing video stream samples from real-time encoders from NUKO Information Systems. The MPEG-II sample streams include a resolution-intensive movie, City of Joy, an action-intensive movie, Aliens, a luminance-intensive (black and white) movie, Road To Utopia, and a chrominance-intensive (color) movie, Dick Tracy. From our analysis we obtain a heuristic model for the encoded video traffic which uses a 15-stage Markov process to model the I, B, P frame sequences within a group of pictures (GOP). A jointly correlated Gaussian process is used to model the individual frame sizes. Scene-change arrivals are modeled according to a gamma process. Simulations show that our MPEG-II traffic model generates I, B, P frame sequences and frame sizes that closely match the sample MPEG-II stream traffic characteristics as they relate to latency and buffer occupancy in network queues. To achieve high multiplexing efficiency we propose a traffic shaping scheme which sets preferred I-frame generation times among a group of encoders so as to minimize the overall variation in total offered traffic while still allowing the individual encoders to react to scene changes. Simulations show that our scheme results in multiplexing gains of up to 10%, enabling us to multiplex twenty 6 Mbps MPEG-II video streams instead of 18 streams over an ATM/SONET OC3 link without latency or cell-loss penalty. A patent application for this scheme is pending.
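A minimal sketch in the spirit of that traffic model: a fixed 15-frame GOP pattern (standing in for the 15-stage Markov structure), lag-correlated Gaussian frame-size fluctuations, and gamma-distributed scene-change arrivals. All numerical values are hypothetical, not the fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 15-frame GOP pattern and per-type frame-size statistics (kbit);
# the paper fits such parameters to real encoder output, these numbers are made up.
GOP = list("IBBPBBPBBPBBPBB")
MEAN = {"I": 400.0, "P": 180.0, "B": 80.0}
STD = {"I": 60.0, "P": 40.0, "B": 20.0}
RHO = 0.7                                  # lag-1 correlation of size fluctuations

def synth_stream(n_gops=100, scene_shape=2.0, scene_scale=30.0):
    sizes, z = [], 0.0
    next_scene = rng.gamma(scene_shape, scene_scale)    # frames until next scene change
    for g in range(n_gops):
        for k, ftype in enumerate(GOP):                 # cycle through the 15 GOP stages
            z = RHO * z + np.sqrt(1.0 - RHO ** 2) * rng.normal()
            size = MEAN[ftype] + STD[ftype] * z
            frame_index = g * len(GOP) + k
            if frame_index >= next_scene:               # scene change inflates this frame
                size *= 1.5
                next_scene = frame_index + rng.gamma(scene_shape, scene_scale)
            sizes.append(max(size, 1.0))
    return np.array(sizes)

stream = synth_stream()
print("mean frame size (kbit):", round(stream.mean(), 1))
```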
Data traffic reduction schemes for Cholesky factorization on asynchronous multiprocessor systems
NASA Technical Reports Server (NTRS)
Naik, Vijay K.; Patrick, Merrell L.
1989-01-01
Communication requirements of Cholesky factorization of dense and sparse symmetric, positive definite matrices are analyzed. The communication requirement is characterized by the data traffic generated on multiprocessor systems with local and shared memory. Lower bound proofs are given to show that when the load is uniformly distributed, the data traffic associated with factoring an n x n dense matrix using n^α (α ≤ 2) processors is Ω(n^(2+α/2)). For n x n sparse matrices representing a √n x √n regular grid graph, the data traffic is shown to be Ω(n^(1+α/2)), α ≤ 1. Partitioning schemes that are variations of the block assignment scheme are described, and it is shown that the data traffic generated by these schemes is asymptotically optimal. The schemes allow efficient use of up to O(n^2) processors in the dense case and up to O(n) processors in the sparse case before the total data traffic reaches the maximum values of O(n^3) and O(n^(3/2)), respectively. It is shown that the block-based partitioning schemes allow better utilization of the data accessed from shared memory, and thus reduce the data traffic compared with schemes based on column-wise wrap-around assignment.
Al-Shargabi, Mohammed A.; Ismail, Abdulsamad S.
2016-01-01
Optical burst switching (OBS) networks have been attracting much consideration as a promising approach to building the next-generation optical Internet. A solution for enhancing the Quality of Service (QoS) of high-priority real-time traffic over OBS while maintaining fairness among the traffic types is absent in current OBS QoS schemes. In this paper we present a novel Real Time Quality of Service with Fairness Ratio (RT-QoSFR) scheme that can adapt the burst assembly parameters according to the traffic QoS needs in order to meet the real-time traffic QoS requirements and to ensure fairness for other traffic. The results show that the RT-QoSFR scheme is able to fulfill the real-time traffic requirements (end-to-end delay and loss rate) while ensuring fairness for other traffic under various conditions such as the type of real-time traffic and the traffic load. RT-QoSFR can guarantee that the delay of real-time traffic packets does not exceed the maximum packet transfer delay value. Furthermore, it can reduce real-time traffic packet loss and at the same time guarantee fairness for non-real-time traffic packets by setting the ratio of real-time traffic inside the burst to 50–60%, 30–40%, and 10–20% for high, normal, and low traffic loads, respectively. PMID:27583557
Cellular automata model for urban road traffic flow considering pedestrian crossing street
NASA Astrophysics Data System (ADS)
Zhao, Han-Tao; Yang, Shuo; Chen, Xiao-Xu
2016-11-01
In order to analyze the effect of pedestrians crossing the street on vehicle flow, we investigated the traffic characteristics of vehicles and pedestrians. Based on these, the rules of lane changing, acceleration, deceleration, randomization, and update are modified. We then established two urban two-lane cellular automata models of traffic flow, one for sections with a non-signalized crosswalk and the other for uncontrolled sections with pedestrians crossing the street at random. MATLAB is used for numerical simulation of the different traffic conditions; meanwhile, space-time diagrams and relational graphs of traffic flow parameters are generated and then comparatively analyzed. Simulation results indicate that when the vehicle density is lower than around 25 vehs/(km lane), pedestrians have a modest impact on traffic flow, whereas when the vehicle density is higher than about 60 vehs/(km lane), traffic speed and volume decrease significantly, especially on sections with a non-signal-controlled crosswalk. The results illustrate that the proposed models reproduce the characteristics of traffic flow in situations where pedestrians are crossing and can provide a practical reference for urban traffic management.
Noise annoyance through railway traffic - a case study.
Trombetta Zannin, Paulo Henrique; Bunn, Fernando
2014-01-08
This paper describes an assessment of noise caused by railway traffic in a large Latin American city. Measurements were taken of noise levels generated by trains passing through residential neighborhoods with and without blowing their horns. Noise maps were also calculated showing the noise pollution generated by the train traffic. In addition, the annoyance of residents affected by railway noise was evaluated based on interviews. The measurements indicated that the noise levels generated by the passage of a train with its horn blowing are extremely high, clearly exceeding the daytime limit of equivalent sound pressure level, Leq = 55 dB(A), established by municipal law No. 10.625 of the city of Curitiba. The Leq = 45 dB(A) limit for the night period is also exceeded during the passage of trains. The residents reported feeling affected by the noise generated by passing trains, which causes irritability, headaches, poor concentration, and insomnia, and 88% of them claimed that nocturnal noise pollution is the most distressing. This study showed that the vast majority of the residents surveyed (69%) believe that train noise can devalue their property.
Transparent flexible nanogenerator as self-powered sensor for transportation monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhong Lin; Hu, Youfan; Lin, Long
2016-06-14
A traffic sensor includes a flexible substrate having a top surface. A piezoelectric structure extends from the first electrode layer. The piezoelectric structure has a top end. An insulating layer is infused into the piezoelectric structure. A first electrode layer is disposed on top of the insulating layer. A second electrode layer is disposed below the flexible substrate. A packaging layer is disposed around the substrate, the first electrode layer, the piezoelectric structure, the insulating layer and the second electrode layer. In a method of sensing a traffic parameter, a piezoelectric nanostructure-based traffic sensor is applied to a roadway. An electrical event generated by the piezoelectric nanostructure-based traffic sensor in response to a vehicle interacting with the piezoelectric nanostructure-based traffic sensor is detected. The electrical event is correlated with the traffic parameter.
A Simple Two Aircraft Conflict Resolution Algorithm
NASA Technical Reports Server (NTRS)
Chatterji, Gano B.
2006-01-01
Conflict detection and resolution methods are crucial for distributed air-ground traffic management in which the crew in the cockpit, dispatchers in operation control centers, and air traffic controllers in the ground-based air traffic management facilities share information and participate in the traffic flow and traffic control functions. This paper describes a conflict detection and a conflict resolution method. The conflict detection method predicts the minimum separation and the time-to-go to the closest point of approach by assuming that both the aircraft will continue to fly at their current speeds along their current headings. The conflict resolution method described here is motivated by the proportional navigation algorithm, which is often used for missile guidance during the terminal phase. It generates speed and heading commands to rotate the line-of-sight either clockwise or counter-clockwise for conflict resolution. Once the aircraft achieve a positive range-rate and no further conflict is predicted, the algorithm generates heading commands to turn back the aircraft to their nominal trajectories. The speed commands are set to the optimal pre-resolution speeds. Six numerical examples are presented to demonstrate the conflict detection and conflict resolution methods.
Highway traffic noise prediction based on GIS
NASA Astrophysics Data System (ADS)
Zhao, Jianghua; Qin, Qiming
2014-05-01
Before building a new road, we need to predict the traffic noise generated by vehicles. Traditional traffic noise prediction methods are tied to specific locations; they are time-consuming and costly, and their results cannot be easily visualized. A Geographical Information System (GIS) can not only solve the problem of manual data processing but can also provide noise values at any point. The paper selected a road segment from Wenxi to Heyang. Based on the geographical overview of the study area and a comparison between several models, we combine the JTG B03-2006 model and the HJ2.4-2009 model to predict the traffic noise depending on the circumstances. Finally, we interpolate the noise values at each prediction point and then generate noise contours. By overlaying the village data on the noise contour layer, we obtain thematic maps. The use of GIS for road traffic noise prediction greatly assists decision-makers because of GIS spatial analysis and visualization capabilities. We can clearly see the districts where noise is excessive, and it thus becomes convenient to optimize the road alignment and take noise reduction measures such as installing sound barriers and relocating villages.
Identifying MMORPG Bots: A Traffic Analysis Approach
NASA Astrophysics Data System (ADS)
Chen, Kuan-Ta; Jiang, Jhih-Wei; Huang, Polly; Chu, Hao-Hua; Lei, Chin-Laung; Chen, Wen-Chin
2008-12-01
Massively multiplayer online role playing games (MMORPGs) have become extremely popular among network gamers. Despite their success, one of MMORPG's greatest challenges is the increasing use of game bots, that is, autoplaying game clients. The use of game bots is considered unsportsmanlike and is therefore forbidden. To keep games in order, game police, played by actual human players, often patrol game zones and question suspicious players. This practice, however, is labor-intensive and ineffective. To address this problem, we analyze the traffic generated by human players versus game bots and propose general solutions to identify game bots. Taking Ragnarok Online as our subject, we study the traffic generated by human players and game bots. We find that their traffic is distinguishable by 1) the regularity in the release time of client commands, 2) the trend and magnitude of traffic burstiness in multiple time scales, and 3) the sensitivity to different network conditions. Based on these findings, we propose four strategies and two ensemble schemes to identify bots. Finally, we discuss the robustness of the proposed methods against countermeasures of bot developers, and consider a number of possible ways to manage the increasingly serious bot problem.
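A minimal sketch of the first of those traffic features, the regularity of client-command release times, computed as the coefficient of variation of interarrival times on synthetic traces; the 0.2 decision threshold is a hypothetical illustration:

```python
import numpy as np

def interarrival_cv(timestamps):
    """Coefficient of variation of command interarrival times; values close to
    zero indicate machine-like regularity in the release of client commands."""
    gaps = np.diff(np.sort(np.asarray(timestamps, dtype=float)))
    return gaps.std() / gaps.mean()

rng = np.random.default_rng(0)
human = np.cumsum(rng.exponential(1.0, 500))        # irregular, human-like timing
bot = np.cumsum(1.0 + rng.normal(0.0, 0.02, 500))   # near-periodic command release

for name, ts in [("human", human), ("bot", bot)]:
    cv = interarrival_cv(ts)
    verdict = "bot suspected" if cv < 0.2 else "looks human"   # hypothetical threshold
    print(f"{name}: CV = {cv:.3f} -> {verdict}")
```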
Fluidic Energy Harvester Optimization in Grid Turbulence
NASA Astrophysics Data System (ADS)
Danesh-Yazdi, Amir; Elvin, Niell; Andreopoulos, Yiannis
2017-11-01
Even though it is omnipresent in nature, there has not been a great deal of research in the literature involving turbulence as an energy source for piezoelectric fluidic harvesters. In the present work, a grid-generated turbulence forcing function model which we derived previously is employed in the single degree-of-freedom electromechanical equations to find the power output and tip displacement of piezoelectric cantilever beams. Additionally, we utilize simplified, deterministic models of the turbulence forcing function to obtain closed-form expressions for the power output. These theoretical models are studied using experiments that involve separately placing a hot-wire anemometer probe and a short PVDF beam in flows where turbulence is generated by means of passive and semi-passive grids. From a parametric study on the deterministic models, we show that the white noise forcing function best mimics the experimental data. Furthermore, our parametric study of the response spectrum of a generic fluidic harvester in grid-generated turbulent flow shows that optimum power output is attained for beams placed closer to the grid with a low natural frequency and damping ratio and a large electromechanical coupling coefficient. NSF Grant No. CBET 1033117.
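A minimal sketch of a lumped single degree-of-freedom electromechanical harvester driven by a white-noise forcing term standing in for the grid-generated turbulence; the parameter values and the Euler time-stepping are illustrative, not the paper's PVDF beam model:

```python
import numpy as np

# Lumped single-DOF piezoelectric harvester under white-noise forcing.
# All parameter values are illustrative, not those of the PVDF beam in the paper.
m, c, k = 1e-3, 5e-3, 40.0        # mass (kg), damping (N s/m), stiffness (N/m)
theta, Cp, R = 1e-4, 50e-9, 1e5   # coupling (N/V), capacitance (F), load (ohm)
sigma_F = 1e-3                    # white-noise force intensity

dt, steps = 1e-5, 200_000         # 2 s of simulated time
rng = np.random.default_rng(0)
x = xdot = v = 0.0
energy = 0.0
for _ in range(steps):
    F = sigma_F * rng.normal() / np.sqrt(dt)           # white-noise force sample
    xddot = (F - c * xdot - k * x - theta * v) / m     # mechanical equation
    vdot = (theta * xdot - v / R) / Cp                 # electrical equation
    x, xdot, v = x + xdot * dt, xdot + xddot * dt, v + vdot * dt
    energy += (v * v / R) * dt                         # energy dissipated in the load
print("mean harvested power (W):", energy / (steps * dt))
```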
Direct generation of linearly polarized single photons with a deterministic axis in quantum dots
NASA Astrophysics Data System (ADS)
Wang, Tong; Puchtler, Tim J.; Patra, Saroj K.; Zhu, Tongtong; Ali, Muhammad; Badcock, Tom J.; Ding, Tao; Oliver, Rachel A.; Schulz, Stefan; Taylor, Robert A.
2017-07-01
We report the direct generation of linearly polarized single photons with a deterministic polarization axis in self-assembled quantum dots (QDs), achieved by the use of non-polar InGaN without complex device geometry engineering. Here, we present a comprehensive investigation of the polarization properties of these QDs and their origin with statistically significant experimental data and rigorous k·p modeling. The experimental study of 180 individual QDs allows us to compute an average polarization degree of 0.90, with a standard deviation of only 0.08. When coupled with theoretical insights, we show that these QDs are highly insensitive to size differences, shape anisotropies, and material content variations. Furthermore, 91% of the studied QDs exhibit a polarization axis along the crystal [1-100] axis, with the other 9% polarized orthogonal to this direction. These features give non-polar InGaN QDs unique advantages in polarization control over other materials, such as conventional polar nitride, InAs, or CdSe QDs. Hence, the ability to generate single photons with polarization control makes non-polar InGaN QDs highly attractive for quantum cryptography protocols.
NASA Astrophysics Data System (ADS)
Larrañeta, M.; Moreno-Tejera, S.; Lillo-Bravo, I.; Silva-Pérez, M. A.
2018-02-01
Many of the available solar radiation databases only provide global horizontal irradiance (GHI), while there is a growing need for extensive databases of direct normal irradiance (DNI), mainly for the development of concentrated solar power and concentrated photovoltaic technologies. In the present work, we propose a methodology for generating synthetic hourly DNI data from hourly average GHI values by dividing the irradiance into a deterministic and a stochastic component, intending to emulate the dynamics of the solar radiation. The deterministic component is modeled through a simple classical model. The stochastic component is fitted to measured data in order to maintain the consistency of the synthetic data with the state of the sky, generating statistically significant DNI data with a cumulative frequency distribution very similar to that of the measured data. The adaptation and application of the model to the location of Seville shows significant improvements in terms of frequency distribution over the classical models. The proposed methodology applied to other locations with different climatological characteristics also yields better results than the classical models in terms of frequency distribution, reaching a reduction of 50% in the Finkelstein-Schafer (FS) and Kolmogorov-Smirnov test integral (KSI) statistics.
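A minimal sketch of the general idea, a deterministic GHI-to-DNI split based on a simple clearness-index correlation plus a first-order autoregressive stochastic residual; the correlation coefficients and noise parameters are illustrative and are not the fitted values from the paper:

```python
import numpy as np

def synthetic_dni(ghi, cos_zenith, g_ext=1361.0, rho=0.8, sigma=40.0, seed=0):
    """Hourly synthetic DNI (W/m^2) from hourly GHI: a deterministic split based
    on a crude clearness-index correlation plus an AR(1) stochastic residual.
    The correlation coefficients and noise parameters are illustrative only."""
    rng = np.random.default_rng(seed)
    ghi = np.asarray(ghi, dtype=float)
    mu = np.asarray(cos_zenith, dtype=float)
    kt = np.clip(ghi / np.maximum(g_ext * mu, 1e-6), 0.0, 1.0)   # clearness index
    diffuse_frac = np.clip(1.0 - 1.13 * kt, 0.15, 1.0)           # toy diffuse-fraction fit
    dni_det = (1.0 - diffuse_frac) * ghi / np.maximum(mu, 0.1)   # deterministic component
    eps, noise = [], 0.0
    for _ in range(len(ghi)):                                    # AR(1) stochastic component
        noise = rho * noise + np.sqrt(1.0 - rho ** 2) * rng.normal(0.0, sigma)
        eps.append(noise)
    return np.clip(dni_det + np.where(mu > 0.05, eps, 0.0), 0.0, None)

# Example: a toy clear-sky morning-to-noon ramp.
mu = np.linspace(0.05, 0.8, 8)
ghi = 900.0 * mu ** 1.2
print(np.round(synthetic_dni(ghi, mu), 1))
```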
Dynamical behavior of lean swirling premixed flame generated by change in gravitational orientation
NASA Astrophysics Data System (ADS)
Gotoda, Hiroshi; Miyano, Takaya; Shepherd, Ian
2010-11-01
The dynamic behavior of flame front instability in a lean swirling premixed flame generated by the effect of gravitational orientation has been experimentally investigated in this work. When the gravitational direction is changed relative to the flame front, i.e., in inverted gravity, an unstably fluctuating flame (unstable flame) is formed in a limited domain of equivalence ratio and swirl number (Gotoda, H., et al., Physical Review E, vol. 81, 026211, 2010). The time history of flame front fluctuations shows that in the buoyancy-dominated region, chaotic irregular fluctuation with low frequencies is superimposed on the dominant periodic oscillation of the unstable flame. This periodic oscillation is produced by unstable large-scale vortex motion in the combustion products generated by a change in the buoyancy/swirl interaction due to the inversion of gravitational orientation. As a result, the dynamic behavior of the unstable flame becomes low-dimensional deterministic chaos. Its dynamics maintains low-dimensional deterministic chaos even in the momentum-dominated region, in which vortex breakdown in the combustion products clearly occurs. These results were clearly demonstrated by the use of nonlinear time series analysis based on chaos theory, which has not been widely applied to the investigation of combustion phenomena.
Comparative study on neutronics characteristics of a 1500 MWe metal fuel sodium-cooled fast reactor
Ohgama, Kazuya; Aliberti, Gerardo; Stauff, Nicolas E.; ...
2017-02-28
Under the cooperative effort of the Civil Nuclear Energy R&D Working Group within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and Japan Atomic Energy Agency (JAEA) have been performing benchmark study using Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by both institutions show a good agreement with less than 200 pcm of discrepancy on the neutron multiplication factor, and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The results by the stochastic and deterministic approaches were compared in each party to investigate impacts of the deterministic approximation and to understand potential variations in the results due to different calculation methodologies employed. From the detailed analysis of methodologies, it was found that the good agreement in multiplication factor from the deterministic calculations comes from the cancellation of the differences on the methodology (0.4%) and nuclear data (0.6%). The different treatment in reflector cross section generation was estimated as the major cause of the discrepancy between the multiplication factors by the JAEA and ANL deterministic methodologies. Impacts of the nuclear data libraries were also investigated using a sensitivity analysis methodology. Furthermore, the differences on the inelastic scattering cross sections of U-238, ν values and fission cross sections of Pu-239 and µ-average of Na-23 are the major contributors to the difference on the multiplication factors.
Comparative study on neutronics characteristics of a 1500 MWe metal fuel sodium-cooled fast reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ohgama, Kazuya; Aliberti, Gerardo; Stauff, Nicolas E.
Under the cooperative effort of the Civil Nuclear Energy R&D Working Group within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and Japan Atomic Energy Agency (JAEA) have been performing benchmark study using Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by both institutions show a good agreement with less than 200 pcm of discrepancy on the neutron multiplication factor, and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The results by the stochastic and deterministic approaches were compared in each party to investigate impacts of the deterministic approximation and to understand potential variations in the results due to different calculation methodologies employed. From the detailed analysis of methodologies, it was found that the good agreement in multiplication factor from the deterministic calculations comes from the cancellation of the differences on the methodology (0.4%) and nuclear data (0.6%). The different treatment in reflector cross section generation was estimated as the major cause of the discrepancy between the multiplication factors by the JAEA and ANL deterministic methodologies. Impacts of the nuclear data libraries were also investigated using a sensitivity analysis methodology. Furthermore, the differences on the inelastic scattering cross sections of U-238, ν values and fission cross sections of Pu-239 and µ-average of Na-23 are the major contributors to the difference on the multiplication factors.
Deterministic entanglement of superconducting qubits by parity measurement and feedback.
Ristè, D; Dukalski, M; Watson, C A; de Lange, G; Tiggelman, M J; Blanter, Ya M; Lehnert, K W; Schouten, R N; DiCarlo, L
2013-10-17
The stochastic evolution of quantum systems during measurement is arguably the most enigmatic feature of quantum mechanics. Measuring a quantum system typically steers it towards a classical state, destroying the coherence of an initial quantum superposition and the entanglement with other quantum systems. Remarkably, the measurement of a shared property between non-interacting quantum systems can generate entanglement, starting from an uncorrelated state. Of special interest in quantum computing is the parity measurement, which projects the state of multiple qubits (quantum bits) to a state with an even or odd number of excited qubits. A parity meter must discern the two qubit-excitation parities with high fidelity while preserving coherence between same-parity states. Despite numerous proposals for atomic, semiconducting and superconducting qubits, realizing a parity meter that creates entanglement for both even and odd measurement results has remained an outstanding challenge. Here we perform a time-resolved, continuous parity measurement of two superconducting qubits using the cavity in a three-dimensional circuit quantum electrodynamics architecture and phase-sensitive parametric amplification. Using postselection, we produce entanglement by parity measurement reaching 88 per cent fidelity to the closest Bell state. Incorporating the parity meter in a feedback-control loop, we transform the entanglement generation from probabilistic to fully deterministic, achieving 66 per cent fidelity to a target Bell state on demand. These realizations of a parity meter and a feedback-enabled deterministic measurement protocol provide key ingredients for active quantum error correction in the solid state.
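To make the parity-plus-feedback idea concrete, the following is a minimal numpy sketch of an idealized projective version of the scheme described above; the actual experiment uses a continuous dispersive measurement, so the state preparation, projectors, and feedback pulse below are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Idealized two-qubit parity measurement with feedback (projective toy model).
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
psi = np.kron(plus, plus)                      # initial unentangled state |+>|+>

# Parity projectors: even = {|00>, |11>}, odd = {|01>, |10>}
basis = [np.kron(a, b) for a in (ket0, ket1) for b in (ket0, ket1)]
P_even = np.outer(basis[0], basis[0]) + np.outer(basis[3], basis[3])
P_odd = np.outer(basis[1], basis[1]) + np.outer(basis[2], basis[2])

bell = (basis[0] + basis[3]) / np.sqrt(2)      # target Bell state (|00>+|11>)/sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
X2 = np.kron(np.eye(2), X)                     # feedback: pi-pulse on qubit 2

rng = np.random.default_rng(0)
for _ in range(5):
    p_even = np.linalg.norm(P_even @ psi) ** 2
    outcome_even = rng.random() < p_even       # stochastic measurement outcome
    post = (P_even if outcome_even else P_odd) @ psi
    post /= np.linalg.norm(post)
    if not outcome_even:                       # feedback makes the result deterministic
        post = X2 @ post
    print(outcome_even, abs(bell @ post) ** 2) # fidelity to the target Bell state -> 1.0
```

In this ideal limit the feedback step maps both measurement outcomes onto the same Bell state, which is the sense in which the protocol turns probabilistic entanglement generation into a deterministic one.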
The effects of demand uncertainty on strategic gaming in the merit-order electricity pool market
NASA Astrophysics Data System (ADS)
Frem, Bassam
In a merit-order electricity pool market, generating companies (Gencos) game with their offered incremental cost to meet the electricity demand and earn bigger market shares and higher profits. However, when the demand is treated as a random variable instead of as a known constant, these Genco gaming strategies become more complex. After a brief introduction to electricity markets and gaming, the effects of demand uncertainty on strategic gaming are studied in two parts: (1) demand modelled as a discrete random variable; (2) demand modelled as a continuous random variable. In the first part, we propose the discrete stochastic strategy (DSS) algorithm, which generates a strategic set of offers from the perspective of the Gencos' profits. The DSS offers were tested and compared to the deterministic Nash equilibrium (NE) offers based on the predicted demand. This comparison, based on the expected Genco profits, showed the DSS to be a better strategy in a probabilistic sense than the deterministic NE. In the second part, we present three gaming strategies: (1) deterministic NE; (2) No-Risk; (3) Risk-Taking. The strategies were then tested and their profit performances were compared using two assessment tools: (a) expected value and standard deviation; (b) inverse cumulative distribution. We concluded that despite yielding higher profit performance under the right conjectures, Risk-Taking strategies are very sensitive to incorrect conjectures about the competitors' gaming decisions. As such, despite its lower profit performance, the No-Risk strategy was deemed preferable.
Barellini, A; Bogi, L; Licitra, G; Silvi, A M; Zari, A
2009-12-01
Air traffic control (ATC) primary radars are 'classical' radars that use echoes of radiofrequency (RF) pulses from aircraft to determine their position. High-power RF pulses radiated from radar antennas may produce high electromagnetic field levels in the surrounding area. Measurement of electromagnetic fields produced by RF-pulsed radar by means of a swept-tuned spectrum analyser are investigated here. Measurements have been carried out both in the laboratory and in situ on signals generated by an ATC primary radar.
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
In Search of Determinism-Sensitive Region to Avoid Artefacts in Recurrence Plots
NASA Astrophysics Data System (ADS)
Wendi, Dadiyorto; Marwan, Norbert; Merz, Bruno
As an effort to reduce parameter uncertainties in constructing recurrence plots, and in particular to avoid potential artefacts, this paper presents a technique to derive an artefact-safe region of parameter sets. The technique exploits both the deterministic (incl. chaotic) and stochastic signal characteristics of a recurrence quantification measure (i.e., diagonal structures). It is useful when the evaluated signal is known to be deterministic. The study focuses on recurrence plots generated from the reconstructed phase space, in order to represent the many real application scenarios in which not all variables describing a system are available (data scarcity). The technique involves random shuffling of the original signal to destroy its original deterministic characteristics. Its purpose is to evaluate whether the determinism values of the original and the shuffled signal remain close together, which would suggest that the recurrence plot might comprise artefacts. The use of such a determinism-sensitive region should be accompanied by standard embedding optimization approaches, e.g., indices like the false nearest neighbor and mutual information, to yield a more reliable recurrence plot parameterization.
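The shuffle test above can be illustrated with a short numpy sketch: build a recurrence plot on a delay-embedded signal and compare a determinism measure for the original and a randomly shuffled copy. The embedding parameters, threshold heuristic, and the simplified DET proxy below are assumptions of this sketch, not the paper's exact procedure.

```python
import numpy as np

def embed(x, dim=3, tau=2):
    """Time-delay embedding of a 1-D signal."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def det_proxy(x, dim=3, tau=2, eps=None):
    """Determinism proxy: fraction of recurrence points that sit on a
    diagonal line of length >= 2 (a simplification of the usual DET)."""
    X = embed(x, dim, tau)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * D.std()                     # heuristic threshold choice
    R = (D < eps).astype(int)
    np.fill_diagonal(R, 0)
    diag_neighbor = np.zeros_like(R)
    diag_neighbor[1:, 1:] += R[:-1, :-1]        # predecessor on the same diagonal
    diag_neighbor[:-1, :-1] += R[1:, 1:]        # or a successor
    on_lines = ((R == 1) & (diag_neighbor > 0)).sum()
    return on_lines / max(R.sum(), 1)

rng = np.random.default_rng(1)
t = np.arange(800)
x = np.sin(0.07 * t) + 0.1 * rng.standard_normal(len(t))  # deterministic + small noise
shuffled = rng.permutation(x)                              # destroys determinism

print("DET original:", det_proxy(x))
print("DET shuffled:", det_proxy(shuffled))
# If the two values stay close for a given (dim, tau, eps), that parameter set
# would be flagged as artefact-prone in the sense described above.
```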
Deterministic time-reversible thermostats: chaos, ergodicity, and the zeroth law of thermodynamics
NASA Astrophysics Data System (ADS)
Patra, Puneet Kumar; Sprott, Julien Clinton; Hoover, William Graham; Griswold Hoover, Carol
2015-09-01
The relative stability and ergodicity of deterministic time-reversible thermostats, both singly and in coupled pairs, are assessed through their Lyapunov spectra. Five types of thermostat are coupled to one another through a single Hooke's-law harmonic spring. The resulting dynamics shows that three specific thermostat types, Hoover-Holian, Ju-Bulgac, and Martyna-Klein-Tuckerman, have very similar Lyapunov spectra in their equilibrium four-dimensional phase spaces and when coupled in equilibrium or nonequilibrium pairs. All three of these oscillator-based thermostats are shown to be ergodic, with smooth analytic Gaussian distributions in their extended phase spaces (coordinate, momentum, and two control variables). Evidently these three ergodic and time-reversible thermostat types are particularly useful as statistical-mechanical thermometers and thermostats. Each of them generates Gibbs' universal canonical distribution internally as well as for systems to which they are coupled. Thus they obey the zeroth law of thermodynamics, as a good heat bath should. They also provide dissipative heat flow with relatively small nonlinearity when two or more such temperature baths interact and provide useful deterministic replacements for the stochastic Langevin equation.
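As a small illustration of one of the ergodic thermostats named above, the sketch below integrates a single harmonic oscillator under the Hoover-Holian thermostat and checks the second and fourth velocity moments against their canonical values. The equations of motion are written as they are commonly cited for this thermostat and should be treated as an assumption of the sketch; the integrator and run length are purely illustrative.

```python
import numpy as np

# Hoover-Holian thermostatted harmonic oscillator (form assumed here):
#   dq/dt = p,  dp/dt = -q - zeta*p - xi*p**3,
#   dzeta/dt = p**2 - T,  dxi/dt = p**4 - 3*p**2*T
def hh_rhs(s, T=1.0):
    q, p, zeta, xi = s
    return np.array([p,
                     -q - zeta * p - xi * p ** 3,
                     p ** 2 - T,
                     p ** 4 - 3.0 * p ** 2 * T])

def rk4_step(s, dt, T=1.0):
    k1 = hh_rhs(s, T)
    k2 = hh_rhs(s + 0.5 * dt * k1, T)
    k3 = hh_rhs(s + 0.5 * dt * k2, T)
    k4 = hh_rhs(s + dt * k3, T)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

s = np.array([0.0, 1.0, 0.0, 0.0])       # (q, p, zeta, xi)
dt, nsteps = 0.005, 500_000
p2_sum = p4_sum = 0.0
for _ in range(nsteps):
    s = rk4_step(s, dt)
    p2_sum += s[1] ** 2
    p4_sum += s[1] ** 4

# For a canonical (Gibbs) distribution at T = 1: <p^2> = 1 and <p^4> = 3.
print("<p^2> =", p2_sum / nsteps, " <p^4> =", p4_sum / nsteps)
```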
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, T.; Smith, K.S.; Severino, F.
A critical capability of the new RHIC low level rf (LLRF) system is the ability to synchronize signals across multiple locations. The 'Update Link' provides this functionality. The 'Update Link' is a deterministic serial data link based on the Xilinx RocketIO protocol that is broadcast over fiber optic cable at 1 gigabit per second (Gbps). The link provides timing events and data packets as well as time stamp information for synchronizing diagnostic data from multiple sources. The new RHIC LLRF was designed to be a flexible, modular system. The system is constructed of numerous independent RF Controller chassis. To provide synchronization among all of these chassis, the Update Link system was designed. The Update Link system provides a low latency, deterministic data path to broadcast information to all receivers in the system. The Update Link system is based on a central hub, the Update Link Master (ULM), which generates the data stream that is distributed via fiber optic links. Downstream chassis have non-deterministic connections back to the ULM that allow any chassis to provide data that is broadcast globally.
Canino-Rodríguez, José M; García-Herrero, Jesús; Besada-Portas, Juan; Ravelo-García, Antonio G; Travieso-González, Carlos; Alonso-Hernández, Jesús B
2015-03-04
The limited efficiency of current air traffic systems will require a next generation of Smart Air Traffic Systems (SATS) that relies on current technological advances. This challenge means a transition toward a new navigation and air-traffic procedures paradigm, where pilots and air traffic controllers perform and coordinate their activities according to new roles and technological supports. The design of new Human-Computer Interactions (HCI) for performing these activities is a key element of SATS. However, efforts to develop such tools need to be inspired by a parallel characterization of hypothetical air traffic scenarios compatible with current ones. This paper focuses on airborne HCI within SATS, where cockpit inputs come from aircraft navigation systems, the surrounding traffic situation, controllers' indications, etc. The HCI is thus intended to enhance situation awareness and decision-making through the pilot cockpit. This work considers SATS as a large-scale distributed system operating under uncertainty in a dynamic environment. Therefore, a multi-agent systems based approach is well suited for modeling such an environment. We demonstrate that current methodologies for designing multi-agent systems are a useful tool to characterize HCI. We specifically illustrate how the selected methodological approach provides enough guidelines to obtain a cockpit HCI design that complies with future SATS specifications.
Switching performance of OBS network model under prefetched real traffic
NASA Astrophysics Data System (ADS)
Huang, Zhenhua; Xu, Du; Lei, Wen
2005-11-01
Optical Burst Switching (OBS) [1] is now widely considered an efficient switching technique for building the next-generation optical Internet, so it is important to evaluate the performance of the OBS network model precisely. The performance of the OBS network model varies with conditions, but the most important question is how it performs under real traffic load. In traditional simulation models, uniform traffic is usually generated by simulation software to imitate the data source of the edge node in the OBS network model, and through this the performance of the OBS network is evaluated. Unfortunately, without being driven by real traffic, the traditional simulation models have several problems and their results are questionable. To deal with this problem, we present a new simulation model for analysis and performance evaluation of the OBS network, which uses prefetched IP traffic as the data source of the OBS network model. The prefetched IP traffic can be considered a real IP source for the OBS edge node, and the OBS network model has the same clock rate as a real OBS system. It is therefore easy to conclude that this model is closer to the real OBS system than the traditional ones. The simulation results also indicate that this model is more accurate for evaluating the performance of the OBS network system and that its results are closer to the actual situation.
DOT National Transportation Integrated Search
2017-11-01
With the emergence of data generated from connected vehicles, connected travelers, and connected infrastructure, the capabilities of traffic management systems or centers (TMCs) will need to be improved to allow agencies to compile and benefit from u...
Modeling and Impacts of Traffic Emissions on Air Toxics Concentrations near Roadways
The dispersion formulation incorporated in the U.S. Environmental Protection Agency’s AERMOD regulatory dispersion model is used to estimate the contribution of traffic-generated emissions of select VOCs – benzene, 1,3-butadiene, toluene – to ambient air concentrations at downwin...
DOT National Transportation Integrated Search
2014-09-09
Automatic Dependent Surveillance-Broadcast (ADS-B) In technology supports the display of traffic data on Cockpit Displays of Traffic Information (CDTIs). The data are used by flightcrews to perform defined self-separation procedures, such as the in-t...
Conducting Safe and Efficient Airport Surface Operations in a NextGen Environment
NASA Technical Reports Server (NTRS)
Jones, Denise R.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Barnes, James R.
2016-01-01
The Next Generation Air Transportation System (NextGen) vision proposes many revolutionary operational concepts, such as surface trajectory-based operations (STBO) and technologies, including display of traffic information and movements, airport moving maps (AMM), and proactive alerts of runway incursions and surface traffic conflicts, to deliver an overall increase in system capacity and safety. A piloted simulation study was conducted at the National Aeronautics and Space Administration (NASA) Langley Research Center to evaluate the ability of a flight crew to conduct safe and efficient airport surface operations while utilizing an AMM. Position accuracy of traffic was varied, and the effect of traffic position accuracy on airport conflict detection and resolution (CD&R) capability was measured. Another goal was to evaluate the crew's ability to safely conduct STBO by assessing the impact of providing traffic intent information, CD&R system capability, and the display of STBO guidance to the flight crew on both head-down and head-up displays (HUD). Nominal scenarios and off-nominal conflict scenarios were conducted using 12 airline crews operating in a simulated Memphis International Airport terminal environment. The data suggest that all traffic should be shown on the airport moving map, whether qualified or unqualified, and conflict detection and resolution technologies provide significant safety benefits. Despite the presence of traffic information on the map, collisions or near-collisions still occurred; when indications or alerts were generated in these same scenarios, the incidents were averted. During the STBO testing, the flight crews met their required time-of-arrival at route end within 10 seconds on 98 percent of the trials, well within the acceptable performance bounds of 15 seconds. Traffic intent information was found to be useful in determining the intent of conflicting traffic, with graphical presentation preferred. The CD&R system was only minimally effective during STBO because the prevailing visibility was sufficient for visual detection of conflicting traffic. Overall, the pilots indicated STBO increased general situation awareness but also negatively impacted workload, reduced the ability to watch for other traffic, and increased head-down time.
A Flexible Spatio-Temporal Model for Air Pollution with Spatial and Spatio-Temporal Covariates.
Lindström, Johan; Szpiro, Adam A; Sampson, Paul D; Oron, Assaf P; Richards, Mark; Larson, Tim V; Sheppard, Lianne
2014-09-01
The development of models that provide accurate spatio-temporal predictions of ambient air pollution at small spatial scales is of great importance for the assessment of potential health effects of air pollution. Here we present a spatio-temporal framework that predicts ambient air pollution by combining data from several different monitoring networks and deterministic air pollution model(s) with geographic information system (GIS) covariates. The model presented in this paper has been implemented in an R package, SpatioTemporal, available on CRAN. The model is used by the EPA funded Multi-Ethnic Study of Atherosclerosis and Air Pollution (MESA Air) to produce estimates of ambient air pollution; MESA Air uses the estimates to investigate the relationship between chronic exposure to air pollution and cardiovascular disease. In this paper we use the model to predict long-term average concentrations of NOx in the Los Angeles area during a ten year period. Predictions are based on measurements from the EPA Air Quality System, MESA Air specific monitoring, and output from a source dispersion model for traffic related air pollution (Caline3QHCR). Accuracy in predicting long-term average concentrations is evaluated using an elaborate cross-validation setup that accounts for a sparse spatio-temporal sampling pattern in the data, and adjusts for temporal effects. The predictive ability of the model is good with cross-validated R² of approximately 0.7 at subject sites. Replacing four geographic covariate indicators of traffic density with the Caline3QHCR dispersion model output resulted in very similar prediction accuracy from a more parsimonious and more interpretable model. Adding traffic-related geographic covariates to the model that included Caline3QHCR did not further improve the prediction accuracy.
A Model for Risk Analysis of Oil Tankers
NASA Astrophysics Data System (ADS)
Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti
2010-01-01
The paper presents a model for risk analysis of marine traffic, with emphasis on the two most common types of marine accident: collision and grounding. The focus is on oil tankers, as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. The model is thus regarded as a block-type model, consisting of blocks for estimating the probability of collision and grounding, as well as blocks for modelling the consequences of an accident. The probability of a collision is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address the waterway crossing between Helsinki and Tallinn, where dense cross traffic is observed during certain hours. For assessment of the grounding probability, a new approach is proposed, which utilizes a newly developed model in which spatial interactions between objects in different locations are recognized. A ship on a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. The risk due to tankers running aground addresses an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms and concern the costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by the parties involved.
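The block structure described above reduces to risk = probability of an unwanted event times its expected consequence. The sketch below shows that arithmetic only; the probabilities and spill costs are hypothetical placeholders, not values from the Gulf of Finland case study.

```python
# Minimal sketch of the block-type risk formula: risk = P(accident) * E[consequence].
# All numbers below are hypothetical placeholders.
scenarios = {
    # name: (annual probability of the accident, expected spill cost in EUR)
    "tanker collision (crossing traffic)": (2.0e-3, 40e6),
    "tanker grounding (approach fairway)": (8.0e-4, 25e6),
}

for name, (prob, cost) in scenarios.items():
    risk = prob * cost                       # expected annual loss for this block
    print(f"{name}: expected annual loss = {risk:,.0f} EUR")

total = sum(p * c for p, c in scenarios.values())
print(f"total expected annual loss = {total:,.0f} EUR")
```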
Synchrony and entrainment properties of robust circadian oscillators
Bagheri, Neda; Taylor, Stephanie R.; Meeker, Kirsten; Petzold, Linda R.; Doyle, Francis J.
2008-01-01
Systems theoretic tools (i.e. mathematical modelling, control, and feedback design) advance the understanding of robust performance in complex biological networks. We highlight phase entrainment as a key performance measure used to investigate dynamics of a single deterministic circadian oscillator for the purpose of generating insight into the behaviour of a population of (synchronized) oscillators. More specifically, the analysis of phase characteristics may facilitate the identification of appropriate coupling mechanisms for the ensemble of noisy (stochastic) circadian clocks. Phase also serves as a critical control objective to correct mismatch between the biological clock and its environment. Thus, we introduce methods of investigating synchrony and entrainment in both stochastic and deterministic frameworks, and as a property of a single oscillator or population of coupled oscillators. PMID:18426774
Will systems biology offer new holistic paradigms to life sciences?
Conti, Filippo; Valerio, Maria Cristina; Zbilut, Joseph P.
2008-01-01
A biological system, like any complex system, blends stochastic and deterministic features, displaying properties of both. In a certain sense, this blend is exactly what we perceive as the “essence of complexity”, given that we tend to consider as non-complex both an ideal gas (fully stochastic and understandable at the statistical level in the thermodynamic limit of a huge number of particles) and a frictionless pendulum (fully deterministic relative to its motion). In this commentary we argue that systems biology will have a relevant impact on present-day biology if (and only if) it is able to capture the essential character of this blend, which in our opinion is the generation of globally ordered collective modes supported by locally stochastic atomisms. PMID:19003440
Saleh, Khaled; Hossny, Mohammed; Nahavandi, Saeid
2018-06-12
Traffic collisions between kangaroos and motorists are on the rise on Australian roads. According to a recent report, more than 20,000 kangaroo-vehicle collisions were estimated to have occurred in Australia during 2015 alone. In this work, we propose a vehicle-based framework for kangaroo detection in urban and highway traffic environments that could be used for collision warning systems. Our proposed framework is based on region-based convolutional neural networks (RCNN). Given the scarcity of labeled data of kangaroos in traffic environments, we utilized our state-of-the-art data generation pipeline to generate 17,000 synthetic depth images of traffic scenes with kangaroo instances annotated in them. We trained our proposed RCNN-based framework on a subset of the generated synthetic depth images dataset. The proposed framework achieved an average precision (AP) score of 92% over all the testing synthetic depth image datasets. We compared our proposed framework against other baseline approaches and outperformed them by more than 37% in AP score over all the testing datasets. Additionally, we evaluated the generalization performance of the proposed framework on real live data and achieved resilient detection accuracy without any further fine-tuning of our proposed RCNN-based framework.
The seismic traffic footprint: Tracking trains, aircraft, and cars seismically
NASA Astrophysics Data System (ADS)
Riahi, Nima; Gerstoft, Peter
2015-04-01
Although naturally occurring vibrations have proven useful to probe the subsurface, the vibrations caused by traffic have not been explored much. Such data, however, are less sensitive to weather and low visibility compared to some common out-of-road traffic sensing systems. We study traffic-generated seismic noise measured by an array of 5200 geophones that covered a 7 × 10 km area in Long Beach (California, USA) with a receiver spacing of 100 m. This allows us to look into urban vibrations below the resolution of a typical city block. The spatiotemporal structure of the anthropogenic seismic noise intensity reveals the Blue Line Metro train activity, departing and landing aircraft in Long Beach Airport and their acceleration, and gives clues about traffic movement along the I-405 highway at night. As low-cost, stand-alone seismic sensors are becoming more common, these findings indicate that seismic data may be useful for traffic monitoring.
ATC simulation of helicopter IFR approaches into major terminal areas using RNAV, MLS, and CDTI
NASA Technical Reports Server (NTRS)
Tobias, L.; Lee, H. Q.; Peach, L. L.; Willett, F. M., Jr.; Obrien, P. J.
1981-01-01
The introduction of independent helicopter IFR routes at hub airports was investigated in a real time air traffic control system simulation involving a piloted helicopter simulator, computer generated air traffic, and air traffic controllers. The helicopter simulator was equipped to fly area navigation (RNAV) routes and microwave landing system approaches. Problems studied included: (1) pilot acceptance of the approach procedure and tracking accuracy; (2) ATC procedures for handling a mix of helicopter and fixed wing traffic; and (3) utility of the cockpit display of traffic information (CDTI) for the helicopter in the hub airport environment. Results indicate that the helicopter routes were acceptable to the subject pilots and were noninterfering with fixed wing traffic. Merging and spacing maneuvers using CDTI were successfully carried out by the pilots, but controllers had some reservations concerning the acceptability of the CDTI procedures.
Traffic intensity monitoring using multiple object detection with traffic surveillance cameras
NASA Astrophysics Data System (ADS)
Hamdan, H. G. Muhammad; Khalifah, O. O.
2017-11-01
Object detection and tracking is a field of research with many applications in the current generation, given the increasing number of cameras on the streets and the lower cost of the Internet of Things (IoT). In this paper, a traffic intensity monitoring system based on the macroscopic urban traffic model is proposed, using computer vision as its data source. The input of the system is extracted from a traffic surveillance camera, on which a neural network classifier that can identify and differentiate vehicle types is implemented. The neural network toolbox is trained with positive and negative inputs to increase accuracy. The accuracy of the program is compared to other related work, and the trend of the traffic intensity on a road is also calculated. Finally, limitations and future work are discussed.
Developing Stochastic Models as Inputs for High-Frequency Ground Motion Simulations
NASA Astrophysics Data System (ADS)
Savran, William Harvey
High-frequency (10 Hz) deterministic ground motion simulations are challenged by our understanding of the small-scale structure of the earth's crust and the rupture process during an earthquake. We will likely never obtain deterministic models that can accurately describe these processes down to the meter-scale lengths required for broadband wave propagation. Instead, we can attempt to explain the behavior, in a statistical sense, by including stochastic models defined by correlations observed in the natural earth and through physics-based simulations of the earthquake rupture process. Toward this goal, we develop stochastic models to address both of the primary considerations for deterministic ground motion simulations: the description of the material properties in the crust, and broadband earthquake source descriptions. Using borehole sonic log data recorded in the Los Angeles basin, we estimate the spatial correlation structure of the small-scale fluctuations in P-wave velocities by determining the best-fitting parameters of a von Karman correlation function. We find that Hurst exponents, nu, between 0.0-0.2, vertical correlation lengths, az, of 15-150 m, and a standard deviation, sigma, of about 5% characterize the variability in the borehole data. Using these parameters, we generated a stochastic model of velocity and density perturbations, combined it with leading seismic velocity models, and performed a validation exercise for the 2008 Chino Hills, CA, earthquake using heterogeneous media. We find that models of velocity and density perturbations can have significant effects on the wavefield at frequencies as low as 0.3 Hz, with ensemble median values of various ground motion metrics varying up to +/-50% at certain stations compared to those computed solely from the CVM. Finally, we develop a kinematic rupture generator based on dynamic rupture simulations on geometrically complex faults. We analyze 100 dynamic rupture simulations on strike-slip faults ranging from Mw 6.4-7.2. We find that our dynamic simulations follow empirical scaling relationships for inter-plate strike-slip events and provide source spectra comparable with an omega-squared model. Our rupture generator reproduces GMPE medians and intra-event standard deviations of spectral accelerations for an ensemble of 10 Hz fully deterministic ground motion simulations, as compared to NGA-West2 GMPE relationships up to 0.2 seconds.
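A simple way to illustrate the first ingredient above is to generate a 1-D von Karman-correlated perturbation field by spectral filtering of white noise, using a Hurst exponent, correlation length, and standard deviation in the quoted ranges. The spectral shape below follows the usual von Karman form, but the normalization is handled empirically (rescaling to the target sigma), so treat the sketch as illustrative rather than the dissertation's exact construction.

```python
import numpy as np

def von_karman_1d(n, dz, a_z=50.0, nu=0.1, sigma=0.05, seed=0):
    """Generate a 1-D von Karman-correlated perturbation field by filtering
    white noise with the square root of a von Karman-type power spectrum,
    P(k) ~ (1 + (k*a_z)**2) ** -(nu + 0.5).  Amplitude is rescaled
    empirically to the requested standard deviation."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=dz) * 2.0 * np.pi           # angular wavenumbers
    amp = (1.0 + (k * a_z) ** 2) ** (-(nu + 0.5) / 2.0)  # sqrt of the PSD shape
    spec = amp * (rng.standard_normal(len(k)) + 1j * rng.standard_normal(len(k)))
    field = np.fft.irfft(spec, n=n)
    field *= sigma / field.std()                         # enforce ~5% perturbations
    return field

# Example: a 3 km profile sampled every 5 m with parameters in the ranges above.
dv = von_karman_1d(n=600, dz=5.0, a_z=50.0, nu=0.1, sigma=0.05)
print(dv.std(), dv.min(), dv.max())
# dv can then multiply a background velocity profile: v_perturbed = v0 * (1 + dv)
```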
The role of vegetation in mitigating air quality impacts from traffic emissions--journal
On April 27-28, 2019, a multi-disciplinary group of researchers and policymakers met to discuss the state-of-the-science regarding the potential of roadside vegetation to mitigate near-road air quality impacts. Concerns over population exposures to traffic-generated pollutants ne...
NASA Astrophysics Data System (ADS)
Balouchestani, Mohammadreza
2017-05-01
Network traffic, or data traffic, in a Wireless Local Area Network (WLAN) is the amount of network packets moving across the wireless network from one wireless node to another, and it constitutes the sampling load of the network. A WLAN's network traffic is the main component for network traffic measurement, network traffic control, and simulation. Traffic classification is an essential tool for improving the Quality of Service (QoS) in different wireless networks and complex applications such as local area networks, wireless local area networks, wireless personal area networks, wireless metropolitan area networks, and wide area networks. Network traffic classification is also an essential component of products for QoS control in different wireless network systems and applications. Classifying network traffic in a WLAN allows one to see what kinds of traffic are present in each part of the network, to organize the various kinds of traffic on each path into different classes, and to generate a network traffic matrix in order to identify and organize network traffic, which is an important key for improving QoS. To achieve effective network traffic classification, a Real-time Network Traffic Classification (RNTC) algorithm for WLANs based on Compressed Sensing (CS) is presented in this paper. The fundamental goal of this algorithm is to solve difficult wireless network management problems. The proposed architecture reduces the False Detection Rate (FDR) to 25% and the Packet Delay (PD) to 15%. The proposed architecture also increases the accuracy of wireless transmission by 10%, which provides a good foundation for establishing high-quality wireless local area networks.
Design and implementation of a telecommunication interface for the TAATM/TCV real-time experiment
NASA Technical Reports Server (NTRS)
Nolan, J. D.
1981-01-01
The traffic situation display experiment of the terminal configured vehicle (TCV) research program requires a bidirectional data communications tie line to a computer complex. The tie line is used in a real-time environment on the CYBER 175 computer by the terminal area air traffic model (TAATM) simulation program. Aircraft position data are processed by TAATM, with the resultant output sent to the facility for the generation of air traffic situation displays, which are transmitted to a research aircraft.
Nextgen Technologies for Mid-Term and Far-Term Air Traffic Control Operations
NASA Technical Reports Server (NTRS)
Prevot, Thomas
2009-01-01
This paper describes technologies for mid-term and far-term air traffic control operations in the Next Generation Air Transportation System (NextGen). The technologies were developed and evaluated with human-in-the-loop simulations in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The simulations were funded by several research focus areas within NASA's Airspace Systems program and some were co-funded by the FAA's Air Traffic Organization for Planning, Research and Technology.
Virtualized Traffic: reconstructing traffic flows from discrete spatiotemporal data.
Sewall, Jason; van den Berg, Jur; Lin, Ming C; Manocha, Dinesh
2011-01-01
We present a novel concept, Virtualized Traffic, to reconstruct and visualize continuous traffic flows from discrete spatiotemporal data provided by traffic sensors or generated artificially to enhance a sense of immersion in a dynamic virtual world. Given the positions of each car at two recorded locations on a highway and the corresponding time instances, our approach can reconstruct the traffic flows (i.e., the dynamic motions of multiple cars over time) between the two locations along the highway for immersive visualization of virtual cities or other environments. Our algorithm is applicable to high-density traffic on highways with an arbitrary number of lanes and takes into account the geometric, kinematic, and dynamic constraints on the cars. Our method reconstructs the car motion that automatically minimizes the number of lane changes, respects safety distance to other cars, and computes the acceleration necessary to obtain a smooth traffic flow subject to the given constraints. Furthermore, our framework can process a continuous stream of input data in real time, enabling the users to view virtualized traffic events in a virtual world as they occur. We demonstrate our reconstruction technique with both synthetic and real-world input. © 2011 IEEE Published by the IEEE Computer Society
NASA Astrophysics Data System (ADS)
Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine
2017-06-01
One of the most hazardous physical polluting agents, considering its effects on human health, is acoustical noise. Airports are a strong source of acoustical noise, due to airplane turbines, the aerodynamic noise of transits, the acceleration or braking during the take-off and landing phases of aircraft, the road traffic around the airport, etc. The monitoring and prediction of the acoustical levels emitted by airports can be very useful for assessing the impact on human health and activities. In the airport noise scenario, thanks to flight scheduling, the predominant sources may have a periodic behaviour. Thus, a Time Series Analysis approach can be adopted, considering that a general trend and a seasonal behaviour can be highlighted and used to build a predictive model. In this paper, two different approaches are adopted, and two predictive models are constructed and tested. The first model is based on deterministic decomposition and is built by composing the trend, that is the long-term behaviour, the seasonality, that is the periodic component, and the random variations. The second model is based on a seasonal autoregressive moving average, and it belongs to the stochastic class of models. The two models are fitted on an acoustical level dataset collected close to the Nice (France) international airport. The results are encouraging and show good prediction performance for both of the adopted strategies. A residual analysis is performed in order to quantify the features of the forecasting error.
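The two strategies above map naturally onto standard time-series tooling. The sketch below uses statsmodels on a synthetic hourly noise-level series with a daily cycle standing in for the Nice airport data; the series, the decomposition period, and the SARIMA orders are all illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-in for an hourly equivalent-level series with a daily cycle.
rng = np.random.default_rng(0)
hours = pd.date_range("2016-01-01", periods=24 * 60, freq="h")
leq = (60 + 0.01 * np.arange(len(hours))                      # slow trend
       + 6 * np.sin(2 * np.pi * np.arange(len(hours)) / 24)   # daily seasonality
       + rng.normal(0, 1.5, len(hours)))                      # random component
series = pd.Series(leq, index=hours)
train, test = series[:-48], series[-48:]

# Strategy 1: deterministic decomposition into trend + seasonal + residual
# (the trend and seasonal components can then be extrapolated for prediction).
decomp = seasonal_decompose(train, model="additive", period=24)

# Strategy 2: seasonal ARIMA (orders chosen purely for illustration).
fit = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
forecast = fit.forecast(steps=48)

mae = (forecast - test).abs().mean()
print("SARIMA 48-h forecast MAE [dB]:", round(mae, 2))
```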
Optimizing integrated airport surface and terminal airspace operations under uncertainty
NASA Astrophysics Data System (ADS)
Bosson, Christabelle S.
In airports and surrounding terminal airspaces, the integration of surface, arrival and departure scheduling and routing have the potential to improve the operations efficiency. Moreover, because both the airport surface and the terminal airspace are often altered by random perturbations, the consideration of uncertainty in flight schedules is crucial to improve the design of robust flight schedules. Previous research mainly focused on independently solving arrival scheduling problems, departure scheduling problems and surface management scheduling problems and most of the developed models are deterministic. This dissertation presents an alternate method to model the integrated operations by using a machine job-shop scheduling formulation. A multistage stochastic programming approach is chosen to formulate the problem in the presence of uncertainty and candidate solutions are obtained by solving sample average approximation problems with finite sample size. The developed mixed-integer-linear-programming algorithm-based scheduler is capable of computing optimal aircraft schedules and routings that reflect the integration of air and ground operations. The assembled methodology is applied to a Los Angeles case study. To show the benefits of integrated operations over First-Come-First-Served, a preliminary proof-of-concept is conducted for a set of fourteen aircraft evolving under deterministic conditions in a model of the Los Angeles International Airport surface and surrounding terminal areas. Using historical data, a representative 30-minute traffic schedule and aircraft mix scenario is constructed. The results of the Los Angeles application show that the integration of air and ground operations and the use of a time-based separation strategy enable both significant surface and air time savings. The solution computed by the optimization provides a more efficient routing and scheduling than the First-Come-First-Served solution. Additionally, a data driven analysis is performed for the Los Angeles environment and probabilistic distributions of pertinent uncertainty sources are obtained. A sensitivity analysis is then carried out to assess the methodology performance and find optimal sampling parameters. Finally, simulations of increasing traffic density in the presence of uncertainty are conducted first for integrated arrivals and departures, then for integrated surface and air operations. To compare the optimization results and show the benefits of integrated operations, two aircraft separation methods are implemented that offer different routing options. The simulations of integrated air operations and the simulations of integrated air and surface operations demonstrate that significant traveling time savings, both total and individual surface and air times, can be obtained when more direct routes are allowed to be traveled even in the presence of uncertainty. The resulting routings induce however extra take off delay for departing flights. As a consequence, some flights cannot meet their initial assigned runway slot which engenders runway position shifting when comparing resulting runway sequences computed under both deterministic and stochastic conditions. The optimization is able to compute an optimal runway schedule that represents an optimal balance between total schedule delays and total travel times.
Sailem, Heba; Bousgouni, Vicky; Cooper, Sam; Bakal, Chris
2014-01-22
One goal of cell biology is to understand how cells adopt different shapes in response to varying environmental and cellular conditions. Achieving a comprehensive understanding of the relationship between cell shape and environment requires a systems-level understanding of the signalling networks that respond to external cues and regulate the cytoskeleton. Classical biochemical and genetic approaches have identified thousands of individual components that contribute to cell shape, but it remains difficult to predict how cell shape is generated by the activity of these components using bottom-up approaches because of the complex nature of their interactions in space and time. Here, we describe the regulation of cellular shape by signalling systems using a top-down approach. We first exploit the shape diversity generated by systematic RNAi screening and comprehensively define the shape space a migratory cell explores. We suggest a simple Boolean model involving the activation of Rac and Rho GTPases in two compartments to explain the basis for all cell shapes in the dataset. Critically, we also generate a probabilistic graphical model to show how cells explore this space in a deterministic, rather than a stochastic, fashion. We validate the predictions made by our model using live-cell imaging. Our work explains how cross-talk between Rho and Rac can generate different cell shapes, and thus morphological heterogeneity, in genetically identical populations.
Costs of Limiting Route Optimization to Published Waypoints in the Traffic Aware Planner
NASA Technical Reports Server (NTRS)
Karr, David A.; Vivona, Robert A.; Wing, David J.
2013-01-01
The Traffic Aware Planner (TAP) is an airborne advisory tool that generates optimized, traffic-avoiding routes to support the aircraft crew in making strategic reroute requests to Air Traffic Control (ATC). TAP is derived from a research-prototype self-separation tool, the Autonomous Operations Planner (AOP), in which optimized route modifications that avoid conflicts with traffic and weather, using waypoints at explicit latitudes and longitudes (a technique supported by self-separation concepts), are generated by maneuver patterns applied to the existing route. For use in current-day operations in which trajectory changes must be requested from ATC via voice communication, TAP produces optimized routes described by advisories that use only published waypoints prior to a reconnection waypoint on the existing route. We describe how the relevant algorithms of AOP have been modified to implement this requirement. The modifications include techniques for finding appropriate published waypoints in a maneuver pattern and a method for combining the genetic algorithm of AOP with an exhaustive search of certain types of advisory. We demonstrate methods to investigate the increased computation required by these techniques and to estimate other costs (measured in terms such as time to destination and fuel burned) that may be incurred when only published waypoints are used.
NASA Astrophysics Data System (ADS)
Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.
2015-07-01
Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by a rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly. The consistency and complementarity between both validations are highlighted.
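The two probabilistic scores mentioned above are easy to compute for a single observation; the sketch below uses the commonly cited empirical forms (CRPS = E|X − y| − 0.5·E|X − X′| over ensemble members, and RCRV = (y − mean)/sqrt(σ_obs² + σ_ens²)). The ensemble values and observation error are made-up toy numbers, not results from this experiment.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one observation: E|X - y| - 0.5 * E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
    return term1 - term2

def rcrv(members, obs, obs_err):
    """Reduced centered random variable; over many observations its mean
    should be ~0 (no bias) and its std ~1 (correct dispersion)."""
    members = np.asarray(members, dtype=float)
    return (obs - members.mean()) / np.sqrt(obs_err ** 2 + members.var(ddof=1))

# Toy check with a 60-member ensemble, matching the ensemble size quoted above.
rng = np.random.default_rng(0)
truth = 20.0
ensemble = truth + rng.normal(0.0, 0.5, size=60)   # hypothetical SST ensemble [deg C]
observation = truth + rng.normal(0.0, 0.3)         # hypothetical obs, error 0.3 deg C
print("CRPS:", crps_ensemble(ensemble, observation))
print("RCRV:", rcrv(ensemble, observation, obs_err=0.3))
```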
Deterministic ion beam material adding technology for high-precision optical surfaces.
Liao, Wenlin; Dai, Yifan; Xie, Xuhui; Zhou, Lin
2013-02-20
Although ion beam figuring (IBF) provides a highly deterministic method for the precision figuring of optical components, several problems still need to be addressed, such as the limited correcting capability for mid-to-high spatial frequency surface errors and low machining efficiency for pit defects on surfaces. We propose a figuring method named deterministic ion beam material adding (IBA) technology to solve those problems in IBF. The current deterministic optical figuring mechanism, which is dedicated to removing local protuberances on optical surfaces, is enriched and developed by the IBA technology. Compared with IBF, this method can realize the uniform convergence of surface errors, where the particle transferring effect generated in the IBA process can effectively correct the mid-to-high spatial frequency errors. In addition, IBA can rapidly correct the pit defects on the surface and greatly improve the machining efficiency of the figuring process. The verification experiments are accomplished on our experimental installation to validate the feasibility of the IBA method. First, a fused silica sample with a rectangular pit defect is figured by using IBA. Through two iterations within only 47.5 min, this highly steep pit is effectively corrected, and the surface error is improved from the original 24.69 nm root mean square (RMS) to the final 3.68 nm RMS. Then another experiment is carried out to demonstrate the correcting capability of IBA for mid-to-high spatial frequency surface errors, and the final results indicate that the surface accuracy and surface quality can be simultaneously improved.
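Both IBF and IBA ultimately reduce to a dwell-time problem: the material removed (or added) is the convolution of a beam function with the dwell time. The 1-D sketch below fills a rectangular pit by regularized FFT deconvolution and reports the residual RMS; the Gaussian beam, the regularization, and all numbers are assumptions of this illustration, not the authors' algorithm or experimental values.

```python
import numpy as np

# 1-D illustration of the dwell-time problem: added material = beam (*) dwell.
n, dx = 512, 0.1                         # grid points, mm
x = (np.arange(n) - n // 2) * dx
beam = np.exp(-x ** 2 / (2 * 1.0 ** 2))  # Gaussian add-function (arbitrary units)
beam /= beam.sum() * dx                  # normalize to unit area

# Target: fill a rectangular pit 30 nm deep and 4 mm wide.
target_add = np.where(np.abs(x) < 2.0, 30.0, 0.0)

B = np.fft.fft(np.fft.ifftshift(beam)) * dx
T = np.fft.fft(target_add)
alpha = 1e-3                             # Tikhonov regularization strength
dwell = np.real(np.fft.ifft(T * np.conj(B) / (np.abs(B) ** 2 + alpha)))
dwell = np.clip(dwell, 0.0, None)        # dwell times must be non-negative

achieved = np.real(np.fft.ifft(np.fft.fft(dwell) * B))
residual = target_add - achieved
print("residual RMS [nm]:", np.sqrt(np.mean(residual ** 2)))
```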
Noise annoyance through railway traffic - a case study
2014-01-01
This paper describes an assessment of noise caused by railway traffic in a large Latin American city. Measurements were taken of noise levels generated by trains passing through residential neighborhoods with and without blowing their horns. Noise maps were also calculated, showing the noise pollution generated by the train traffic. In addition, the annoyance of residents affected by railway noise was evaluated based on interviews. The measurements indicated that the noise levels generated by the passage of trains with horns blowing are extremely high, clearly exceeding the daytime limit for the equivalent sound pressure level, Leq = 55 dB(A), established by municipal law No. 10.625 of the city of Curitiba. The night-time limit of Leq = 45 dB(A) is also exceeded during the passage of trains. The residents reported feeling affected by the noise generated by passing trains, which causes irritability, headaches, poor concentration and insomnia, and 88% of them claimed that nocturnal noise pollution is the most distressing. This study showed that the majority of the residents surveyed (69%) believe that the noise of the train can devalue their property. PMID:24401735
Automated Weight-Window Generation for Threat Detection Applications Using ADVANTG
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mosher, Scott W; Miller, Thomas Martin; Evans, Thomas M
2009-01-01
Deterministic transport codes have been used for some time to generate weight-window parameters that can improve the efficiency of Monte Carlo simulations. As the use of this hybrid computational technique is becoming more widespread, the scope of applications in which it is being applied is expanding. An active source of new applications is the field of homeland security--particularly the detection of nuclear material threats. For these problems, automated hybrid methods offer an efficient alternative to trial-and-error variance reduction techniques (e.g., geometry splitting or the stochastic weight window generator). The ADVANTG code has been developed to automate the generation of weight-window parameters for MCNP using the Consistent Adjoint Driven Importance Sampling method and employs the TORT or Denovo 3-D discrete ordinates codes to generate importance maps. In this paper, we describe the application of ADVANTG to a set of threat-detection simulations. We present numerical results for an 'active-interrogation' problem in which a standard cargo container is irradiated by a deuterium-tritium fusion neutron generator. We also present results for two passive detection problems in which a cargo container holding a shielded neutron or gamma source is placed near a portal monitor. For the passive detection problems, ADVANTG obtains an O(10^4) speedup and, for a detailed gamma spectrum tally, an average O(10^2) speedup relative to implicit-capture-only simulations, including the deterministic calculation time. For the active-interrogation problem, an O(10^4) speedup is obtained when compared to a simulation with angular source biasing and crude geometry splitting.
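The Consistent Adjoint Driven Importance Sampling relations behind such weight windows are compact: given an adjoint (importance) function φ†, the estimated response is R = Σ q·φ†, the biased source is q·φ†/R, and weight-window centers are set to R/φ†. The toy sketch below shows that arithmetic only; the adjoint values, cell layout, and window-ratio convention are placeholders, not output of TORT/Denovo or ADVANTG.

```python
import numpy as np

# Toy illustration of the CADIS relations used to build weight windows from a
# deterministic adjoint solution (phi_adj would come from the 3-D SN solver).
q = np.array([0.0, 0.0, 1.0, 0.0, 0.0])            # forward source by cell
phi_adj = np.array([1e-4, 1e-3, 1e-2, 1e-1, 1.0])  # adjoint flux (importance) by cell

R = np.sum(q * phi_adj)                  # estimated detector response
q_biased = q * phi_adj / R               # CADIS-biased source distribution
ww_center = R / phi_adj                  # weight-window target weights by cell

# A common convention places the lower bound a fixed ratio below the center.
ratio = 5.0
ww_lower = 2.0 * ww_center / (ratio + 1.0)

for cell in range(len(q)):
    print(f"cell {cell}: center={ww_center[cell]:.3e}  lower={ww_lower[cell]:.3e}")
```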
DOT National Transportation Integrated Search
1993-10-01
This document describes the Concept of Operations and Generic System Requirements for : the next generation of Traffic Management Centers (TMC). Four major steps comprise the : development of this Concept of Operations. The first step was to survey t...
Towards Realistic Urban Traffic Experiments Using DFROUTER: Heuristic, Validation and Extensions.
Zambrano-Martinez, Jorge Luis; Calafate, Carlos T; Soler, David; Cano, Juan-Carlos
2017-12-15
Traffic congestion is an important problem faced by Intelligent Transportation Systems (ITS), requiring models that allow predicting the impact of different solutions on urban traffic flow. Such an approach typically requires the use of simulations, which should be as realistic as possible. However, achieving high degrees of realism can be complex when the actual traffic patterns, defined through an Origin/Destination (O-D) matrix for the vehicles in a city, remain unknown. Thus, the main contribution of this paper is a heuristic for improving traffic congestion modeling. In particular, we propose a procedure that, starting from real induction loop measurements made available by traffic authorities, iteratively refines the output of DFROUTER, which is a module provided by the SUMO (Simulation of Urban MObility) tool. This way, it is able to generate an O-D matrix for traffic that resembles the real traffic distribution and that can be directly imported by SUMO. We apply our technique to the city of Valencia, and we then compare the obtained results against other existing traffic mobility data for the cities of Cologne (Germany) and Bologna (Italy), thereby validating our approach. We also use our technique to determine what degree of congestion is expectable if certain conditions cause additional traffic to circulate in the city, adopting both a uniform pattern and a hotspot-based pattern for traffic injection to demonstrate how to regulate the overall number of vehicles in the city. This study allows evaluating the impact of vehicle flow changes on the overall traffic congestion levels.
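The refinement idea above can be illustrated with a toy iterative-scaling loop: assign the current O-D flows to links, compare the implied link counts with induction-loop measurements, and scale each O-D pair by the average measured/simulated ratio over the links it uses. The fixed route-usage matrix and counts below are hypothetical; the paper's procedure iterates DFROUTER/SUMO assignments rather than this linear mapping.

```python
import numpy as np

# Toy O-D refinement against induction-loop counts (hypothetical network).
usage = np.array([[1, 0, 1],     # link 0 is used by O-D pairs 0 and 2
                  [0, 1, 1],     # link 1 is used by O-D pairs 1 and 2
                  [1, 1, 0]])    # link 2 is used by O-D pairs 0 and 1
measured = np.array([900.0, 700.0, 800.0])   # loop counts per link [veh/h]
od = np.full(3, 300.0)                       # initial guess for O-D flows

for it in range(50):
    simulated = usage @ od                   # link counts implied by current O-D
    link_ratio = measured / np.maximum(simulated, 1e-9)
    # each O-D pair is scaled by the mean ratio over the links it traverses
    od *= (usage.T @ link_ratio) / usage.sum(axis=0)
    if np.max(np.abs(simulated - measured)) < 1.0:
        break

print("O-D flows:", np.round(od, 1))
print("link counts:", np.round(usage @ od, 1), "vs measured", measured)
```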
Airfreight forecasting methodology and results
NASA Technical Reports Server (NTRS)
1978-01-01
A series of econometric behavioral equations was developed to explain and forecast the evolution of airfreight traffic demand for the total U.S. domestic airfreight system, the total U.S. international airfreight system, and the total scheduled international cargo traffic carried by the top 44 foreign airlines. The basic explanatory variables used in these macromodels were the real gross national products of the countries involved and a measure of relative transportation costs. The results of the econometric analysis reveal that the models explain more than 99 percent of the historical evolution of freight traffic. The long term traffic forecasts generated with these models are based on scenarios of the likely economic outlook in the United States and 31 major foreign countries.
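The basic behavioral equation described above can be sketched as a log-log regression of freight traffic on real GNP and relative transportation cost, fitted by ordinary least squares and then driven by a growth scenario. The data below are synthetic and the elasticities are placeholders, not the study's estimates.

```python
import numpy as np

# Sketch of the macromodel's basic form:
#   ln(freight traffic) = a + b*ln(real GNP) + c*ln(relative transport cost)
rng = np.random.default_rng(0)
years = 20
gnp = 1000 * (1.03 ** np.arange(years)) * rng.lognormal(0, 0.01, years)
cost = 1.0 * (0.99 ** np.arange(years)) * rng.lognormal(0, 0.02, years)
traffic = np.exp(0.5 + 2.0 * np.log(gnp) - 1.2 * np.log(cost)
                 + rng.normal(0, 0.02, years))          # synthetic history

X = np.column_stack([np.ones(years), np.log(gnp), np.log(cost)])
coef, *_ = np.linalg.lstsq(X, np.log(traffic), rcond=None)
a, b, c = coef
print(f"intercept={a:.2f}  GNP elasticity={b:.2f}  cost elasticity={c:.2f}")

# Long-term forecast under a scenario of 3%/yr GNP growth and constant costs.
gnp_future = gnp[-1] * 1.03 ** np.arange(1, 11)
forecast = np.exp(a + b * np.log(gnp_future) + c * np.log(cost[-1]))
print(forecast.round(0))
```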
Facility requirements for cockpit traffic display research
NASA Technical Reports Server (NTRS)
Chappell, S. L.; Kreifeldt, J. G.
1982-01-01
It is pointed out that much research is being conducted regarding the use of a cockpit display of traffic information (CDTI) for safe and efficient air traffic flow. A CDTI is a graphic display which shows the pilot the position of other aircraft relative to his or her aircraft. The present investigation is concerned with the facility requirements for the CDTI research. The facilities currently used for this research vary in fidelity from one CDTI-equipped simulator with computer-generated traffic, to four simulators with autopilot-like controls, all having a CDTI. Three groups of subjects were employed in the conducted study. Each of the groups included one controller, and three airline and four general aviation pilots.
Spontaneous density fluctuations in granular flow and traffic
NASA Astrophysics Data System (ADS)
Herrmann, Hans J.
It is known that spontaneous density waves appear in granular material flowing through pipes or hoppers. A similar phenomenon is known from traffic jams on highways. Using numerical simulations we show that several types of waves exist and find that the density fluctuations follow a power law spectrum. We also investigate one-dimensional traffic models. If positions and velocities are continuous variables the model shows self-organized criticality driven by the slowest car. Lattice gas and lattice Boltzmann models reproduce the experimentally observed effects. Density waves are spontaneously generated when the viscosity has a non-linear dependence on density or shear rate as it is the case in traffic or granular flow.
Using random forests to diagnose aviation turbulence.
Williams, John K
Atmospheric turbulence poses a significant hazard to aviation, with severe encounters costing airlines millions of dollars per year in compensation, aircraft damage, and delays due to required post-event inspections and repairs. Moreover, attempts to avoid turbulent airspace cause flight delays and en route deviations that increase air traffic controller workload, disrupt schedules of air crews and passengers and use extra fuel. For these reasons, the Federal Aviation Administration and the National Aeronautics and Space Administration have funded the development of automated turbulence detection, diagnosis and forecasting products. This paper describes a methodology for fusing data from diverse sources and producing a real-time diagnosis of turbulence associated with thunderstorms, a significant cause of weather delays and turbulence encounters that is not well-addressed by current turbulence forecasts. The data fusion algorithm is trained using a retrospective dataset that includes objective turbulence reports from commercial aircraft and collocated predictor data. It is evaluated on an independent test set using several performance metrics including receiver operating characteristic curves, which are used for FAA turbulence product evaluations prior to their deployment. A prototype implementation fuses data from Doppler radar, geostationary satellites, a lightning detection network and a numerical weather prediction model to produce deterministic and probabilistic turbulence assessments suitable for use by air traffic managers, dispatchers and pilots. The algorithm is scheduled to be operationally implemented at the National Weather Service's Aviation Weather Center in 2014.
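The pattern described above (fit a random forest on collocated predictors, then evaluate the probabilistic output with ROC curves) can be sketched with scikit-learn. The synthetic features below merely stand in for the radar, satellite, lightning, and NWP predictors; none of the data or skill numbers correspond to the operational algorithm.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for collocated predictors and objective turbulence reports.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=(n, 6))
logit = 1.5 * X[:, 0] - 1.0 * X[:, 2] + 0.5 * X[:, 4]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)

# Probabilistic output -> ROC curve, the kind of metric cited for FAA evaluations.
p = rf.predict_proba(X_test)[:, 1]
fpr, tpr, thresholds = roc_curve(y_test, p)
print("AUC:", round(roc_auc_score(y_test, p), 3))
```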
Improvement and empirical research on chaos control by theory of "chaos + chaos = order".
Fulai, Wang
2012-12-01
This paper focuses on advancing the understanding of Parrondian effects and their paradoxical behavior in nonlinear dynamical systems. Examples are given to show that a dynamics obtained by combining two or more discrete chaotic dynamics in a deterministic manner can give rise to order. The chaotic maps in our study are more general than those in the current literature as far as "chaos + chaos = order" is concerned, and some problems left open in the current literature are solved. It is proved both theoretically and numerically that, given any m chaotic dynamics generated by the one-dimensional real Mandelbrot maps, it is not possible to obtain a periodic system when all m chaotic dynamics are alternated in a random manner, but that for any integer m (m ≥ 2) a dynamics combined in a deterministic manner from m Mandelbrot chaotic dynamics can be found that gives rise to a periodic dynamics of period m. Numerical and mathematical analysis proves that the paradoxical phenomenon of "chaos + chaos = order" also exists in dynamics generated by non-Mandelbrot maps.
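The deterministic alternation of real Mandelbrot maps x -> x^2 + c described above can be explored numerically with a sketch like the one below; the parameter values and the crude periodicity check are illustrative assumptions, and the particular combination shown is not claimed to be one of the paper's order-producing examples.

```python
import numpy as np

def mandelbrot_map(x, c):
    """One step of the real Mandelbrot map x_{n+1} = x_n^2 + c."""
    return x * x + c

def alternate(x0, cs, pattern, n_steps):
    """Iterate several Mandelbrot maps in a fixed deterministic pattern."""
    x, orbit = x0, []
    for n in range(n_steps):
        x = mandelbrot_map(x, cs[pattern[n % len(pattern)]])
        orbit.append(x)
    return np.array(orbit)

def apparent_period(orbit, tol=1e-9, max_period=64):
    """Crude check for an eventually periodic tail of the orbit."""
    tail = orbit[-256:]
    for p in range(1, max_period + 1):
        if np.all(np.abs(tail[p:] - tail[:-p]) < tol):
            return p
    return None

# Illustrative parameters: both c values lie in the chaotic band of the
# individual maps and keep the alternated orbit bounded in [-1.8, 1.8].
cs = [-1.45, -1.48]
pattern = [0, 1]            # apply map 0, then map 1, repeatedly
orbit = alternate(0.1, cs, pattern, 10_000)
print("apparent period of combined dynamics:", apparent_period(orbit))
```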
Design of Center-TRACON Automation System
NASA Technical Reports Server (NTRS)
Erzberger, Heinz; Davis, Thomas J.; Green, Steven
1993-01-01
A system for the automated management and control of terminal area traffic, referred to as the Center-TRACON Automation System (CTAS), is being developed at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver area and Dallas/Ft. Worth area air traffic control facilities. This paper reviews the CTAS architecture and automation functions, as well as the integration of CTAS into the existing operational system. CTAS consists of three types of integrated tools that provide computer-generated advisories for both en-route and terminal area controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), generates runway assignments, landing sequences, and landing times for all arriving aircraft, including those originating from nearby feeder airports. TMA also assists in runway configuration control and flow management. Another tool, the Descent Advisor (DA), generates clearances for the en-route controllers handling arrival flows to metering gates. The DA's clearances ensure fuel-efficient and conflict-free descents to the metering gates at specified crossing times. In the terminal area, the Final Approach Spacing Tool (FAST) provides heading and speed advisories that help controllers produce an accurately spaced flow of aircraft on the final approach course. Databases consisting of several hundred aircraft performance models, airline preferred operational procedures, and a three-dimensional wind model support the operation of CTAS. The first component of CTAS, the Traffic Management Advisor, is being evaluated at the Denver TRACON and the Denver Air Route Traffic Control Center. The second component, the Final Approach Spacing Tool, will be evaluated in several stages at the Dallas/Fort Worth Airport beginning in October 1993. An initial stage of the Descent Advisor tool is being prepared for testing at the Denver Center in late 1994. Operational evaluations of all three integrated CTAS tools are expected to begin at the two field sites in 1995.
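As a simplified illustration of the kind of arrival scheduling a tool like TMA performs (sequencing aircraft and assigning landing times subject to a minimum separation), the following sketch uses a greedy first-come-first-served rule; the logic and numbers are illustrative and not the actual CTAS algorithms.

```python
from dataclasses import dataclass

@dataclass
class Arrival:
    flight_id: str
    eta: float  # estimated time of arrival at the runway, in minutes

def schedule_landings(arrivals, min_separation=1.5):
    """Greedy first-come-first-served landing-time assignment.

    Aircraft are sequenced by ETA and each is assigned the earliest landing
    time that respects the minimum time separation behind the previous one.
    """
    sequence = sorted(arrivals, key=lambda a: a.eta)
    schedule, last_time = [], None
    for a in sequence:
        slot = a.eta if last_time is None else max(a.eta, last_time + min_separation)
        schedule.append((a.flight_id, slot, slot - a.eta))  # (flight, STA, delay)
        last_time = slot
    return schedule

# Illustrative traffic sample (flight, ETA in minutes past the hour)
arrivals = [Arrival("AAL101", 12.0), Arrival("UAL202", 12.4),
            Arrival("N123GA", 13.0), Arrival("DAL303", 16.2)]
for flight, sta, delay in schedule_landings(arrivals):
    print(f"{flight}: scheduled {sta:5.1f} min (delay {delay:.1f} min)")
```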
Additivity Principle in High-Dimensional Deterministic Systems
NASA Astrophysics Data System (ADS)
Saito, Keiji; Dhar, Abhishek
2011-12-01
The additivity principle (AP), conjectured by Bodineau and Derrida [Phys. Rev. Lett. 92, 180601 (2004)], is discussed for the case of heat conduction in three-dimensional disordered harmonic lattices to consider the effects of deterministic dynamics, higher dimensionality, and different transport regimes, i.e., ballistic, diffusive, and anomalous transport. The cumulant generating function (CGF) for heat transfer is accurately calculated and compared with the one given by the AP. In the diffusive regime, we find a clear agreement with the conjecture even if the system is high dimensional. Surprisingly, even in the anomalous regime the CGF is also well fitted by the AP. Lower-dimensional systems are also studied and the importance of three-dimensionality for the validity is stressed.
Multi-dimensional photonic states from a quantum dot
NASA Astrophysics Data System (ADS)
Lee, J. P.; Bennett, A. J.; Stevenson, R. M.; Ellis, D. J. P.; Farrer, I.; Ritchie, D. A.; Shields, A. J.
2018-04-01
Quantum states superposed across multiple particles or degrees of freedom offer an advantage in the development of quantum technologies. Creating these states deterministically and with high efficiency is an ongoing challenge. A promising approach is the repeated excitation of multi-level quantum emitters, which have been shown to naturally generate light with quantum statistics. Here we describe how to create one class of higher-dimensional quantum state, a so-called W-state, which is superposed across multiple time bins. We do this by repeated Raman scattering of photons from a charged quantum dot in a pillar microcavity. We show this method can be scaled to larger dimensions with no reduction in coherence or single-photon character. We explain how to extend this work to enable the deterministic creation of arbitrary time-bin encoded qudits.
Deterministic nonclassicality for quantum-mechanical oscillators in thermal states
NASA Astrophysics Data System (ADS)
Marek, Petr; Lachman, Lukáš; Slodička, Lukáš; Filip, Radim
2016-07-01
Quantum nonclassicality is the basic building block for the vast majority of quantum information applications, and methods for its generation are at the forefront of research. One of the obstacles any method needs to clear is the looming presence of decoherence and noise, which act against the nonclassicality and often erase it completely. In this paper we show that nonclassical states of a quantum harmonic oscillator initially in a thermal equilibrium state can be deterministically created by coupling it to a single two-level system. This can be achieved even in the absorption regime, in which the two-level system is initially in the ground state. The method is resilient to noise and may actually benefit from it, as witnessed by systems with higher thermal energy producing more nonclassical states.
Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas
2015-09-01
Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties in system operation, stability, and reliability in smart grids. In this paper, nonparametric neural-network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single-level PI, wind power forecast uncertainties are represented as a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. The wind power scenarios are then incorporated into a stochastic security-constrained unit commitment (SCUC) model. A heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporating interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and thus decreases the risk in system operations of smart grids.
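A minimal sketch of the scenario-generation step (fit an empirical CDF to the quantile points, then Monte Carlo sample by inverse transform) is shown below; the quantile values are synthetic and the implementation is an assumption, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical wind power quantiles for one hour, decomposed from a list of
# prediction intervals (e.g., 10%-90% PIs). Units: MW. Values are synthetic.
probs = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 0.90, 0.95])
quantiles = np.array([12.0, 15.0, 22.0, 30.0, 39.0, 47.0, 52.0])

def sample_scenarios(probs, quantiles, n_scenarios):
    """Inverse-transform Monte Carlo sampling from the empirical CDF that
    interpolates the (probability, quantile) points."""
    u = rng.uniform(probs[0], probs[-1], size=n_scenarios)
    return np.interp(u, probs, quantiles)

scenarios = sample_scenarios(probs, quantiles, n_scenarios=1000)
print("scenario mean %.1f MW, std %.1f MW" % (scenarios.mean(), scenarios.std()))
# These wind power scenarios would then feed the stochastic SCUC model.
```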
Li, Wei; Wu, Jun
2014-01-01
Objectives. We assessed how traffic and mobile-source air pollution impacts are distributed across racial/ethnic and socioeconomically diverse groups in port-adjacent communities in southern Los Angeles County, which may experience divergent levels of exposure to port-related heavy-duty diesel truck traffic because of existing residential and land use patterns. Methods. We used spatial regression techniques to assess the association of neighborhood racial/ethnic and socioeconomic composition with residential parcel-level traffic and vehicle-related fine particulate matter exposure after accounting for built environment and land use factors. Results. After controlling for factors associated with traffic generation, we found that a higher percentage of nearby Black and Asian/Pacific Islander residents was associated with higher exposure, a higher percentage of Hispanic residents was associated with higher traffic exposure but lower vehicle particulate matter exposure, and areas with lower socioeconomic status experienced lower exposure. Conclusions. Disparities in traffic and vehicle particulate matter exposure are nuanced depending on the exposure metric used, the distribution of the traffic and emissions, and pollutant dispersal patterns. Future comparative research is needed to assess potential disparities in other transportation and goods movement corridors. PMID:23678919
Air Traffic Management Research at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Lee, Katharine
2005-01-01
Since the late 1980's, NASA Ames researchers have been investigating ways to improve the air transportation system through the development of decision support automation. These software advances, such as the Center-TRACON Automation System (CTAS), have been developed with teams of engineers, software developers, human factors experts, and air traffic controllers; some NASA Ames decision support tools are currently operational in Federal Aviation Administration (FAA) facilities and some are in use by the airlines. These tools have provided air traffic controllers and traffic managers the capabilities to help reduce overall delays and holding, and provide significant cost savings to the airlines as well as more manageable workload levels for air traffic service providers. NASA is continuing to collaborate with the FAA, as well as other government agencies, to plan and develop the next generation of decision support tools that will support anticipated changes in the air transportation system, including a projected increase to three times today's air-traffic levels by 2025. The presentation will review some of NASA Ames' recent achievements in air traffic management research, and discuss future tool developments and concepts currently under consideration.
Imaging Vesicular Traffic at the Immune Synapse.
Bouchet, Jérôme; Del Río-Iñiguez, Iratxe; Alcover, Andrés
2017-01-01
Immunological synapse formation is the result of a profound T cell polarization process that involves the coordinated action of the actin and microtubule cytoskeleton, as well as intracellular vesicle traffic. Endosomal vesicle traffic ensures the targeting of the T cell receptor (TCR) and various signaling molecules to the synapse, being necessary for the generation of signaling complexes downstream of the TCR. Here we describe the microscopy imaging methods that we currently use to unveil how TCR and signaling molecules are associated with endosomal compartments and deliver their cargo to the immunological synapse.
Intelligent Traffic Quantification System
NASA Astrophysics Data System (ADS)
Mohanty, Anita; Bhanja, Urmila; Mahapatra, Sudipta
2017-08-01
Traffic monitoring and control are currently major challenges in almost all cities worldwide. Vehicular ad hoc network (VANET) techniques are an efficient tool for mitigating this problem. Usually, different types of on-board sensors installed in vehicles generate messages characterized by different vehicle parameters. In this work, an intelligent system based on a fuzzy clustering technique is developed to reduce the number of individual messages by extracting important features from the messages of a vehicle. The proposed fuzzy clustering technique therefore reduces the traffic load on the network; it also reduces and quantifies congestion.
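A minimal fuzzy c-means sketch of the clustering step is shown below, assuming hypothetical on-board sensor messages; the feature set and parameters are illustrative, not the paper's exact system.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centres and membership matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, n_clusters))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(n_iter):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # distances from every sample to every centre
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Hypothetical on-board sensor messages: [speed (km/h), headway (m), braking rate]
rng = np.random.default_rng(1)
free_flow = rng.normal([60, 40, 0.1], [5, 8, 0.05], size=(100, 3))
congested = rng.normal([12, 6, 0.8], [4, 2, 0.2], size=(100, 3))
messages = np.vstack([free_flow, congested])

centres, U = fuzzy_c_means(messages, n_clusters=2)
print("cluster centres (speed, headway, braking):\n", centres.round(2))
# Transmitting the centres and aggregate memberships in place of the raw
# messages is the kind of load reduction described above.
```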
Deterministic modelling and stochastic simulation of biochemical pathways using MATLAB.
Ullah, M; Schmidt, H; Cho, K H; Wolkenhauer, O
2006-03-01
The analysis of complex biochemical networks is conducted within two popular conceptual frameworks for modelling. The deterministic approach requires the solution of ordinary differential equations (ODEs, reaction rate equations) with concentrations as continuous state variables. The stochastic approach involves the simulation of differential-difference equations (chemical master equations, CMEs) with probabilities as variables, in order to generate counts of molecules for chemical species as realisations of random variables drawn from the probability distribution described by the CMEs. Although there are numerous tools available, many of them free, the modelling and simulation environment MATLAB is widely used in the physical and engineering sciences. We describe a collection of MATLAB functions to construct and solve ODEs for deterministic simulation and to implement realisations of CMEs for stochastic simulation using advanced MATLAB coding (Release 14). The program was successfully applied to pathway models from the literature for both cases. The results were compared to implementations using alternative tools for dynamic modelling and simulation of biochemical networks. The aim is to provide a concise set of MATLAB functions that encourage experimentation with systems biology models. All the script files are available from www.sbi.uni-rostock.de/publications_matlab-paper.html.
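A sketch contrasting the two frameworks on a simple reversible isomerisation A <-> B is given below, in Python rather than MATLAB; the rate constants are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Reversible isomerisation A <-> B with rate constants k1 (A->B) and k2 (B->A).
k1, k2 = 0.5, 0.3
A0, B0 = 100, 0

# Deterministic framework: reaction rate ODEs for the species amounts
def rhs(t, y):
    A, B = y
    return [-k1 * A + k2 * B, k1 * A - k2 * B]

ode = solve_ivp(rhs, (0.0, 20.0), [A0, B0])

# Stochastic framework: Gillespie's direct method realises the CME
def gillespie(t_end, seed=0):
    rng = np.random.default_rng(seed)
    t, A, B = 0.0, A0, B0
    times, states = [t], [(A, B)]
    while t < t_end:
        a1, a2 = k1 * A, k2 * B          # propensities
        a_total = a1 + a2
        if a_total == 0:
            break
        t += rng.exponential(1.0 / a_total)
        if rng.random() < a1 / a_total:  # A -> B fires
            A, B = A - 1, B + 1
        else:                            # B -> A fires
            A, B = A + 1, B - 1
        times.append(t)
        states.append((A, B))
    return np.array(times), np.array(states)

times, states = gillespie(20.0)
print("deterministic A(20):", ode.y[0, -1].round(2),
      " stochastic A(20):", states[-1, 0])
```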
Computation of a Canadian SCWR unit cell with deterministic and Monte Carlo codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrisson, G.; Marleau, G.
2012-07-01
The Canadian SCWR has the potential to achieve the goals that the generation IV nuclear reactors must meet. As part of the optimization process for this design concept, lattice cell calculations are routinely performed using deterministic codes. In this study, the first step (self-shielding treatment) of the computation scheme developed with the deterministic code DRAGON for the Canadian SCWR has been validated. Some options available in the module responsible for the resonance self-shielding calculation in DRAGON 3.06 and different microscopic cross section libraries based on the ENDF/B-VII.0 evaluated nuclear data file have been tested and compared to a reference calculation performed with the Monte Carlo code SERPENT under the same conditions. Compared to SERPENT, DRAGON underestimates the infinite multiplication factor in all cases. In general, the original Stammler model with the Livolant-Jeanpierre approximations is the most appropriate self-shielding option to use in this case study. In addition, the 89-group WIMS-AECL library for slightly enriched uranium and the 172-group WLUP library for a mixture of plutonium and thorium give the results most consistent with those of SERPENT. (authors)
Accessing the dark exciton spin in deterministic quantum-dot microlenses
NASA Astrophysics Data System (ADS)
Heindel, Tobias; Thoma, Alexander; Schwartz, Ido; Schmidgall, Emma R.; Gantz, Liron; Cogan, Dan; Strauß, Max; Schnauber, Peter; Gschrey, Manuel; Schulze, Jan-Hindrik; Strittmatter, Andre; Rodt, Sven; Gershoni, David; Reitzenstein, Stephan
2017-12-01
The dark exciton state in semiconductor quantum dots (QDs) constitutes a long-lived solid-state qubit which has the potential to play an important role in implementations of solid-state-based quantum information architectures. In this work, we exploit deterministically fabricated QD microlenses which promise enhanced photon extraction, to optically prepare and read out the dark exciton spin and observe its coherent precession. The optical access to the dark exciton is provided via spin-blockaded metastable biexciton states acting as heralding states, which are identified by deploying polarization-sensitive spectroscopy as well as time-resolved photon cross-correlation experiments. Our experiments reveal a spin-precession period of the dark exciton of (0.82 ± 0.01) ns corresponding to a fine-structure splitting of (5.0 ± 0.7) μeV between its eigenstates |↑ ⇑ ±↓ ⇓ ⟩. By exploiting microlenses deterministically fabricated above pre-selected QDs, our work demonstrates the possibility to scale up implementations of quantum information processing schemes using the QD-confined dark exciton spin qubit, such as the generation of photonic cluster states or the realization of a solid-state-based quantum memory.
Design Flexibility for Uncertain Distributed Generation from Photovoltaics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Krishnamurthy, Dheepak; Wu, Hongyu
2016-12-12
Uncertainty in the future adoption patterns for distributed energy resources (DERs) introduces a challenge for electric distribution system planning. This paper explores the potential for flexibility in design - also known as real options - to identify design solutions that may never emerge when future DER patterns are treated as deterministic. A test case for storage system design with uncertain distributed generation for solar photovoltaics (DGPV) demonstrates this approach and is used to study sensitivities to a range of techno-economic assumptions.
NASA Astrophysics Data System (ADS)
Ibarra Espinosa, S.; Ynoue, R.; Giannotti, M., , Dr
2017-12-01
The importance of emissions inventories for air quality studies and environmental planning has been shown at local, regional (REAS), hemispheric (CLRTAP), and global (IPCC) scales. It has also been shown that vehicles are becoming the most important sources in urban centers. Several efforts have been made to model vehicular emissions and obtain more accurate emission factors, based on Vehicle Specific Power (VSP) in IVE and MOVES, on average speed in MOBILE, VERSIT, and COPERT, or on traffic situations in ARTEMIS and HBEFA. However, little effort has been made to improve traffic activity data. In this study we propose a novel approach to developing a vehicular emissions inventory that includes point data from MAPLINK, a company that feeds traffic data to Google. This involves working with and transforming massive amounts of data to generate traffic flows and speeds. The region of study is southeastern Brazil, including the São Paulo metropolitan area. To estimate vehicular emissions we use the open source model VEIN, available at https://CRAN.R-project.org/package=vein. We generated hourly traffic between 2010-04-21 and 2010-10-22, totaling 145 hours. The data consist of GPS readings from vehicles with insurance policies, applications, and other sources. This type of data presents spatial bias, meaning that only a portion of the vehicles are tracked. We corrected this bias by using the calculated speed as a proxy for traffic flow, based on measurements of traffic flow and speed per lane made in São Paulo. We then calibrated the total traffic by estimating fuel consumption with VEIN and comparing it with fuel sales for the region. We estimated the hourly vehicular emissions and produced emission maps and databases. In addition, we ran atmospheric simulations with WRF-Chem to identify which inventory produces better agreement with air pollutant observations. New technologies and big data provide opportunities to improve vehicular emissions inventories.
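A hedged sketch of the two correction steps described above (spatial-bias correction of GPS-derived flows using a flow-speed relation from detector data, then a single-factor calibration so that modelled fuel use matches fuel sales) is shown below in Python rather than the R package VEIN; all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Step 1: correct the spatial bias of GPS-tracked traffic using speed as a
# proxy for flow. Fit a flow-speed relation on links that have both detector
# counts and speeds (synthetic example data).
detector_speed = np.array([15, 25, 35, 45, 55, 65, 75], dtype=float)      # km/h
detector_flow = np.array([1800, 1650, 1400, 1150, 900, 700, 500], float)  # veh/h
coeffs = np.polyfit(detector_speed, detector_flow, deg=2)

gps_speeds = rng.uniform(10, 80, size=1000)        # speeds on un-instrumented links
estimated_flow = np.polyval(coeffs, gps_speeds)    # flow inferred from speed

# Step 2: calibrate total activity so that modelled fuel consumption matches
# regional fuel sales (a single scaling factor; values are illustrative).
fuel_rate = 0.08            # litres per veh-km, hypothetical fleet average
link_length_km = 1.0
modelled_fuel = (estimated_flow * link_length_km * fuel_rate).sum()
fuel_sales = 1.15 * modelled_fuel                  # stand-in for the sales statistic
calibrated_flow = estimated_flow * (fuel_sales / modelled_fuel)

print("calibration factor:", round(fuel_sales / modelled_fuel, 3))
```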
An efficient method to detect periodic behavior in botnet traffic by analyzing control plane traffic
AsSadhan, Basil; Moura, José M.F.
2013-01-01
Botnets are large networks of bots (compromised machines) that are under the control of a small number of bot masters. They pose a significant threat to the Internet's communications and applications. A botnet relies on command and control (C2) communications traffic between its members for its attack execution. C2 traffic occurs prior to any attack; hence, detecting a botnet's C2 traffic enables the detection of members of the botnet before any real harm happens. We analyze C2 traffic and find that it exhibits a periodic behavior. This is due to the pre-programmed behavior of bots that check for updates every T seconds in order to download them. We exploit this periodic behavior to detect C2 traffic. The detection involves evaluating the periodogram of the monitored traffic and then applying Walker's large-sample test to the periodogram's maximum ordinate to determine whether it is due to a periodic component. If the periodogram of the monitored traffic contains a periodic component, then it is highly likely to be due to a bot's C2 traffic. The test looks only at aggregate control plane traffic behavior, which makes it more scalable than techniques that involve deep packet inspection (DPI) or tracking the communication flows of different hosts. We apply the test to two types of botnets, tinyP2P and IRC, generated by SLINGbot. We verify the periodic behavior of their C2 traffic and compare it to the results we obtain on real traffic from a secured enterprise network. We further study the characteristics of the test in the presence of injected HTTP background traffic and the effect of the duty cycle on the periodic behavior. PMID:25685512
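A sketch of the detection idea (periodogram plus a large-sample test on its maximum ordinate) is shown below; the normalization and approximation follow standard textbook forms and the traffic series is synthetic, so this is an illustration rather than the paper's implementation.

```python
import numpy as np

def periodogram(x):
    """Standard periodogram at the Fourier frequencies (excluding zero)."""
    n = len(x)
    x = x - x.mean()
    I = (np.abs(np.fft.rfft(x)) ** 2) / n
    return I[1:]                      # drop the zero frequency

def walker_test(x):
    """Large-sample test on the maximum periodogram ordinate.

    Under Gaussian white noise the standardized ordinates are approximately
    Exp(1), so P(max > z) ~= 1 - (1 - exp(-z))**m for m ordinates.
    """
    I = periodogram(x)
    z = I.max() / I.mean()            # standardize by the average ordinate
    m = len(I)
    p_value = 1.0 - (1.0 - np.exp(-z)) ** m
    return z, p_value

rng = np.random.default_rng(0)
t = np.arange(3600)                                   # one hour of per-second counts
background = rng.poisson(20, size=t.size).astype(float)
c2_like = background + 8 * (t % 60 == 0)              # bots phoning home every T=60 s

for name, series in [("background only", background), ("with periodic C2", c2_like)]:
    z, p = walker_test(series)
    print(f"{name}: max standardized ordinate = {z:.1f}, p ~= {p:.3g}")
```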
Traffic Aware Planner for Cockpit-Based Trajectory Optimization
NASA Technical Reports Server (NTRS)
Woods, Sharon E.; Vivona, Robert A.; Henderson, Jeffrey; Wing, David J.; Burke, Kelly A.
2016-01-01
The Traffic Aware Planner (TAP) software application is a cockpit-based advisory tool designed to be hosted on an Electronic Flight Bag and to enable and test the NASA concept of Traffic Aware Strategic Aircrew Requests (TASAR). The TASAR concept provides pilots with optimized route changes (including altitude) that reduce fuel burn and/or flight time, avoid interactions with known traffic, weather and restricted airspace, and may be used by the pilots to request a route and/or altitude change from Air Traffic Control. Developed using an iterative process, TAP's latest improvements include human-machine interface design upgrades and added functionality based on the results of human-in-the-loop simulation experiments and flight trials. Architectural improvements have been implemented to prepare the system for operational-use trials with partner commercial airlines. Future iterations will enhance coordination with airline dispatch and add functionality to improve the acceptability of TAP-generated route-change requests to pilots, dispatchers, and air traffic controllers.
Earthquake mechanism and seafloor deformation for tsunami generation
Geist, Eric L.; Oglesby, David D.; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan
2014-01-01
Tsunamis are generated in the ocean by rapidly displacing the entire water column over a significant area. The potential energy resulting from this disturbance is balanced with the kinetic energy of the waves during propagation. Only a handful of submarine geologic phenomena can generate tsunamis: large-magnitude earthquakes, large landslides, and volcanic processes. Asteroid and subaerial landslide impacts can generate tsunami waves from above the water. Earthquakes are by far the most common generator of tsunamis. Generally, earthquakes greater than magnitude (M) 6.5–7 can generate tsunamis if they occur beneath an ocean and if they result in predominantly vertical displacement. One of the greatest uncertainties in both deterministic and probabilistic hazard assessments of tsunamis is computing seafloor deformation for earthquakes of a given magnitude.
The role of vegetation in mitigating air quality impacts from traffic emissions
R. Baldauf; L. Jackson; G. Hagler; I. Vlad; G. McPherson; D. Nowak; T. Cahill; M. Zhang; R. Cook; C. Bailey; P. Wood
2011-01-01
In April 2010, a multidisciplinary group of researchers and policy-makers met to discuss the state-of-the-science regarding the potential of roadside vegetation to mitigate near-road air quality impacts. Concerns over population exposures to traffic-generated pollutants near roads have grown with an increasing number of health studies reporting links between proximity...
25 CFR 170.411 - What may a long-range transportation plan include?
Code of Federal Regulations, 2012 CFR
2012-04-01
...; (b) Trip generation studies, including determination of traffic generators due to land use; (c) Social and economic development planning to identify transportation improvements or needs to accommodate...
25 CFR 170.411 - What may a long-range transportation plan include?
Code of Federal Regulations, 2011 CFR
2011-04-01
...; (b) Trip generation studies, including determination of traffic generators due to land use; (c) Social and economic development planning to identify transportation improvements or needs to accommodate...
25 CFR 170.411 - What may a long-range transportation plan include?
Code of Federal Regulations, 2013 CFR
2013-04-01
...; (b) Trip generation studies, including determination of traffic generators due to land use; (c) Social and economic development planning to identify transportation improvements or needs to accommodate...
25 CFR 170.411 - What may a long-range transportation plan include?
Code of Federal Regulations, 2014 CFR
2014-04-01
...; (b) Trip generation studies, including determination of traffic generators due to land use; (c) Social and economic development planning to identify transportation improvements or needs to accommodate...
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent process, or a decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment of the relation between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory. These allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module. The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates its position within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate the current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of the deterministic calculations using multi-physics analyses and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment.
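A greatly simplified sketch of the deterministic multi-attribute step (scoring candidate control options by their margins to trip setpoints through utility functions) is shown below; the setpoints, weights, option names, and projected states are hypothetical, not PRISM data.

```python
# Hypothetical trip setpoints and weights for two monitored parameters.
SETPOINTS = {"reactor_outlet_temp_C": 550.0, "steam_drum_level_m": 2.5}
LIMIT_TYPE = {"reactor_outlet_temp_C": "high", "steam_drum_level_m": "low"}
WEIGHTS = {"reactor_outlet_temp_C": 0.6, "steam_drum_level_m": 0.4}
NOMINAL = {"reactor_outlet_temp_C": 510.0, "steam_drum_level_m": 3.2}

def margin(param, value):
    """Fractional margin to the trip setpoint (1 = at nominal, 0 = at trip)."""
    sp, nom = SETPOINTS[param], NOMINAL[param]
    if LIMIT_TYPE[param] == "high":
        return max(0.0, (sp - value) / (sp - nom))
    return max(0.0, (value - sp) / (nom - sp))

def utility(projected_state):
    """Weighted sum of per-parameter margins for a projected plant state."""
    return sum(WEIGHTS[p] * margin(p, v) for p, v in projected_state.items())

# Candidate control options with their projected plant states (illustrative).
options = {
    "reduce_power_5pct": {"reactor_outlet_temp_C": 515.0, "steam_drum_level_m": 3.1},
    "increase_feedwater": {"reactor_outlet_temp_C": 522.0, "steam_drum_level_m": 3.4},
    "no_action":          {"reactor_outlet_temp_C": 541.0, "steam_drum_level_m": 2.8},
}

ranked = sorted(options, key=lambda name: utility(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: utility = {utility(options[name]):.2f}")
```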
Pian, Chengnan
2017-09-01
Chinese and Japanese university students exchanged opinions on the topic "making a mobile phone call in the bus". Both sides of the communication can achieve changes of cognition in different ways. This paper focuses on the Chinese university students and analyzes their cognition of traffic etiquette in Japan and China. Unlike the Japanese university students' change of cognition, the Chinese university students made more negative evaluations of Japanese traffic etiquette after the communication. However, this does not mean that they simply shield their own traffic etiquette; they show two-way changes of cognition in both social etiquette and personal behavior. These changes may be related to the unbalanced dialogue relationship, as well as to the generation of hot issues. How to generate such hot issues and promote a two-way movement of understanding are important clues for the design of a communication curriculum to enhance cultural understanding.
Stacking the odds for Golgi cisternal maturation
Mani, Somya; Thattai, Mukund
2016-01-01
What is the minimal set of cell-biological ingredients needed to generate a Golgi apparatus? The compositions of eukaryotic organelles arise through a process of molecular exchange via vesicle traffic. Here we statistically sample tens of thousands of homeostatic vesicle traffic networks generated by realistic molecular rules governing vesicle budding and fusion. Remarkably, the plurality of these networks contain chains of compartments that undergo creation, compositional maturation, and dissipation, coupled by molecular recycling along retrograde vesicles. This motif precisely matches the cisternal maturation model of the Golgi, which was developed to explain many observed aspects of the eukaryotic secretory pathway. In our analysis cisternal maturation is a robust consequence of vesicle traffic homeostasis, independent of the underlying details of molecular interactions or spatial stacking. This architecture may have been exapted rather than selected for its role in the secretion of large cargo. DOI: http://dx.doi.org/10.7554/eLife.16231.001 PMID:27542195
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter
2016-04-01
Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than those of thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risk of power production deficits during droughts and to safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown its benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from ECMWF, are used as forcing for the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of actual forecasts with shorter lead times of up to 15 days shows the practical benefit of actual operational data. It appears that the use of stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.
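A greatly simplified, deterministic sketch of the release-optimization step over a forecast horizon (maximize release for power subject to storage and downstream flood constraints) is shown below as a linear program; all numbers are hypothetical and the actual scheme is a full model predictive control coupled to a hydrological model.

```python
import numpy as np
from scipy.optimize import linprog

# Forecast horizon of T daily steps with a hypothetical inflow forecast (hm3/day)
inflow = np.array([40.0, 55.0, 90.0, 120.0, 80.0, 50.0, 45.0])
T = inflow.size

s0, s_min, s_max = 900.0, 600.0, 1100.0   # storage (hm3), illustrative limits
r_max = 100.0                             # turbine + spill capacity (hm3/day)
q_flood = 95.0                            # max safe release for the downstream city

# Storage follows s_t = s0 + cumsum(inflow - r)_t; L is a lower-triangular
# matrix of ones so that (L @ r)_t is the cumulative release up to step t.
L = np.tril(np.ones((T, T)))
cum_inflow = np.cumsum(inflow)

# Maximize total release (energy proxy) => minimize -sum(r)
c = -np.ones(T)
A_ub = np.vstack([-L, L])                               # enforce s <= s_max, s >= s_min
b_ub = np.concatenate([s_max - s0 - cum_inflow,         # -L r <= s_max - s0 - cumI
                       s0 + cum_inflow - s_min])        #  L r <= s0 + cumI - s_min
bounds = [(0.0, min(r_max, q_flood))] * T

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
releases = res.x
storage = s0 + cum_inflow - L @ releases
print("releases:", releases.round(1))
print("storage trajectory:", storage.round(1))
```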
NASA Astrophysics Data System (ADS)
González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.
2017-12-01
The definition of plausible worst-case tsunamigenic scenarios plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about the critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves, and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17°S to 24°S. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model, and published interseismic coupling (ISC) distributions. As a result, we find four zones of higher slip deficit, interpreted as major seismic asperities of the gap, which are used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes ranging from Mw 8.4 to Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All the scenarios are simulated with the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five major coastal cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high resolution data on inundation depth, runup, coastal currents, and sea level elevation. The probabilistic kinematic tsunamigenic scenarios yield more realistic slip patterns, similar to the maximum slip amounts of major past earthquakes. For all studied sites, the location of peak slip and shelf resonance exert a first-order control on the resulting coastal inundation depths.
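A minimal one-dimensional sketch of Karhunen-Loève-based stochastic slip generation (eigendecomposition of an assumed exponential correlation, Gaussian coefficients) is shown below; the grid, correlation length, and slip statistics are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D fault discretised into n along-strike cells (a real case is 2-D)
n = 200
x = np.linspace(0.0, 400.0, n)          # km along strike
corr_len = 60.0                          # hypothetical correlation length (km)
mean_slip = 8.0                          # hypothetical mean slip (m)
sigma = 4.0                              # slip standard deviation (m)

# Exponential correlation matrix and its Karhunen-Loeve (eigen) decomposition
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(C)
eigval = np.clip(eigval, 0.0, None)      # guard against tiny negative eigenvalues

def kl_slip_scenario(n_modes=50):
    """Draw one stochastic slip distribution from the truncated KL expansion."""
    idx = np.argsort(eigval)[::-1][:n_modes]          # leading modes
    z = rng.standard_normal(n_modes)
    slip = mean_slip + eigvec[:, idx] @ (np.sqrt(eigval[idx]) * z)
    return np.clip(slip, 0.0, None)                   # no negative slip

scenarios = np.array([kl_slip_scenario() for _ in range(400)])
print("scenario slip range: %.1f to %.1f m" % (scenarios.min(), scenarios.max()))
```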
Towards Realistic Urban Traffic Experiments Using DFROUTER: Heuristic, Validation and Extensions
2017-01-01
Traffic congestion is an important problem faced by Intelligent Transportation Systems (ITS), requiring models that allow predicting the impact of different solutions on urban traffic flow. Such an approach typically requires the use of simulations, which should be as realistic as possible. However, achieving high degrees of realism can be complex when the actual traffic patterns, defined through an Origin/Destination (O-D) matrix for the vehicles in a city, remain unknown. Thus, the main contribution of this paper is a heuristic for improving traffic congestion modeling. In particular, we propose a procedure that, starting from real induction loop measurements made available by traffic authorities, iteratively refines the output of DFROUTER, which is a module provided by the SUMO (Simulation of Urban MObility) tool. This way, it is able to generate an O-D matrix for traffic that resembles the real traffic distribution and that can be directly imported by SUMO. We apply our technique to the city of Valencia, and we then compare the obtained results against other existing traffic mobility data for the cities of Cologne (Germany) and Bologna (Italy), thereby validating our approach. We also use our technique to determine what degree of congestion is expectable if certain conditions cause additional traffic to circulate in the city, adopting both a uniform pattern and a hotspot-based pattern for traffic injection to demonstrate how to regulate the overall number of vehicles in the city. This study allows evaluating the impact of vehicle flow changes on the overall traffic congestion levels. PMID:29244762
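A generic iterative-scaling stand-in for the refinement loop (adjust route flows so that simulated detector counts approach the real induction-loop counts) is sketched below; it ignores the SUMO/DFROUTER specifics and is not the paper's exact heuristic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Incidence matrix A[d, r] = 1 if route r passes detector d (synthetic network)
n_detectors, n_routes = 6, 15
A = (rng.random((n_detectors, n_routes)) < 0.3).astype(float)

true_flows = rng.uniform(50, 500, size=n_routes)      # unknown "real" O-D flows
target_counts = A @ true_flows                        # induction-loop measurements

flows = np.full(n_routes, 200.0)                      # initial DFROUTER-like guess
for _ in range(200):
    simulated = A @ flows + 1e-9
    # scale each route flow by the geometric mean of the count ratios on the
    # detectors it crosses (a simple multiplicative correction)
    ratios = target_counts / simulated
    for r in range(n_routes):
        hit = A[:, r] > 0
        if hit.any():
            flows[r] *= np.exp(np.log(ratios[hit]).mean())

error = np.abs(A @ flows - target_counts).max()
print("max detector-count error after refinement: %.2f veh" % error)
```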
Scattering effects of machined optical surfaces
NASA Astrophysics Data System (ADS)
Thompson, Anita Kotha
1998-09-01
Optical fabrication is one of the most labor-intensive industries in existence. Lensmakers use pitch to affix glass blanks to metal chucks that hold the glass as they grind it with tools that have not changed much in fifty years. Recent demands placed on traditional optical fabrication processes in terms of surface accuracy, smoothness, and cost effectiveness have resulted in the exploitation of precision machining technology to develop a new generation of computer numerically controlled (CNC) optical fabrication equipment. This new kind of precision machining process is called deterministic microgrinding. The most conspicuous feature of optical surfaces manufactured by precision machining processes (such as single-point diamond turning or deterministic microgrinding) is the presence of residual cutting tool marks. These residual tool marks exhibit a highly structured topography of periodic azimuthal or radial deterministic marks in addition to random microroughness. These distinct topographic features give rise to surface scattering effects that can significantly degrade optical performance. In this dissertation project we investigate the scattering behavior of machined optical surfaces and their imaging characteristics. In particular, we characterize the residual optical fabrication errors and relate the resulting scattering behavior to the tool and machine parameters in order to evaluate and improve the deterministic microgrinding process. Other desired information derived from the investigation of scattering behavior is the set of optical fabrication tolerances necessary to satisfy specific image quality requirements. Optical fabrication tolerances are a major cost driver for any precision optical manufacturing technology. The derivation and control of the optical fabrication tolerances necessary for different applications and operating wavelength regimes will play a unique and central role in establishing deterministic microgrinding as a preferred and cost-effective optical fabrication process. Other well understood optical fabrication processes are also reviewed, and a performance comparison with the conventional grinding and polishing technique is made to determine any inherent advantages in the optical quality of surfaces produced by other techniques.
Experimental system for computer network via satellite /CS/. III - Network control processor
NASA Astrophysics Data System (ADS)
Kakinuma, Y.; Ito, A.; Takahashi, H.; Uchida, K.; Matsumoto, K.; Mitsudome, H.
1982-03-01
The network control processor (NCP) has the functions of generating traffic, controlling links, and controlling the transmission of bursts. The NCP executes protocols, monitors experiments, and gathers and compiles measurement data; its programs are loaded on a minicomputer (MELCOM 70/40) with 512 KB of memory. In the experiment, the NCP acts as a traffic generator in place of a host computer. For this purpose, 15 simulated stations are realized in software at each user station. This paper describes the configuration of the NCP and the implementation of the protocols for the experimental system.
An Initial Study of Airport Arrival Capacity Benefits Due to Improved Scheduling Accuracy
NASA Technical Reports Server (NTRS)
Meyn, Larry; Erzberger, Heinz
2005-01-01
The long-term growth rate in air-traffic demand leads to future air-traffic densities that are unmanageable by today's air-traffic control system. In order to accommodate such growth, new technology and operational methods will be needed in the next generation air-traffic control system. One proposal for such a system is the Automated Airspace Concept (AAC). One of the precepts of AAC is to direct aircraft using trajectories that are sent via an air-ground data link. This greatly improves the accuracy in directing aircraft to specific waypoints at specific times. Studies of the Center-TRACON Automation System (CTAS) have shown that increased scheduling accuracy enables increased arrival capacity at CTAS-equipped airports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hedstrom, Gerald; Beck, Bret; Mattoon, Caleb
2016-10-01
Merced performs a multi-dimensional integral to generate so-called 'transfer matrices' for use in deterministic radiation transport applications. It produces transfer matrices on the user-defined energy grid. The angular dependence of outgoing products is captured in a Legendre expansion, up to a user-specified maximum Legendre order. Merced calculations can use multi-threading for enhanced performance on a single compute node.
Scientific and Technological Progress, Political Beliefs and Environmental Sustainability
ERIC Educational Resources Information Center
Makrakis, Vassilios
2012-01-01
With the development of science and technology, a basically optimistic ideology of progress has emerged. This deterministic attitude has been challenged in recent decades as a result of harmful side-effects generated by the way technology and science have been approached and used. The study presented here is a part of a larger international and…
Dual Roles for Spike Signaling in Cortical Neural Populations
Ballard, Dana H.; Jehee, Janneke F. M.
2011-01-01
A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ frequency range multi-cell action potential correlations, together with spike timing dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models such as orientation tuning and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding. PMID:21687798
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biondo, Elliott D; Ibrahim, Ahmad M; Mosher, Scott W
2015-01-01
Detailed radiation transport calculations are necessary for many aspects of the design of fusion energy systems (FES), such as ensuring occupational safety, assessing the activation of system components for waste disposal, and maintaining cryogenic temperatures within superconducting magnets. Hybrid Monte Carlo (MC)/deterministic techniques are necessary for this analysis because FES are large, heavily shielded, and contain streaming paths that can only be resolved with MC. The tremendous complexity of FES necessitates the use of CAD geometry for design and analysis. Previous ITER analysis has required the translation of CAD geometry to MCNP5 form in order to use the AutomateD VAriaNce reducTion Generator (ADVANTG) for hybrid MC/deterministic transport. In this work, ADVANTG was modified to support CAD geometry, allowing hybrid MC/deterministic transport to be done automatically and eliminating the need for this translation step. This was done by adding a new ray tracing routine to ADVANTG for CAD geometries using the Direct Accelerated Geometry Monte Carlo (DAGMC) software library. This new capability is demonstrated with a prompt dose rate calculation for an ITER computational benchmark problem using both the Consistent Adjoint Driven Importance Sampling (CADIS) method and the Forward-Weighted (FW)-CADIS method. The variance reduction parameters produced by ADVANTG are shown to be the same using CAD geometry and standard MCNP5 geometry. Significant speedups were observed for both neutrons (as high as a factor of 7.1) and photons (as high as a factor of 59.6).
Scoping analysis of the Advanced Test Reactor using SN2ND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolters, E.; Smith, M.; SC)
2012-07-26
A detailed set of calculations was carried out for the Advanced Test Reactor (ATR) using the SN2ND solver of the UNIC code, which is part of the SHARP multi-physics code being developed under the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program in DOE-NE. The primary motivation of this work is to assess whether high fidelity deterministic transport codes can tackle coupled dynamics simulations of the ATR. The successful use of such codes in a coupled dynamics simulation can impact what experiments are performed and what power levels are permitted during those experiments at the ATR. The advantages of the SN2ND solver over comparable neutronics tools are its superior parallel performance and demonstrated accuracy on large scale homogeneous and heterogeneous reactor geometries. However, it should be noted that virtually no effort from this project was spent constructing a proper cross section generation methodology for the ATR usable in the SN2ND solver. While attempts were made to use cross section data derived from SCALE, only the minimal number of compositional cross section sets were generated to be consistent with the reference Monte Carlo input specification. The accuracy of any deterministic transport solver is impacted by such an approach and clearly it causes substantial errors in this work. The reasoning behind this decision is justified given the overall funding dedicated to the task (two months) and the real focus of the work: can modern deterministic tools actually treat complex facilities like the ATR with heterogeneous geometry modeling? SN2ND has been demonstrated to solve problems with upwards of one trillion degrees of freedom, which translates to tens of millions of finite elements, hundreds of angles, and hundreds of energy groups, resulting in a very high-fidelity model of the system unachievable by most deterministic transport codes today. A space-angle convergence study was conducted to determine the meshing and angular cubature requirements for the ATR, and also to demonstrate the feasibility of performing this analysis with a deterministic transport code capable of modeling heterogeneous geometries. The work performed indicates that a minimum of 260,000 linear finite elements combined with an L3T11 cubature (96 angles on the sphere) is required for both eigenvalue and flux convergence of the ATR. A critical finding was that the fuel meat and water channels must each be meshed with at least 3 'radial zones' for accurate flux convergence. A small number of 3D calculations were also performed to show axial mesh and eigenvalue convergence for a full core problem. Finally, a brief analysis was performed with different cross section sets generated from DRAGON and SCALE, and the findings show that more effort will be required to improve the multigroup cross section generation process. The total number of degrees of freedom for a converged 27-group, 2D ATR problem is approximately 340 million. This number increases to approximately 25 billion for a 3D ATR problem. This scoping study shows that both 2D and 3D calculations are well within the capabilities of the current SN2ND solver, given the availability of a large-scale computing center such as BlueGene/P. However, dynamics calculations are not realistic without the implementation of improvements in the solver.
MC2-3 / DIF3D Analysis for the ZPPR-15 Doppler and Sodium Void Worth Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Micheal A.; Lell, Richard M.; Lee, Changho
This manuscript covers validation efforts for our deterministic codes at Argonne National Laboratory. The experimental results come from the ZPPR-15 work in 1985-1986, which was focused on the accuracy of physics data for the integral fast reactor concept. Results for six loadings are studied in this document, focusing on Doppler sample worths and sodium void worths. The ZPPR-15 loadings are modeled using the MC2-3/DIF3D codes developed and maintained at ANL and the MCNP code from LANL. The deterministic models are generated by processing the as-built geometry information, i.e. the MCNP input, and generating MC2-3 cross section generation instructions and a drawer-homogenized equivalence problem. The Doppler reactivity worth measurements are small heated samples which insert very small amounts of reactivity into the system (< 2 pcm). The results generated by the MC2-3/DIF3D codes were excellent for ZPPR-15A and ZPPR-15B and good for ZPPR-15D, compared to the MCNP solutions. In all cases, notable improvements were made over the analysis techniques applied to the same problems in 1987. The sodium void worths from MC2-3/DIF3D were quite good at 37.5 pcm, while the MCNP result was 33 pcm and the measured result was 31.5 pcm. Copyright © (2015) by the American Nuclear Society. All rights reserved.
DOT National Transportation Integrated Search
2001-01-01
The increased importance of truck activity in both transportation engineering and planning : has created a need for truck-oriented analytical tools. A particular planning need is for trip : generation data that can be used to estimate truck traffic p...
Visualization of Traffic Accidents
NASA Technical Reports Server (NTRS)
Wang, Jie; Shen, Yuzhong; Khattak, Asad
2010-01-01
Traffic accidents have tremendous impact on society. Annually approximately 6.4 million vehicle accidents are reported by police in the US and nearly half of them result in catastrophic injuries. Visualizations of traffic accidents using geographic information systems (GIS) greatly facilitate handling and analysis of traffic accidents in many aspects. Environmental Systems Research Institute (ESRI), Inc. is the world leader in GIS research and development. ArcGIS, a software package developed by ESRI, has the capabilities to display events associated with a road network, such as accident locations, and pavement quality. But when event locations related to a road network are processed, the existing algorithm used by ArcGIS does not utilize all the information related to the routes of the road network and produces erroneous visualization results of event locations. This software bug causes serious problems for applications in which accurate location information is critical for emergency responses, such as traffic accidents. This paper aims to address this problem and proposes an improved method that utilizes all relevant information of traffic accidents, namely, route number, direction, and mile post, and extracts correct event locations for accurate traffic accident visualization and analysis. The proposed method generates a new shape file for traffic accidents and displays them on top of the existing road network in ArcGIS. Visualization of traffic accidents along Hampton Roads Bridge Tunnel is included to demonstrate the effectiveness of the proposed method.
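A sketch of the linear-referencing step (locating an event on a route polyline from its milepost by interpolating along cumulative distance) is shown below with a toy polyline; direction handling and map projections are omitted.

```python
import numpy as np

def locate_on_route(route_xy, milepost, miles_per_unit=1.0):
    """Interpolate the (x, y) point that lies `milepost` route-miles from the
    start of the polyline `route_xy` (an (n, 2) array of vertices)."""
    route_xy = np.asarray(route_xy, dtype=float)
    seg_len = np.hypot(*np.diff(route_xy, axis=0).T) * miles_per_unit
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])
    d = np.clip(milepost, 0.0, cum[-1])
    i = np.searchsorted(cum, d, side="right") - 1
    i = min(i, len(seg_len) - 1)
    frac = (d - cum[i]) / seg_len[i] if seg_len[i] > 0 else 0.0
    return route_xy[i] + frac * (route_xy[i + 1] - route_xy[i])

# Toy route polyline (coordinates already in miles for simplicity)
route = [(0.0, 0.0), (2.0, 0.0), (2.0, 3.0), (5.0, 3.0)]
accident = {"route": "I-64", "direction": "EB", "milepost": 4.5}
print(accident["route"], locate_on_route(route, accident["milepost"]))
```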
El Niño-Southern Oscillation frequency cascade
Stuecker, Malte F.; Jin, Fei -Fei; Timmermann, Axel
2015-10-19
The El Niño-Southern Oscillation (ENSO) phenomenon, the most pronounced feature of internally generated climate variability, occurs on interannual timescales and impacts the global climate system through an interaction with the annual cycle. The tight coupling between ENSO and the annual cycle is particularly pronounced over the tropical Western Pacific. In this paper, we show that this nonlinear interaction results in a frequency cascade in the atmospheric circulation, which is characterized by deterministic high-frequency variability on near-annual and subannual timescales. Finally, through climate model experiments and observational analysis, it is documented that a substantial fraction of the anomalous Northwest Pacific anticyclone variability, which is the main atmospheric link between ENSO and the East Asian Monsoon system, can be explained by these interactions and is thus deterministic and potentially predictable.
Current fluctuations in periodically driven systems
NASA Astrophysics Data System (ADS)
Barato, Andre C.; Chetrite, Raphael
2018-05-01
Small nonequilibrium systems driven by an external periodic protocol can be described by Markov processes with time-periodic transition rates. In general, current fluctuations in such small systems are large and may play a crucial role. We develop a theoretical formalism to evaluate the rate of such large deviations in periodically driven systems. We show that the scaled cumulant generating function that characterizes current fluctuations is given by a maximal Floquet exponent. Comparing deterministic protocols with stochastic protocols, we show that, with respect to large deviations, systems driven by a stochastic protocol with an infinitely large number of jumps are equivalent to systems driven by deterministic protocols. Our results are illustrated with three case studies: a two-state model for a heat engine, a three-state model for a molecular pump, and a biased random walk with a time-periodic affinity.
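A sketch of the formalism for a two-state Markov process with time-periodic rates is shown below: build the tilted generator for the counted current, form the monodromy matrix over one period, and take the scaled cumulant generating function as the maximal Floquet exponent; the rate amplitudes are illustrative.

```python
import numpy as np
from scipy.linalg import expm

# Two-state Markov process with time-periodic rates; the counted current is
# the net number of 0 -> 1 jumps. Rate amplitudes are illustrative.
T = 1.0                      # period of the driving protocol
def rates(t):
    k01 = 2.0 + 1.5 * np.sin(2 * np.pi * t / T)   # rate 0 -> 1
    k10 = 1.0 + 0.5 * np.cos(2 * np.pi * t / T)   # rate 1 -> 0
    return k01, k10

def tilted_generator(t, s):
    """Generator with 0->1 jumps biased by e^{s} and 1->0 jumps by e^{-s}."""
    k01, k10 = rates(t)
    return np.array([[-k01,             k10 * np.exp(-s)],
                     [ k01 * np.exp(s), -k10            ]])

def scgf(s, n_steps=2000):
    """Scaled cumulant generating function = maximal Floquet exponent of the
    tilted generator, i.e. log(spectral radius of the monodromy matrix)/T."""
    dt = T / n_steps
    M = np.eye(2)
    for k in range(n_steps):
        M = expm(tilted_generator((k + 0.5) * dt, s) * dt) @ M
    return np.log(np.max(np.abs(np.linalg.eigvals(M)))) / T

# lambda(0) = 0 by conservation of probability; lambda'(0) gives the mean current
eps = 1e-4
print("lambda(0)    =", round(scgf(0.0), 6))
print("mean current =", round((scgf(eps) - scgf(-eps)) / (2 * eps), 4))
```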
Deterministic Coupling of Quantum Emitters in 2D Materials to Plasmonic Nanocavity Arrays.
Tran, Toan Trong; Wang, Danqing; Xu, Zai-Quan; Yang, Ankun; Toth, Milos; Odom, Teri W; Aharonovich, Igor
2017-04-12
Quantum emitters in two-dimensional materials are promising candidates for studies of light-matter interaction and next generation, integrated on-chip quantum nanophotonics. However, the realization of integrated nanophotonic systems requires the coupling of emitters to optical cavities and resonators. In this work, we demonstrate hybrid systems in which quantum emitters in 2D hexagonal boron nitride (hBN) are deterministically coupled to high-quality plasmonic nanocavity arrays. The plasmonic nanoparticle arrays offer a high-quality, low-loss cavity in the same spectral range as the quantum emitters in hBN. The coupled emitters exhibit enhanced emission rates and reduced fluorescence lifetimes, consistent with Purcell enhancement in the weak coupling regime. Our results provide the foundation for a versatile approach for achieving scalable, integrated hybrid systems based on low-loss plasmonic nanoparticle arrays and 2D materials.
NASA Astrophysics Data System (ADS)
Adya Zizwan, Putra; Zarlis, Muhammad; Budhiarti Nababan, Erna
2017-12-01
The determination of centroids in the K-Means algorithm directly affects the quality of the clustering results, and determining centroids with random numbers has many weaknesses. The GenClust algorithm, which combines Genetic Algorithms and K-Means, uses a genetic algorithm to determine the centroid of each cluster; 50% of its chromosomes are obtained through deterministic calculations and 50% are generated from random numbers. This study modifies the GenClust algorithm so that 100% of the chromosomes are obtained through deterministic calculations. The results are performance comparisons, expressed as Mean Square Error, of centroid determination for K-Means using the original GenClust method, the modified GenClust method, and classic K-Means.
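The abstract does not spell out how the 100% deterministic chromosomes are constructed, so the sketch below only illustrates the kind of comparison described: K-Means Mean Square Error under random centroid initialization versus a simple deterministic seeding (feature-wise quantiles here, a stand-in assumption rather than the GenClust rule).

```python
import numpy as np

def kmeans(X, centroids, n_iter=100):
    """Plain Lloyd's algorithm; returns final centroids and labels."""
    for _ in range(n_iter):
        # Assign each point to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster empties.
        for k in range(len(centroids)):
            if np.any(labels == k):
                centroids[k] = X[labels == k].mean(axis=0)
    return centroids, labels

def mse(X, centroids, labels):
    """Mean squared distance of points to their assigned centroid."""
    return np.mean(np.sum((X - centroids[labels]) ** 2, axis=1))

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(200, 2)) for m in (0, 3, 6)])
k = 3

# Random initialization: k points drawn at random from the data.
rand_init = X[rng.choice(len(X), k, replace=False)].copy()

# Deterministic initialization (illustrative stand-in for GenClust's
# deterministic chromosomes): centroids placed at feature-wise quantiles.
det_init = np.quantile(X, np.linspace(0.1, 0.9, k), axis=0)

for name, init in [("random", rand_init), ("deterministic", det_init)]:
    c, lab = kmeans(X, init.copy())
    print(f"{name:14s} MSE = {mse(X, c, lab):.4f}")
```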
Hong-Ou-Mandel Interference between Two Deterministic Collective Excitations in an Atomic Ensemble
NASA Astrophysics Data System (ADS)
Li, Jun; Zhou, Ming-Ti; Jing, Bo; Wang, Xu-Jie; Yang, Sheng-Jun; Jiang, Xiao; Mølmer, Klaus; Bao, Xiao-Hui; Pan, Jian-Wei
2016-10-01
We demonstrate deterministic generation of two distinct collective excitations in one atomic ensemble, and we realize the Hong-Ou-Mandel interference between them. Using Rydberg blockade we create single collective excitations in two different Zeeman levels, and we use stimulated Raman transitions to perform a beam-splitter operation between the excited atomic modes. By converting the atomic excitations into photons, the two-excitation interference is measured by photon coincidence detection with a visibility of 0.89(6). The Hong-Ou-Mandel interference witnesses an entangled NOON state of the collective atomic excitations, and we demonstrate its twofold enhanced sensitivity to a magnetic field compared with a single excitation. Our work implements a minimal instance of boson sampling and paves the way for further multimode and multiexcitation studies with collective excitations of atomic ensembles.
Automated Conflict Resolution For Air Traffic Control
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
2005-01-01
The ability to detect and resolve conflicts automatically is considered to be an essential requirement for the next generation air traffic control system. While systems for automated conflict detection have been used operationally by controllers for more than 20 years, automated resolution systems have so far not reached the level of maturity required for operational deployment. Analytical models and algorithms for automated resolution have yet to be demonstrated under realistic traffic conditions to show that they can handle the complete spectrum of conflict situations encountered in actual operations. The resolution algorithm described in this paper was formulated to meet the performance requirements of the Automated Airspace Concept (AAC). The AAC, which was described in a recent paper [1], is a candidate for the next generation air traffic control system. The AAC's performance objectives are to increase safety and airspace capacity and to accommodate user preferences in flight operations to the greatest extent possible. In the AAC, resolution trajectories are generated by an automation system on the ground and sent to the aircraft autonomously via data link. The algorithm generating the trajectories must take into account the performance characteristics of the aircraft and the route structure of the airway system, and it must be capable of resolving all types of conflicts for properly equipped aircraft without requiring supervision and approval by a controller. Furthermore, the resolution trajectories should be compatible with the clearances, vectors and flight plan amendments that controllers customarily issue to pilots in resolving conflicts. The algorithm described herein, although formulated specifically to meet the needs of the AAC, provides a generic engine for resolving conflicts. Thus, it can be incorporated into any operational concept that requires a method for automated resolution, including concepts for autonomous air-to-air resolution.
Neo-deterministic seismic hazard scenarios for India—a preventive tool for disaster mitigation
NASA Astrophysics Data System (ADS)
Parvez, Imtiyaz A.; Magrin, Andrea; Vaccari, Franco; Ashish; Mir, Ramees R.; Peresan, Antonella; Panza, Giuliano Francesco
2017-11-01
Current computational resources and physical knowledge of the seismic wave generation and propagation processes allow for reliable numerical and analytical models of waveform generation and propagation. From the simulation of ground motion, it is easy to extract the desired earthquake hazard parameters. Accordingly, a scenario-based approach to seismic hazard assessment has been developed, namely the neo-deterministic seismic hazard assessment (NDSHA), which allows for a wide range of possible seismic sources to be used in the definition of reliable scenarios by means of realistic waveforms modelling. Such reliable and comprehensive characterization of expected earthquake ground motion is essential to improve building codes, particularly for the protection of critical infrastructures and for land use planning. Parvez et al. (Geophys J Int 155:489-508, 2003) published the first ever neo-deterministic seismic hazard map of India by computing synthetic seismograms with input data set consisting of structural models, seismogenic zones, focal mechanisms and earthquake catalogues. As described in Panza et al. (Adv Geophys 53:93-165, 2012), the NDSHA methodology evolved with respect to the original formulation used by Parvez et al. (Geophys J Int 155:489-508, 2003): the computer codes were improved to better fit the need of producing realistic ground shaking maps and ground shaking scenarios, at different scale levels, exploiting the most significant pertinent progresses in data acquisition and modelling. Accordingly, the present study supplies a revised NDSHA map for India. The seismic hazard, expressed in terms of maximum displacement (Dmax), maximum velocity (Vmax) and design ground acceleration (DGA), has been extracted from the synthetic signals and mapped on a regular grid over the studied territory.
NASA Astrophysics Data System (ADS)
Quintero-Chavarria, E.; Ochoa Gutierrez, L. H.
2016-12-01
Applications of the Self-potential Method in the fields of Hydrogeology and Environmental Sciences have seen significant developments during the last two decades, with strong use in the identification of groundwater flows. Although only a few authors deal with the forward problem's solution, especially in the geophysics literature, different inversion procedures are currently being developed; in most cases they are compared with unconventional groundwater velocity fields and restricted to structured meshes. This research solves the forward problem based on the finite element method, using St. Venant's Principle to transform a point dipole, which is the field generated by a single vector, into a distribution of electrical monopoles. Then, two simple aquifer models were generated with specific boundary conditions, and head potentials, velocity fields and electric potentials in the medium were computed. With the model's surface electric potential, the inverse problem is solved to retrieve the source of electric potential (the vector field associated with groundwater flow) using deterministic and stochastic approaches. The first approach was carried out by implementing a Tikhonov regularization with a stabilized operator adapted to the finite element mesh, while for the second a hierarchical Bayesian model based on Markov chain Monte Carlo (McMC) and Markov Random Fields (MRF) was constructed. For all implemented methods, the direct and inverse models were contrasted in two ways: 1) shape and distribution of the vector field, and 2) histogram of magnitudes. Finally, it was concluded that inversion procedures are improved when the velocity field's behavior is considered; thus, the deterministic method is more suitable for unconfined aquifers than confined ones. McMC has restricted applications and requires a lot of information (particularly in potential fields), while MRF has a remarkable response, especially when dealing with confined aquifers.
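As a rough illustration of the deterministic branch, here is a minimal Tikhonov-regularized least-squares inversion on a toy linear forward operator; the paper's operator is adapted to a finite element mesh, so the matrices, regularization operator and noise level below are assumptions for demonstration only.

```python
import numpy as np

def tikhonov_inverse(A, b, L, lam):
    """Solve min_x ||A x - b||^2 + lam * ||L x||^2 via the normal equations."""
    lhs = A.T @ A + lam * (L.T @ L)
    rhs = A.T @ b
    return np.linalg.solve(lhs, rhs)

# Toy forward operator mapping 50 source strengths to 30 surface potentials.
rng = np.random.default_rng(1)
n_src, n_obs = 50, 30
A = rng.normal(size=(n_obs, n_src))
x_true = np.sin(np.linspace(0, np.pi, n_src))        # smooth "source"
b = A @ x_true + 0.05 * rng.normal(size=n_obs)       # noisy surface data

# First-difference regularization operator (smoothness prior).
L = np.eye(n_src) - np.eye(n_src, k=1)

x_hat = tikhonov_inverse(A, b, L, lam=1.0)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```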
Sadiq, Abderrahmane; El Fazziki, Abdelaziz; Ouarzazi, Jamal; Sadgal, Mohamed
2016-01-01
This paper presents an integrated and adaptive problem-solving approach to control on-road air quality by modeling the road infrastructure, managing traffic based on pollution level, and generating recommendations for road users. The aim is to reduce vehicle emissions on the most polluted road segments and to optimize pollution levels. For this we propose the use of historical and real-time pollution records and contextual data to calculate the air quality index on road networks and to generate recommendations for reassigning traffic flow in order to improve on-road air quality. The resulting air quality indexes are used in the system's traffic network generation, in which the cartography is represented by a weighted graph. The weights evolve according to the pollution indexes and path properties, so the graph is dynamic. Furthermore, the system uses the available pollution data and meteorological records to predict on-road pollutant levels with an artificial neural network based prediction model. The proposed approach combines the benefits of multi-agent systems, Big Data technology, machine learning tools and the available data sources. For shortest path searching in the road network, we use the Dijkstra algorithm over the Hadoop MapReduce framework. The use of the Hadoop framework in the data retrieval and analysis process has significantly improved the performance of the proposed system. Also, the agent technology makes it possible to propose a suitable solution in terms of robustness and agility.
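The paper runs Dijkstra over Hadoop MapReduce; the single-machine sketch below shows only the core idea of searching a road graph whose edge weights combine travel time with an air-quality index. The weighting function edge_weight and the toy network are assumptions, not the paper's formulation.

```python
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]}. Returns shortest distances."""
    dist = {source: 0.0}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def edge_weight(travel_time_min, aqi, alpha=0.05):
    """Hypothetical weighting: travel time penalized by the segment's AQI."""
    return travel_time_min * (1.0 + alpha * aqi)

# Tiny road network: segment -> (travel time in minutes, air quality index).
segments = {("A", "B"): (5, 150), ("A", "C"): (7, 40),
            ("C", "B"): (3, 35), ("B", "D"): (4, 60), ("C", "D"): (9, 30)}

graph = {}
for (u, v), (t, aqi) in segments.items():
    graph.setdefault(u, []).append((v, edge_weight(t, aqi)))

print(dijkstra(graph, "A"))   # least-exposure travel "cost" from A
```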
NASA Astrophysics Data System (ADS)
Abidi, Dhafer
TTEthernet is a deterministic network technology that enhances Layer 2 Quality-of-Service (QoS) for Ethernet. The components that implement its services enrich the Ethernet functionality with distributed fault-tolerant synchronization, robust temporal partitioning of bandwidth, and synchronous communication with fixed latency and low jitter. TTEthernet services can facilitate the design of scalable, robust, less complex distributed systems and architectures tolerant to faults. Simulation is nowadays an essential step in the design process of critical systems and represents a valuable support for validation and performance evaluation. CoRE4INET is a project bringing together all TTEthernet simulation models currently available; it is based on the extension of models of the OMNeT++ INET framework. Our objective is to study and simulate the TTEthernet protocol on a flight management subsystem (FMS). The idea is to use CoRE4INET to design the simulation model of the target system. The problem is that CoRE4INET does not offer a task scheduling tool for TTEthernet networks. To overcome this problem, we propose an adaptation, for simulation purposes, of a task scheduling approach based on a formal specification of network constraints. The use of the Yices solver allowed the translation of the formal specification into an executable program to generate the desired transmission plan. A case study finally allowed us to assess the impact of the arrangement of Time-Triggered frame offsets on the performance of each type of traffic in the system.
Quantum cryptography using coherent states: Randomized encryption and key generation
NASA Astrophysics Data System (ADS)
Corndorf, Eric
With the advent of the global optical-telecommunications infrastructure, an increasing number of individuals, companies, and agencies communicate information with one another over public networks or physically-insecure private networks. While the majority of the traffic flowing through these networks requires little or no assurance of secrecy, the same cannot be said for certain communications between banks, between government agencies, within the military, and between corporations. In these arenas, the need to specify some level of secrecy in communications is a high priority. While the current approaches to securing sensitive information (namely the public-key-cryptography infrastructure and deterministic private-key ciphers like AES and 3DES) seem to be cryptographically strong based on empirical evidence, there exist no mathematical proofs of secrecy for any widely deployed cryptosystem. As an example, the ubiquitous public-key cryptosystems infer all of their secrecy from the assumption that factoring of the product of two large primes is necessarily time consuming---something which has not, and perhaps cannot, be proven. Since the 1980s, the possibility of using quantum-mechanical features of light as a physical mechanism for satisfying particular cryptographic objectives has been explored. This research has been fueled by the hopes that cryptosystems based on quantum systems may provide provable levels of secrecy which are at least as valid as quantum mechanics itself. Unfortunately, the most widely considered quantum-cryptographic protocols (BB84 and the Ekert protocol) have serious implementation problems. Specifically, they require quantum-mechanical states which are not readily available, and they rely on unproven relations between intrusion-level detection and the information available to an attacker. As a result, the secrecy level provided by these experimental implementations is entirely unspecified. In an effort to provably satisfy the cryptographic objectives of key generation and direct data-encryption, a new quantum cryptographic principle is demonstrated wherein keyed coherent-state signal sets are employed. Taking advantage of the fundamental and irreducible quantum-measurement noise of coherent states, these schemes do not require the users to measure the influence of an attacker. Experimental key-generation and data encryption schemes based on these techniques, which are compatible with today's WDM fiber-optic telecommunications infrastructure, are implemented and analyzed.
Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)
NASA Astrophysics Data System (ADS)
Kędra, Mariola
2014-02-01
Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministic chaotic? This issue is still controversial. The application of several independent methods, techniques and tools for studying daily river flow data gives consistent, reliable and clear-cut results to the question. The outcomes point out that the investigated discharge dynamics is not random but deterministic. Moreover, the results completely confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge from two selected gauging stations of the mountain river in southern Poland, the Raba River.
Signal Waveform Generator Performance Specifications and Certification Requirements
DOT National Transportation Integrated Search
1998-01-01
This report provides important information for users of the National Highway Traffic Safety Administration (NHTSA) signal waveform generator (SWG) and for those organizations that would perform testing to certify the accuracy of SWG signals. The perf...
Modeling the Environmental Impact of Air Traffic Operations
NASA Technical Reports Server (NTRS)
Chen, Neil
2011-01-01
There is increased interest to understand and mitigate the impacts of air traffic on the climate, since greenhouse gases, nitrogen oxides, and contrails generated by air traffic can have adverse impacts on the climate. The models described in this presentation are useful for quantifying these impacts and for studying alternative environmentally aware operational concepts. These models have been developed by leveraging and building upon existing simulation and optimization techniques developed for the design of efficient traffic flow management strategies. Specific enhancements to the existing simulation and optimization techniques include new models that simulate aircraft fuel flow, emissions and contrails. To ensure that these new models are beneficial to the larger climate research community, the outputs of these new models are compatible with existing global climate modeling tools like the FAA's Aviation Environmental Design Tool.
NASA Astrophysics Data System (ADS)
Contreras, Arturo Javier
This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.
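Since the fluid-sensitivity analysis rests on Biot-Gassmann fluid substitution, a minimal generic implementation of Gassmann's equation is sketched below; the rock and fluid constants are illustrative assumptions, not values from the study.

```python
import numpy as np

def gassmann_ksat(k_dry, k_mineral, k_fluid, phi):
    """Saturated bulk modulus from Gassmann's equation (moduli in GPa)."""
    num = (1.0 - k_dry / k_mineral) ** 2
    den = phi / k_fluid + (1.0 - phi) / k_mineral - k_dry / k_mineral ** 2
    return k_dry + num / den

def vp(k, mu, rho):
    """P-wave velocity in m/s; moduli in GPa, bulk density in g/cc."""
    return np.sqrt((k + 4.0 * mu / 3.0) * 1e9 / (rho * 1000.0))

# Illustrative sandstone values (assumed, not from the dissertation).
k_dry, mu, k_qtz, phi = 6.0, 5.0, 37.0, 0.25
rho_qtz = 2.65

for fluid, (k_fl, rho_fl) in {"brine": (2.8, 1.05), "gas": (0.05, 0.15)}.items():
    k_sat = gassmann_ksat(k_dry, k_qtz, k_fl, phi)
    rho = rho_qtz * (1 - phi) + rho_fl * phi
    print(f"{fluid:6s}: K_sat = {k_sat:5.2f} GPa, Vp = {vp(k_sat, mu, rho):6.0f} m/s")
```

The drop in P-impedance when brine is replaced by gas is the kind of fluid effect that underlies the Class III AVA response described in the abstract.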
Design and evaluation of an air traffic control Final Approach Spacing Tool
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.; Nedell, William
1991-01-01
This paper describes the design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arriving aircraft as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a four-dimensional trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST has been implemented on a high-performance workstation. It can be operated as a stand-alone system in the terminal radar approach control facility or as an element of a system integrated with automation tools in the air route traffic control center. FAST was evaluated by experienced air traffic controllers in a real-time air traffic control simulation. Simulation results summarized in the paper show that the automation tools significantly reduced controller workload and demonstrated a potential for an increase in landing rate.
Research on three-phase traffic flow modeling based on interaction range
NASA Astrophysics Data System (ADS)
Zeng, Jun-Wei; Yang, Xu-Gang; Qian, Yong-Sheng; Wei, Xu-Ting
2017-12-01
On the basis of the multiple velocity difference effect (MVDE) model and under short-range interaction, a new three-phase traffic flow model (S-MVDE) is proposed through careful consideration of the influence of the relationship between the speeds of two adjacent cars on the running state of the rear car. The random slowing rule in the MVDE model is modified in order to emphasize the influence of the interaction between two vehicles on the probability of vehicle deceleration. A single-lane model without a bottleneck structure is simulated under periodic boundary conditions, and it is shown that the traffic flow simulated by the S-MVDE model generates the synchronized flow of three-phase traffic theory. Under open boundary conditions, the model is extended by adding an on-ramp; the congestion patterns caused by the bottleneck are simulated at different main-road and on-ramp flow rates and compared with the congestion patterns observed by Kerner et al., and the results are found to be consistent with the congestion characteristics of three-phase traffic flow theory.
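The S-MVDE update rules are not reproduced in this abstract, so the sketch below shows only the classic Nagel-Schreckenberg single-lane cellular automaton on a ring, the kind of baseline such models extend; the comment marks where a speed-difference-dependent slowing probability would replace the constant one (an assumption about where the modification enters, not the paper's exact rule).

```python
import numpy as np

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=None):
    """One Nagel-Schreckenberg update on a periodic single-lane road."""
    if rng is None:
        rng = np.random.default_rng()
    order = np.argsort(pos)
    pos, vel = pos[order], vel[order]
    gaps = (np.roll(pos, -1) - pos - 1) % road_len      # empty cells ahead
    vel = np.minimum(vel + 1, v_max)                    # acceleration
    vel = np.minimum(vel, gaps)                         # avoid collisions
    # Random slowing; an S-MVDE-style model would make this probability depend
    # on the speed difference with the leading vehicle (assumption, not the paper's rule).
    vel = np.where((rng.random(len(vel)) < p_slow) & (vel > 0), vel - 1, vel)
    pos = (pos + vel) % road_len                        # movement
    return pos, vel

rng = np.random.default_rng(0)
road_len, n_cars = 200, 60
pos = rng.choice(road_len, n_cars, replace=False)
vel = np.zeros(n_cars, dtype=int)
for _ in range(500):
    pos, vel = nasch_step(pos, vel, road_len, rng=rng)
print("mean speed:", vel.mean(), "density:", n_cars / road_len)
```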
Fuel efficient traffic signal operation and evaluation: Garden Grove Demonstration Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1983-02-01
The procedures and results of a case study of fuel efficient traffic signal operation and evaluation in the City of Garden Grove, California are documented. Improved traffic signal timing was developed for a 70-intersection test network in Garden Grove using an optimization tool called the TRANSYT Version 8 computer program. Full-scale field testing of five alternative timing plans was conducted using two instrumented vehicles equipped to measure traffic performance characteristics and fuel consumption. The field tests indicated that significant improvements in traffic flow and fuel consumption result from the use of timing plans generated by the TRANSYT optimization model. Changing from the pre-existing to an optimized timing plan yields a networkwide 5 percent reduction in total travel time, more than 10 percent reduction in both the number of stops and stopped delay time, and 6 percent reduction in fuel consumption. Projections are made of the benefits and costs of implementing such a program at the 20,000 traffic signals in networks throughout the State of California.
Engineering Social Justice into Traffic Control for Self-Driving Vehicles?
Mladenovic, Milos N; McPherson, Tristram
2016-08-01
The convergence of computing, sensing, and communication technology will soon permit large-scale deployment of self-driving vehicles. This will in turn permit a radical transformation of traffic control technology. This paper makes a case for the importance of addressing questions of social justice in this transformation, and sketches a preliminary framework for doing so. We explain how new forms of traffic control technology have potential implications for several dimensions of social justice, including safety, sustainability, privacy, efficiency, and equal access. Our central focus is on efficiency and equal access as desiderata for traffic control design. We explain the limitations of conventional traffic control in meeting these desiderata, and sketch a preliminary vision for a next-generation traffic control tailored to address better the demands of social justice. One component of this vision is cooperative, hierarchically distributed self-organization among vehicles. Another component of this vision is a priority system enabling selection of priority levels by the user for each vehicle trip in the network, based on the supporting structure of non-monetary credits.
NASA Technical Reports Server (NTRS)
Khambatta, Cyrus F.
2007-01-01
A technique for automated development of scenarios for use in the Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data files from the McTMA system and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.
A Small Aircraft Transportation System (SATS) Demand Model
NASA Technical Reports Server (NTRS)
Long, Dou; Lee, David; Johnson, Jesse; Kostiuk, Peter; Yackovetsky, Robert (Technical Monitor)
2001-01-01
The Small Aircraft Transportation System (SATS) demand model is a tool that will be useful for decision-makers to analyze SATS demand in both airports and airspace. We constructed a series of models following the general top-down, modular principles of systems engineering. There are three principal models: the SATS Airport Demand Model (SATS-ADM), the SATS Flight Demand Model (SATS-FDM), and LMINET-SATS. SATS-ADM models SATS operations, by aircraft type, from forecasts of fleet, configuration and performance, utilization, and traffic mixture. Given SATS airport operations such as those generated by SATS-ADM, SATS-FDM constructs the SATS origin and destination (O&D) traffic flow based on the solution of a gravity model, from which it then generates SATS flights using Monte Carlo simulation based on the departure time-of-day profile. LMINET-SATS, an extension of LMINET, models SATS demand on airspace and airports by all aircraft operations in the US. The models use parameters to provide the user with flexibility and ease of use in generating SATS demand for different scenarios. Several case studies are included to illustrate the use of the models, which are useful for identifying the need for a new air traffic management system to cope with SATS.
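A minimal sketch of the two ingredients named for SATS-FDM, a gravity model for O&D flows and Monte Carlo sampling of departure times from a time-of-day profile, is given below; the airport data, the distance-decay exponent and the departure profile are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical airports: daily SATS departures and destination attractiveness.
ops = np.array([120.0, 80.0, 50.0, 30.0])          # departures per airport
attract = np.array([1.0, 0.8, 0.5, 0.4])
dist = np.array([[0, 150, 300, 450],
                 [150, 0, 200, 350],
                 [300, 200, 0, 180],
                 [450, 350, 180, 0]], dtype=float)  # statute miles

# Gravity model: T_ij proportional to O_i * A_j / d_ij^beta, no self trips.
beta = 1.5
with np.errstate(divide="ignore"):
    f = np.where(dist > 0, dist ** -beta, 0.0)
raw = ops[:, None] * attract[None, :] * f
trips = raw / raw.sum(axis=1, keepdims=True) * ops[:, None]   # row-balanced

# Monte Carlo departure times from an assumed time-of-day profile (hour weights).
hours = np.arange(24)
profile = np.exp(-0.5 * ((hours - 9) / 3.0) ** 2) + 0.7 * np.exp(-0.5 * ((hours - 17) / 3.0) ** 2)
profile /= profile.sum()
n_flights = int(round(trips.sum()))
dep_hours = rng.choice(hours, size=n_flights, p=profile)

print(np.round(trips, 1))
print("flights generated:", n_flights, "peak departure hour:", np.bincount(dep_hours).argmax())
```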
Convective Weather Avoidance with Uncertain Weather Forecasts
NASA Technical Reports Server (NTRS)
Karahan, Sinan; Windhorst, Robert D.
2009-01-01
Convective weather events have a disruptive impact on air traffic both in terminal area and in en-route airspaces. In order to make sure that the national air transportation system is safe and efficient, it is essential to respond to convective weather events effectively. Traffic flow control initiatives in response to convective weather include ground delay, airborne delay, and miles-in-trail restrictions, as well as tactical and strategic rerouting. The rerouting initiatives can potentially increase traffic density and complexity in regions neighboring the convective weather activity. There is a need to perform rerouting in an intelligent and efficient way such that the disruptive effects of rerouting are minimized. An important area of research is to study the interaction of in-flight rerouting with traffic congestion or complexity and to develop methods that quantitatively measure this interaction. Furthermore, it is necessary to find rerouting solutions that account for uncertainties in weather forecasts. These are important steps toward managing complexity during rerouting operations, and the paper is motivated by these research questions. An automated system is developed for rerouting air traffic in order to avoid convective weather regions during the 20-minute to 2-hour time horizon. Such a system is envisioned to work in concert with separation assurance (0 to 20-minute time horizon) and longer-term air traffic management (2 hours and beyond) to provide a more comprehensive solution to complexity and safety management. In this study, weather is dynamic and uncertain; in contrast to previous studies, it is represented as regions of airspace that pilots are likely to avoid. Algorithms are implemented in an air traffic simulation environment to support the research study. The algorithms used are deterministic but periodically revise reroutes to account for weather forecast updates; the automated system periodically updates forecasts and reassesses rerouting decisions in order to account for changing weather predictions. The main objectives are to reroute flights to avoid convective weather regions and to determine the resulting complexity due to rerouting. The eventual goal is to control and reduce complexity while rerouting flights during the 20-minute to 2-hour planning period. A three-hour simulation is conducted using 4800 flights in the national airspace. The study compares several metrics against a baseline scenario using the same traffic and weather but with rerouting disabled. The results show that rerouting can have a negative impact on congestion in some sectors, as expected. The rerouting system provides accurate measurements of the resulting complexity in the congested sectors. Furthermore, although rerouting is performed only in the 20-minute to 2-hour range, it results in a 30% reduction in encounters with nowcast weather polygons (100% being the ideal for perfectly predictable and accurate weather). In the simulations, rerouting was performed for the 20-minute to 2-hour flight time horizon and for the en-route segment of air traffic. The implementation uses CWAM, a set of polygons that represent probabilities of pilot deviation around weather. The algorithms were implemented in a software-based air traffic simulation system. Initial results of the system's performance and effectiveness were encouraging.
Simulation results showed that when flights were rerouted in the 20-minute to 2-hour flight time horizon, there were fewer weather encounters in the first 20 minutes than for flights that were not rerouted. Some preliminary results also showed that rerouting will increase complexity. More simulations will be conducted in order to report conclusive results on the effects of rerouting on complexity. Thus, the 20-minute to 2-hour weather avoidance techniques performed in the simulation are expected to provide benefits for short-term weather avoidance.
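The CWAM polygons and the actual rerouting algorithm are not described here in enough detail to reproduce, so the following sketch (using the shapely geometry library) only shows the geometric core of the idea: test whether a planned segment enters an avoidance polygon and, if it does, try simple single-waypoint lateral detours. The offsets and toy polygon are assumptions.

```python
from shapely.geometry import LineString, Polygon

def segment_clear(p1, p2, polygons):
    """True if the straight segment p1->p2 avoids every avoidance polygon."""
    seg = LineString([p1, p2])
    return not any(seg.intersects(poly) for poly in polygons)

def reroute(p1, p2, polygons, offsets=(1.0, 2.0, 3.0, -1.0, -2.0, -3.0)):
    """Try single-waypoint detours offset perpendicular to the direct segment."""
    if segment_clear(p1, p2, polygons):
        return [p1, p2]
    mx, my = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    norm = (dx ** 2 + dy ** 2) ** 0.5
    for off in offsets:                       # lateral offsets (toy units)
        wp = (mx - dy / norm * off, my + dx / norm * off)
        if segment_clear(p1, wp, polygons) and segment_clear(wp, p2, polygons):
            return [p1, wp, p2]
    return None                               # no single-waypoint detour found

# One toy CWAM-style avoidance polygon between origin and destination.
weather = [Polygon([(2, 1), (4, 1), (4, 3), (2, 3)])]
print(reroute((0, 2), (6, 2), weather))
```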
NASA Astrophysics Data System (ADS)
Huang, Duruo; Du, Wenqi; Zhu, Hong
2017-10-01
In performance-based seismic design, ground-motion time histories are needed for analyzing the dynamic responses of nonlinear structural systems. However, the number of ground-motion records at the design level is often limited. In order to analyze the seismic performance of structures, ground-motion time histories need to be either selected from recorded strong-motion databases or numerically simulated using stochastic approaches. In this paper, a detailed procedure to select proper acceleration time histories from the Next Generation Attenuation (NGA) database for several cities in Taiwan is presented. Target response spectra are initially determined based on a local ground-motion prediction equation under representative deterministic seismic hazard analyses. Then several suites of ground motions are selected for these cities using the Design Ground Motion Library (DGML), a recently proposed interactive ground-motion selection tool. The selected time histories are representative of the regional seismic hazard and should be beneficial to earthquake studies when comprehensive seismic hazard assessments and site investigations are unavailable. Note that this method is also applicable to site-specific motion selection with target spectra defined near the ground surface to account for site effects.
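DGML's actual selection and scaling criteria are richer than this, but the sketch below captures the basic step of ranking candidate records by the mean squared difference between their log response spectra and a target spectrum over a period grid; the target shape and the synthetic "records" are placeholders, not NGA data.

```python
import numpy as np

def spectral_misfit(candidate_sa, target_sa):
    """Mean squared error between log spectral accelerations on the same period grid."""
    return np.mean((np.log(candidate_sa) - np.log(target_sa)) ** 2)

def select_records(candidates, target_sa, n_select=7):
    """Rank candidate records (id -> Sa array) by misfit and keep the best n."""
    scored = sorted(candidates.items(), key=lambda kv: spectral_misfit(kv[1], target_sa))
    return scored[:n_select]

# Period grid and a made-up target spectrum (stand-in for the GMPE-based target).
periods = np.linspace(0.05, 3.0, 30)
target = 0.8 * np.exp(-periods / 1.2) + 0.05

# Synthetic "candidate records": noisy scalings of plausible spectral shapes.
rng = np.random.default_rng(3)
candidates = {
    f"rec{i:03d}": (0.5 + rng.random()) * target * np.exp(rng.normal(0, 0.2, periods.size))
    for i in range(50)
}

for rec_id, sa in select_records(candidates, target, n_select=7):
    print(rec_id, f"misfit = {spectral_misfit(sa, target):.4f}")
```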
Speed and path control for conflict-free flight in high air traffic demand in terminal airspace
NASA Astrophysics Data System (ADS)
Rezaei, Ali
To accommodate the growing air traffic demand, flights will need to be planned and navigated with a much higher level of precision than today's aircraft flight paths. The Next Generation Air Transportation System (NextGen) stands to benefit significantly in safety and efficiency from such movement of aircraft along precisely defined paths. Air Traffic Operations (ATO) relying on such precision, the Precision Air Traffic Operations or PATO, are the foundation of the high throughput capacity envisioned for future airports. In PATO, the preferred method is to manage the air traffic by assigning a speed profile to each aircraft in a given fleet in a given airspace (in practice known as speed control). In this research, an algorithm has been developed, set in the context of a Hybrid Control System (HCS) model, that determines whether a speed control solution exists for a given fleet of aircraft in a given airspace and, if so, computes this solution as a collective speed profile that assures separation if executed without deviation. Uncertainties such as weather are not considered, but the algorithm can be modified to include them. The algorithm first computes all feasible sequences (i.e., all sequences that allow the given fleet of aircraft to reach their destinations without violating the FAA's separation requirement) by examining all pairs of aircraft. Then the most likely sequence is determined, and the speed control solution is constructed by backward trajectory generation, starting with the last aircraft out and proceeding to the first out. This computation can be done for different sequences in parallel, which helps to reduce the computation time. If such a solution does not exist, the algorithm calculates a minimal path modification (known as path control) that will allow separation-compliant speed control. We also prove that the algorithm modifies the path without creating a new separation violation. The new path is generated by adding new waypoints in the airspace. As a byproduct, instead of minimal path modification, one can use the aircraft arrival time schedule to generate the sequence in which the aircraft reach their destinations.
A two-stage flow-based intrusion detection model for next-generation networks.
Umer, Muhammad Fahad; Sher, Muhammad; Bi, Yaxin
2018-01-01
The next-generation network provides state-of-the-art access-independent services over converged mobile and fixed networks. Security in the converged network environment is a major challenge. Traditional packet and protocol-based intrusion detection techniques cannot be used in next-generation networks due to slow throughput, low accuracy and their inability to inspect encrypted payload. An alternative solution for protection of next-generation networks is to use network flow records for detection of malicious activity in the network traffic. The network flow records are independent of access networks and user applications. In this paper, we propose a two-stage flow-based intrusion detection system for next-generation networks. The first stage uses an enhanced unsupervised one-class support vector machine which separates malicious flows from normal network traffic. The second stage uses a self-organizing map which automatically groups malicious flows into different alert clusters. We validated the proposed approach on two flow-based datasets and obtained promising results.
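A minimal sketch of the two stages on synthetic flow features is given below: scikit-learn's one-class SVM flags anomalous flows, and a small hand-rolled self-organizing map (standing in for a full SOM implementation) groups the flagged flows into alert clusters. Feature choices, rates and parameters are assumptions, not those of the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Synthetic flow features: [log(bytes), log(packets), duration].
normal = rng.normal([8, 4, 2], 0.5, size=(500, 3))
attacks = np.vstack([rng.normal([12, 9, 0.2], 0.3, size=(30, 3)),   # flood-like
                     rng.normal([3, 1, 9.0], 0.3, size=(30, 3))])   # slow-scan-like
X = np.vstack([normal, attacks])

# Stage 1: one-class SVM trained on (assumed) clean traffic flags outliers.
ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(normal)
malicious = X[ocsvm.predict(X) == -1]

# Stage 2: a minimal self-organizing map groups the flagged flows into alert clusters.
def train_som(data, rows=2, cols=2, iters=2000, lr0=0.5, sigma0=1.0):
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    w = data[rng.choice(len(data), rows * cols)]        # init weights from data
    for t in range(iters):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))     # best matching unit
        lr = lr0 * (1 - t / iters)
        sigma = sigma0 * (1 - t / iters) + 1e-3
        h = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma ** 2))
        w += lr * h[:, None] * (x - w)                  # pull neighborhood toward x
    return w

w = train_som(malicious)
clusters = np.argmin(((malicious[:, None, :] - w[None, :, :]) ** 2).sum(axis=2), axis=1)
print("flagged flows:", len(malicious), "alert cluster sizes:", np.bincount(clusters))
```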
A PC-based magnetometer-only attitude and rate determination system for gyroless spacecraft
NASA Technical Reports Server (NTRS)
Challa, M.; Natanson, G.; Deutschmann, J.; Galal, K.
1995-01-01
This paper describes a prototype PC-based system that uses measurements from a three-axis magnetometer (TAM) to estimate the state (three-axis attitude and rates) of a spacecraft given no a priori information other than the mass properties. The system uses two algorithms that estimate the spacecraft's state - a deterministic magnetic-field only algorithm and a Kalman filter for gyroless spacecraft. The algorithms are combined by invoking the deterministic algorithm to generate the spacecraft state at epoch using a small batch of data and then using this deterministic epoch solution as the initial condition for the Kalman filter during the production run. System input comprises processed data that includes TAM and reference magnetic field data. Additional information, such as control system data and measurements from line-of-sight sensors, can be input to the system if available. Test results are presented using in-flight data from two three-axis stabilized spacecraft: Solar, Anomalous, and Magnetospheric Particle Explorer (SAMPEX) (gyroless, Sun-pointing) and Earth Radiation Budget Satellite (ERBS) (gyro-based, Earth-pointing). The results show that, using as little as 700 s of data, the system is capable of accuracies of 1.5 deg in attitude and 0.01 deg/s in rates; i.e., within SAMPEX mission requirements.
Raychaudhuri, Subhadip; Raychaudhuri, Somkanya C
2013-01-01
Apoptotic cell death is coordinated through two distinct (type 1 and type 2) intracellular signaling pathways. How the type 1/type 2 choice is made remains a central problem in the biology of apoptosis and has implications for apoptosis-related diseases and therapy. We study the problem of the type 1/type 2 choice in silico utilizing a kinetic Monte Carlo model of cell death signaling. Our results show that the type 1/type 2 choice is linked to deterministic versus stochastic cell death activation, elucidating a unique regulatory control of the apoptotic pathways. Consistent with previous findings, our results indicate that the caspase 8 activation level is a key regulator of the choice between the deterministic type 1 and stochastic type 2 pathways, irrespective of cell type. Expression levels of downstream signaling molecules also regulate the type 1/type 2 choice. A simplified model of DISC clustering elucidates the mechanism of increased active caspase 8 generation and type 1 activation in cancer cells having increased sensitivity to death receptor activation. We demonstrate that rapid deterministic activation of the type 1 pathway can selectively target such cancer cells, especially if XIAP is also inhibited, while inherent cell-to-cell variability would allow normal cells to stay protected. PMID:24709706
Deterministic influences exceed dispersal effects on hydrologically-connected microbiomes.
Graham, Emily B; Crump, Alex R; Resch, Charles T; Fansler, Sarah; Arntzen, Evan; Kennedy, David W; Fredrickson, Jim K; Stegen, James C
2017-04-01
Subsurface groundwater-surface water mixing zones (hyporheic zones) have enhanced biogeochemical activity, but assembly processes governing subsurface microbiomes remain a critical uncertainty in understanding hyporheic biogeochemistry. To address this obstacle, we investigated (a) biogeographical patterns in attached and waterborne microbiomes across three hydrologically-connected, physicochemically-distinct zones (inland hyporheic, nearshore hyporheic and river); (b) assembly processes that generated these patterns; (c) groups of organisms that corresponded to deterministic changes in the environment; and (d) correlations between these groups and hyporheic metabolism. All microbiomes remained dissimilar through time, but consistent presence of similar taxa suggested dispersal and/or common selective pressures among zones. Further, we demonstrated a pronounced impact of deterministic assembly in all microbiomes as well as seasonal shifts from heterotrophic to autotrophic microorganisms associated with increases in groundwater discharge. The abundance of one statistical cluster of organisms increased with active biomass and respiration, revealing organisms that may strongly influence hyporheic biogeochemistry. Based on our results, we propose a conceptualization of hyporheic zone metabolism in which increased organic carbon concentrations during surface water intrusion support heterotrophy, which succumbs to autotrophy under groundwater discharge. These results provide new opportunities to enhance microbially-explicit ecosystem models describing hyporheic zone biogeochemistry and its influence over riverine ecosystem function. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.
Deterministic delivery of remote entanglement on a quantum network.
Humphreys, Peter C; Kalb, Norbert; Morits, Jaco P J; Schouten, Raymond N; Vermeulen, Raymond F L; Twitchen, Daniel J; Markham, Matthew; Hanson, Ronald
2018-06-01
Large-scale quantum networks promise to enable secure communication, distributed quantum computing, enhanced sensing and fundamental tests of quantum mechanics through the distribution of entanglement across nodes [1-7]. Moving beyond current two-node networks [8-13] requires the rate of entanglement generation between nodes to exceed the decoherence (loss) rate of the entanglement. If this criterion is met, intrinsically probabilistic entangling protocols can be used to provide deterministic remote entanglement at pre-specified times. Here we demonstrate this using diamond spin qubit nodes separated by two metres. We realize a fully heralded single-photon entanglement protocol that achieves entangling rates of up to 39 hertz, three orders of magnitude higher than previously demonstrated two-photon protocols on this platform [14]. At the same time, we suppress the decoherence rate of remote-entangled states to five hertz through dynamical decoupling. By combining these results with efficient charge-state control and mitigation of spectral diffusion, we deterministically deliver a fresh remote state with an average entanglement fidelity of more than 0.5 at every clock cycle of about 100 milliseconds without any pre- or post-selection. These results demonstrate a key building block for extended quantum networks and open the door to entanglement distribution across multiple remote nodes.
Population density equations for stochastic processes with memory kernels
NASA Astrophysics Data System (ADS)
Lai, Yi Ming; de Kamps, Marc
2017-06-01
We present a method for solving population density equations (PDEs), a mean-field technique describing homogeneous populations of uncoupled neurons, where the populations can be subject to non-Markov noise for arbitrary distributions of jump sizes. The method combines recent developments in two different disciplines that traditionally have had limited interaction: computational neuroscience and the theory of random networks. The method uses a geometric binning scheme, based on the method of characteristics, to capture the deterministic neurodynamics of the population, separating the deterministic and stochastic processes cleanly. We can independently vary the choice of the deterministic model and the model for the stochastic process, leading to a highly modular numerical solution strategy. We demonstrate this by replacing the master equation implicit in many formulations of the PDE formalism by a generalization called the generalized Montroll-Weiss equation, a recent result from random network theory that describes a random walker subject to transitions realized by a non-Markovian process. We demonstrate the method for leaky and quadratic integrate-and-fire neurons subject to spike trains with Poisson and gamma-distributed interspike intervals. We are able to model jump responses for both models accurately for both excitatory and inhibitory input under the assumption that all inputs are generated by one renewal process.
Using MCBEND for neutron or gamma-ray deterministic calculations
NASA Astrophysics Data System (ADS)
Geoff, Dobson; Adam, Bird; Brendan, Tollit; Paul, Smith
2017-09-01
MCBEND 11 is the latest version of the general radiation transport Monte Carlo code from AMEC Foster Wheeler's ANSWERS® Software Service. MCBEND is well established in the UK shielding community for radiation shielding and dosimetry assessments. MCBEND supports a number of acceleration techniques, for example the use of an importance map in conjunction with Splitting/Russian Roulette. MCBEND has a well-established automated tool to generate this importance map, commonly referred to as the MAGIC module, which uses a diffusion adjoint solution. This method is fully integrated with the MCBEND geometry and material specification, and can easily be run as part of a normal MCBEND calculation. An often overlooked feature of MCBEND is the ability to use this method for forward scoping calculations, which can be run as a very quick deterministic method. Additionally, the development of the Visual Workshop environment for results display provides new capabilities for the use of the forward calculation as a productivity tool. In this paper, we illustrate the use of the combination of the old and the new in order to provide an enhanced analysis capability. We also explore the use of more advanced deterministic methods for scoping calculations used in conjunction with MCBEND, with a view to providing a suite of methods to accompany the main Monte Carlo solver.
Distribution and regulation of stochasticity and plasticity in Saccharomyces cerevisiae
Dar, R. D.; Karig, D. K.; Cooke, J. F.; ...
2010-09-01
Stochasticity is an inherent feature of complex systems with nanoscale structure. In such systems information is represented by small collections of elements (e.g. a few electrons on a quantum dot), and small variations in the populations of these elements may lead to big uncertainties in the information. Unfortunately, little is known about how to work within this inherently noisy environment to design robust functionality into complex nanoscale systems. Here, we look to the biological cell as an intriguing model system where evolution has mediated the trade-offs between fluctuations and function, and in particular we look at the relationships and trade-offs between stochastic and deterministic responses in the gene expression of budding yeast (Saccharomyces cerevisiae). We find gene regulatory arrangements that control the stochastic and deterministic components of expression, and show that genes that have evolved to respond to stimuli (stress) in the most strongly deterministic way exhibit the most noise in the absence of the stimuli. We show that this relationship is consistent with a bursty 2-state model of gene expression, and demonstrate that this regulatory motif generates the most uncertainty in gene expression when there is the greatest uncertainty in the optimal level of gene expression.
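The bursty two-state (telegraph) model referred to here can be simulated directly with the Gillespie algorithm; the sketch below uses assumed rate constants chosen to sit in a bursty regime and reports the Fano factor as a simple noise measure (event-sampled for brevity; a time-weighted average would be more rigorous).

```python
import numpy as np

def gillespie_telegraph(k_on, k_off, k_tx, k_deg, t_end, rng):
    """Two-state (telegraph) gene expression: promoter OFF<->ON, ON makes mRNA."""
    t, state, m = 0.0, 0, 0            # time, promoter state (0=OFF, 1=ON), mRNA count
    times, counts = [0.0], [0]
    while t < t_end:
        rates = np.array([k_on * (state == 0), k_off * (state == 1),
                          k_tx * (state == 1), k_deg * m])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)
        r = rng.choice(4, p=rates / total)
        if r == 0:   state = 1          # promoter turns ON
        elif r == 1: state = 0          # promoter turns OFF
        elif r == 2: m += 1             # transcription
        else:        m -= 1             # mRNA degradation
        times.append(t); counts.append(m)
    return np.array(times), np.array(counts)

rng = np.random.default_rng(7)
# Bursty regime: slow promoter switching, strong transcription when ON (assumed rates).
_, m = gillespie_telegraph(k_on=0.05, k_off=0.5, k_tx=20.0, k_deg=1.0, t_end=2000, rng=rng)
burst = m[len(m) // 10:]                # discard initial transient
print(f"mean = {burst.mean():.1f}, Fano factor = {burst.var() / burst.mean():.1f}")
```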
Soares, Helena; Lasserre, Rémi; Alcover, Andrés
2013-11-01
Immunological synapses are specialized cell-cell contacts formed between T lymphocytes and antigen-presenting cells. They are induced upon antigen recognition and are crucial for T-cell activation and effector functions. The generation and function of immunological synapses depend on an active T-cell polarization process, which results from a finely orchestrated crosstalk between the antigen receptor signal transduction machinery, the actin and microtubule cytoskeletons, and controlled vesicle traffic. Although we understand how some of these particular events are regulated, we still lack knowledge on how these multiple cellular elements are harmonized to ensure appropriate T-cell responses. We discuss here our view on how T-cell receptor signal transduction initially commands cytoskeletal and vesicle traffic polarization, which in turn sets the immunological synapse molecular design that regulates T-cell activation. We also discuss how the human immunodeficiency virus (HIV-1) hijacks some of these processes impairing immunological synapse generation and function. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Brand, Stephan; Petri, Maximilian; Haas, Philipp; Krettek, Christian; Haasper, Carl
2013-01-01
Due to resource scarcity, the number of low-noise and electric cars is expected to increase rapidly. The frequent use of these cars will lead to a significant reduction of traffic-related noise and pollution. On the other hand, because vulnerable road users have adapted to and been conditioned by the sound of conventional vehicles, the number of traffic accidents involving pedestrians and bicyclists is postulated to increase as well. Children, older people with reduced eyesight and the blind are especially reliant on a combination of acoustic and visual warning signals from approaching or accelerating vehicles. This is even more evident in urban areas, where the engine sound is the dominating sound up to 30 kph (kilometres per hour); above this, tyre-road interaction is the main cause of traffic noise. With the typical engine sound missing, a new sound design is necessary to prevent traffic accidents in urban areas. Drivers should not be able to switch the sound generator off.
The impact of image storage organization on the effectiveness of PACS.
Hindel, R
1990-11-01
A picture archiving and communication system (PACS) requires efficient handling of large amounts of data. Mass storage systems are cost effective but slow, while very fast systems, like frame buffers and parallel transfer disks, are expensive. The image traffic can be divided into inbound traffic generated by diagnostic modalities and outbound traffic to workstations. At the contact points with medical professionals, the responses must be fast. Archiving, on the other hand, can employ slower but less expensive storage systems, provided that the primary activities are not impeded. This article illustrates a segmentation architecture meeting these requirements based on a clearly defined PACS concept.
Chance of Necessity: Modeling Origins of Life
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The fundamental nature of processes that led to the emergence of life has been a subject of long-standing debate. One view holds that the origin of life is an event governed by chance, and the result of so many random events is unpredictable. This view was eloquently expressed by Jacques Monod in his book Chance and Necessity. In an alternative view, the origin of life is considered a deterministic event. Its details need not be deterministic in every respect, but the overall behavior is predictable. A corollary to the deterministic view is that the emergence of life must have been determined primarily by universal chemistry and biochemistry rather than by subtle details of environmental conditions. In my lecture I will explore two different paradigms for the emergence of life and discuss their implications for the predictability and universality of life-forming processes. The dominant approach is that the origin of life was guided by information stored in nucleic acids (the RNA World hypothesis). In this view, selection of improved combinations of nucleic acids obtained through random mutations drove the evolution of biological systems from their conception. An alternative hypothesis states that the formation of protocellular metabolism was driven by non-genomic processes. Even though these processes were highly stochastic, the outcome was largely deterministic, strongly constrained by the laws of chemistry. I will argue that self-replication of macromolecules was not required at the early stages of evolution; the reproduction of cellular functions alone was sufficient for the self-maintenance of protocells. In fact, the precise transfer of information between successive generations of the earliest protocells was unnecessary and could have impeded the discovery of cellular metabolism. I will also show that such concepts as speciation and fitness to the environment, developed in the context of genomic evolution, also hold in the absence of a genome.
NASA Astrophysics Data System (ADS)
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre
2015-04-01
Sea surface height, sea surface temperature and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histograms, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with observations used in the assimilation experiments and with independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. Regarding the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. Regarding the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement brought by the assimilation is demonstrated using these validation metrics. Finally, the deterministic and probabilistic validations are analysed jointly, and their consistency and complementarity are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
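For reference, the empirical CRPS of an m-member ensemble x against an observation y can be computed as mean|x_i - y| - (1/2) mean|x_i - x_j|; the sketch below applies it to synthetic forecasts to show that a biased ensemble scores worse than a reliable one. The data are invented, not the North Atlantic experiment output.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS of one ensemble forecast against a scalar observation."""
    members = np.asarray(members, float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

rng = np.random.default_rng(5)
n_times, n_members = 200, 60

# Synthetic "truth" (e.g., an SSH anomaly time series) and two 60-member ensembles.
truth = np.sin(np.linspace(0, 6 * np.pi, n_times))
reliable = truth[:, None] + rng.normal(0, 0.2, (n_times, n_members))
biased = truth[:, None] + 0.3 + rng.normal(0, 0.2, (n_times, n_members))

for name, ens in [("reliable ensemble", reliable), ("biased ensemble", biased)]:
    crps = np.mean([crps_ensemble(ens[t], truth[t]) for t in range(n_times)])
    print(f"{name:18s} mean CRPS = {crps:.3f}")
```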
Cao, Pengxing; Tan, Xiahui; Donovan, Graham; Sanderson, Michael J; Sneyd, James
2014-08-01
The inositol trisphosphate receptor ([Formula: see text]) is one of the most important cellular components responsible for oscillations in the cytoplasmic calcium concentration. Over the past decade, two major questions about the [Formula: see text] have arisen. Firstly, how best should the [Formula: see text] be modeled? In other words, what fundamental properties of the [Formula: see text] allow it to perform its function, and what are their quantitative properties? Secondly, although calcium oscillations are caused by the stochastic opening and closing of small numbers of [Formula: see text], is it possible for a deterministic model to be a reliable predictor of calcium behavior? Here, we answer these two questions, using airway smooth muscle cells (ASMC) as a specific example. Firstly, we show that periodic calcium waves in ASMC, as well as the statistics of calcium puffs in other cell types, can be quantitatively reproduced by a two-state model of the [Formula: see text], and thus the behavior of the [Formula: see text] is essentially determined by its modal structure. The structure within each mode is irrelevant for function. Secondly, we show that, although calcium waves in ASMC are generated by a stochastic mechanism, [Formula: see text] stochasticity is not essential for a qualitative prediction of how oscillation frequency depends on model parameters, and thus deterministic [Formula: see text] models demonstrate the same level of predictive capability as do stochastic models. We conclude that, firstly, calcium dynamics can be accurately modeled using simplified [Formula: see text] models, and, secondly, to obtain qualitative predictions of how oscillation frequency depends on parameters it is sufficient to use a deterministic model.
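For intuition on the deterministic-versus-stochastic comparison, a toy two-state (closed/open) gating model can be simulated both ways; the rates below are placeholders, and the receptor model in the paper has a richer modal structure than this sketch:

```python
import numpy as np

k_open, k_close = 2.0, 8.0   # transition rates in 1/s, hypothetical
T, dt = 5.0, 1e-3            # simulate 5 s with a 1 ms step

# Deterministic open probability: dp/dt = k_open*(1-p) - k_close*p
p = 0.0
for _ in range(int(T / dt)):
    p += dt * (k_open * (1 - p) - k_close * p)
print("deterministic steady-state open probability:", round(p, 3))
print("analytic value:", round(k_open / (k_open + k_close), 3))

# Stochastic version: N independent channels flipping with the same rates
rng = np.random.default_rng(1)
N = 50
state = np.zeros(N, dtype=bool)          # all channels closed initially
open_fraction = []
for _ in range(int(T / dt)):
    u = rng.random(N)
    flip_open = (~state) & (u < k_open * dt)    # closed -> open
    flip_close = state & (u < k_close * dt)     # open -> closed
    state = state ^ flip_open ^ flip_close
    open_fraction.append(state.mean())
second_half = open_fraction[len(open_fraction) // 2:]
print("stochastic mean open fraction:", round(float(np.mean(second_half)), 3))
```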
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low resolution NWP model, in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden, while providing accurate deterministic estimates and reliable probabilistic assessments.
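A minimal sketch of the analog-ensemble idea, reduced to a single predictor with synthetic data (operational implementations match multivariate predictor windows rather than one variable):

```python
import numpy as np

def analog_ensemble(coarse_hist, obs_hist, coarse_now, k=20):
    """Build a k-member ensemble for the current coarse forecast by finding the
    k most similar coarse values in the historical record and returning the
    site observations that accompanied them (the 'analogs')."""
    dist = np.abs(coarse_hist - coarse_now)
    idx = np.argsort(dist)[:k]
    return obs_hist[idx]

# Synthetic example: coarse reanalysis wind speed vs. site observations (m/s)
rng = np.random.default_rng(2)
coarse_hist = rng.gamma(shape=2.0, scale=3.0, size=5000)
obs_hist = 1.1 * coarse_hist + rng.normal(0.0, 1.0, size=5000)  # hypothetical bias + noise

coarse_now = 7.5
members = analog_ensemble(coarse_hist, obs_hist, coarse_now)
print("deterministic estimate (ensemble mean):", round(members.mean(), 2), "m/s")
print("10th-90th percentile band:", np.round(np.percentile(members, [10, 90]), 2))
```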
Comparison of three controllers applied to helicopter vibration
NASA Technical Reports Server (NTRS)
Leyland, Jane A.
1992-01-01
A comparison was made of the applicability and suitability of the deterministic controller, the cautious controller, and the dual controller for the reduction of helicopter vibration by using higher harmonic blade pitch control. A randomly generated linear plant model was assumed and the performance index was defined to be a quadratic output metric of this linear plant. A computer code, designed to check out and evaluate these controllers, was implemented and used to accomplish this comparison. The effects of random measurement noise, the initial estimate of the plant matrix, and the plant matrix propagation rate were determined for each of the controllers. With few exceptions, the deterministic controller yielded the greatest vibration reduction (as characterized by the quadratic output metric) and operated with the greatest reliability. Theoretical limitations of these controllers were defined and appropriate candidate alternative methods, including one method particularly suitable to the cockpit, were identified.
NASA Astrophysics Data System (ADS)
Baumann, Erwin W.; Williams, David L.
1993-08-01
Artificial neural networks capable of learning and recalling stochastic associations between non-deterministic quantities have received relatively little attention to date. One potential application of such stochastic associative networks is the generation of sensory 'expectations' based on arbitrary subsets of sensor inputs to support anticipatory and investigative behavior in sensor-based robots. Another application of this type of associative memory is the prediction of how a scene will look in one spectral band, including noise, based upon its appearance in several other wavebands. This paper describes a semi-supervised neural network architecture composed of self-organizing maps associated through stochastic inter-layer connections. This 'Stochastic Associative Memory' (SAM) can learn and recall non-deterministic associations between multi-dimensional probability density functions. The stochastic nature of the network also enables it to represent noise distributions that are inherent in any true sensing process. The SAM architecture, training process, and initial application to sensor image prediction are described. Relationships to Fuzzy Associative Memory (FAM) are discussed.
Hyperchaotic Dynamics for Light Polarization in a Laser Diode
NASA Astrophysics Data System (ADS)
Bonatto, Cristian
2018-04-01
It is shown that a highly randomlike behavior of light polarization states in the output of a free-running laser diode, covering the whole Poincaré sphere, arises from a fully deterministic nonlinear process, characterized by a hyperchaotic dynamics of two polarization modes nonlinearly coupled with a semiconductor medium inside the optical cavity. A number of statistical distributions were found to describe the deterministic data of the low-dimensional nonlinear flow: a lognormal distribution for the light intensity, Gaussian distributions for the electric field components and electron densities, and Rice and Rayleigh distributions and Weibull and negative exponential distributions for the modulus and intensity, respectively, of the orthogonal linear components of the electric field. The presented results could be relevant for the generation of single units of compact light-source devices to be used in low-dimensional optical hyperchaos-based applications.
A Surrogate Technique for Investigating Deterministic Dynamics in Discrete Human Movement.
Taylor, Paul G; Small, Michael; Lee, Kwee-Yum; Landeo, Raul; O'Meara, Damien M; Millett, Emma L
2016-10-01
Entropy is an effective tool for the investigation of human movement variability. However, before applying entropy, it can be beneficial to employ analyses to confirm that observed data are not solely the result of stochastic processes. This can be achieved by contrasting observed data with data produced using surrogate methods. Unlike for continuous movement, no appropriate method has been applied to discrete human movement. This article proposes a novel surrogate method for discrete movement data, outlining the processes for determining its critical values. The proposed technique reliably generated surrogates for discrete joint angle time series, destroying the fine-scale dynamics of the observed signal while maintaining its macro-structural characteristics. Comparison of entropy estimates indicated that observed signals had greater regularity than surrogates and were the result not only of stochastic but also of deterministic processes. The proposed surrogate method is both a valid and reliable technique to investigate determinism in other discrete human movement time series.
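To illustrate the general surrogate-testing logic (here with a plain random-shuffle surrogate and an approximate sample entropy, which is simpler than the method proposed in the article):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Approximate SampEn(m, r) of a short 1-D series (O(n^2) memory)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def pair_count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        n = len(templates)
        return (np.sum(d <= r) - n) / 2        # matching pairs, self-matches excluded
    B, A = pair_count(m), pair_count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
t = np.linspace(0, 8 * np.pi, 400)
observed = np.sin(t) + 0.1 * rng.normal(size=t.size)   # structured 'joint angle' series

# Random-shuffle surrogate: destroys temporal structure, keeps the amplitude distribution
surrogate = rng.permutation(observed)

print("SampEn observed :", round(sample_entropy(observed), 3))
print("SampEn surrogate:", round(sample_entropy(surrogate), 3))
# Lower entropy (greater regularity) in the observed series than in its surrogates
# is taken as evidence of deterministic structure.
```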
No-go theorem for passive single-rail linear optical quantum computing.
Wu, Lian-Ao; Walther, Philip; Lidar, Daniel A
2013-01-01
Photonic quantum systems are among the most promising architectures for quantum computers. It is well known that for dual-rail photons effective non-linearities and near-deterministic non-trivial two-qubit gates can be achieved via the measurement process and by introducing ancillary photons. While in principle this opens a legitimate path to scalable linear optical quantum computing, the technical requirements are still very challenging and thus other optical encodings are being actively investigated. One of the alternatives is to use single-rail encoded photons, where entangled states can be deterministically generated. Here we prove that even for such systems universal optical quantum computing using only passive optical elements such as beam splitters and phase shifters is not possible. This no-go theorem proves that photon bunching cannot be passively suppressed even when extra ancilla modes and an arbitrary number of photons are used. Our result provides useful guidance for the design of optical quantum computers.
Surpassing the no-cloning limit with a heralded hybrid linear amplifier for coherent states
Haw, Jing Yan; Zhao, Jie; Dias, Josephine; Assad, Syed M.; Bradshaw, Mark; Blandino, Rémi; Symul, Thomas; Ralph, Timothy C.; Lam, Ping Koy
2016-01-01
The no-cloning theorem states that an unknown quantum state cannot be cloned exactly and deterministically due to the linearity of quantum mechanics. Associated with this theorem is the quantitative no-cloning limit that sets an upper bound to the quality of the generated clones. However, this limit can be circumvented by abandoning determinism and using probabilistic methods. Here, we report an experimental demonstration of probabilistic cloning of arbitrary coherent states that clearly surpasses the no-cloning limit. Our scheme is based on a hybrid linear amplifier that combines an ideal deterministic linear amplifier with a heralded measurement-based noiseless amplifier. We demonstrate the production of up to five clones with the fidelity of each clone clearly exceeding the corresponding no-cloning limit. Moreover, since successful cloning events are heralded, our scheme has the potential to be adopted in quantum repeater, teleportation and computing applications. PMID:27782135
NASA Astrophysics Data System (ADS)
Fung, Chi-Hang Fred; Ma, Xiongfeng; Chau, H. F.; Cai, Qing-Yu
2012-03-01
Privacy amplification (PA) is an essential postprocessing step in quantum key distribution (QKD) for removing any information an eavesdropper may have on the final secret key. In this paper, we consider delaying PA of the final key after its use in one-time pad encryption and prove its security. We prove that the security and the key generation rate are not affected by delaying PA. Delaying PA has two applications: it serves as a tool for significantly simplifying the security proof of QKD with a two-way quantum channel, and also it is useful in QKD networks with trusted relays. To illustrate the power of the delayed PA idea, we use it to prove the security of a qubit-based two-way deterministic QKD protocol which uses four states and four encoding operations.
Deterministic secure quantum communication using a single d-level system.
Jiang, Dong; Chen, Yuanyuan; Gu, Xuemei; Xie, Ling; Chen, Lijun
2017-03-22
Deterministic secure quantum communication (DSQC) can transmit secret messages between two parties without first generating a shared secret key. Compared with quantum key distribution (QKD), DSQC avoids the waste of qubits arising from basis reconciliation and thus reaches higher efficiency. In this paper, based on data block transmission and order rearrangement technologies, we propose a DSQC protocol. It utilizes a set of single d-level systems as message carriers, which are used to directly encode the secret message in one communication process. Theoretical analysis shows that these employed technologies guarantee the security, and the use of a higher dimensional quantum system makes our protocol achieve higher security and efficiency. Since only quantum memory is required for implementation, our protocol is feasible with current technologies. Furthermore, Trojan horse attack (THA) is taken into account in our protocol. We give a THA model and show that THA significantly increases the multi-photon rate and can thus be detected.
He, Yu-Ming; Liu, Jin; Maier, Sebastian; Emmerling, Monika; Gerhardt, Stefan; Davanço, Marcelo; Srinivasan, Kartik; Schneider, Christian; Höfling, Sven
2017-07-20
Deterministic techniques enabling the implementation and engineering of bright and coherent solid-state quantum light sources are key for the reliable realization of a next generation of quantum devices. Such a technology, at best, should allow one to significantly scale up the number of implemented devices within a given processing time. In this work, we discuss a possible technology platform for such a scaling procedure, relying on the application of nanoscale quantum dot imaging to the pillar microcavity architecture, which promises to combine very high photon extraction efficiency and indistinguishability. We discuss the alignment technology in detail, and present the optical characterization of a selected device which features a strongly Purcell-enhanced emission output. This device, which yields an extraction efficiency of η = (49 ± 4) %, facilitates the emission of photons with (94 ± 2.7) % indistinguishability.
Health risk assessment of inorganic arsenic intake of Ronphibun residents via duplicate diet study.
Saipan, Piyawat; Ruangwises, Suthep
2009-06-01
To assess the health risk from exposure to inorganic arsenic in Ronphibun residents via the duplicate portion sampling method. One hundred and forty samples (140 subject-days) were collected from participants in Ronphibun sub-district. Inorganic arsenic in the duplicate diet samples was determined by acid digestion and hydride generation-atomic absorption spectrometry. Deterministic risk assessment is referenced throughout the present paper using United States Environmental Protection Agency (U.S. EPA) guidelines. The average daily dose and lifetime average daily dose of inorganic arsenic via the duplicate diet were 0.0021 mg/kg/d and 0.00084 mg/kg/d, respectively. The risk estimate in terms of the hazard quotient was 6.98 and the cancer risk was 1.26 x 10(-3). The deterministic risk characterization showed that both the hazard quotient and the cancer risk from exposure to inorganic arsenic in duplicate diets exceeded the safety levels of 1 for the hazard quotient and 1 x 10(-4) for cancer risk.
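The reported figures can be reproduced with the standard U.S. EPA deterministic equations, assuming the usual IRIS toxicity values for inorganic arsenic (the reference dose and slope factor below are assumptions made for this sketch, not values stated in the abstract):

```python
# Deterministic risk equations: HQ = ADD / RfD, cancer risk = LADD * CSF
ADD  = 0.0021    # average daily dose, mg/kg/d (reported)
LADD = 0.00084   # lifetime average daily dose, mg/kg/d (reported)
RfD  = 0.0003    # oral reference dose for inorganic arsenic, mg/kg/d (assumed)
CSF  = 1.5       # oral cancer slope factor, (mg/kg/d)^-1 (assumed)

hazard_quotient = ADD / RfD      # = 7.0, close to the reported 6.98
cancer_risk     = LADD * CSF     # = 1.26e-3, matching the reported value

print(f"HQ = {hazard_quotient:.2f} (safety level 1)")
print(f"cancer risk = {cancer_risk:.2e} (safety level 1e-4)")
```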
Unifying Complexity and Information
NASA Astrophysics Data System (ADS)
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a role as fundamental as that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of the adaptability are not yet clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary between different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately solves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.
Deterministic control of radiative processes by shaping the mode field
NASA Astrophysics Data System (ADS)
Pellegrino, D.; Pagliano, F.; Genco, A.; Petruzzella, M.; van Otten, F. W.; Fiore, A.
2018-04-01
Quantum dots (QDs) interacting with confined light fields in photonic crystal cavities represent a scalable light source for the generation of single photons and laser radiation in the solid-state platform. The complete control of light-matter interaction in these sources is needed to fully exploit their potential, but it has been challenging due to the small length scales involved. In this work, we experimentally demonstrate the control of the radiative interaction between InAs QDs and one mode of three coupled nanocavities. By non-locally moulding the mode field experienced by the QDs inside one of the cavities, we are able to deterministically tune, and even inhibit, the spontaneous emission into the mode. The presented method will enable the real-time switching of Rabi oscillations, the shaping of the temporal waveform of single photons, and the implementation of unexplored nanolaser modulation schemes.
NASA Technical Reports Server (NTRS)
Anderson, W. W.; Will, R. W.; Grantham, C.
1972-01-01
A concept for automating the control of air traffic in the terminal area in which the primary man-machine interface is the cockpit is described. The ground and airborne inputs required for implementing this concept are discussed. Digital data link requirements of 10,000 bits per second are explained. A particular implementation of this concept including a sequencing and separation algorithm which generates flight paths and implements a natural order landing sequence is presented. Onboard computer/display avionics utilizing a traffic situation display is described. A preliminary simulation of this concept has been developed which includes a simple, efficient sequencing algorithm and a complete aircraft dynamics model. This simulated jet transport was flown through automated terminal-area traffic situations by pilots using relatively sophisticated displays, and pilot performance and observations are discussed.
Air Traffic Sector Configuration Change Frequency
NASA Technical Reports Server (NTRS)
Chatterji, Gano Broto; Drew, Michael
2009-01-01
Several techniques for partitioning airspace have been developed in the literature. Whether a region of airspace created by such methods can be used with other days of traffic, and the number of times a different partition is needed during the day, are examined in this paper. Both these aspects are examined for the Fort Worth Center airspace sectors. A Mixed Integer Linear Programming method is used with actual air traffic data from ten high-volume, low-weather-delay days for creating sectors. Nine solutions were obtained for each two-hour period of the day by partitioning the center airspace into two through 18 sectors in steps of two sectors. Actual track data were played back with the generated partitions for creating histograms of the traffic counts. The best partition for each two-hour period was then identified based on the nine traffic-count distributions. The numbers of sectors in such partitions were analyzed to determine the number of times a different configuration is needed during the day. One to three partitions were selected for the 24-hour period, and traffic data from ten days were played back to test whether the traffic counts stayed below the threshold values associated with these partitions. Results show that these partitions are robust and can be used for longer durations than they were designed for.
CTAS: Computer intelligence for air traffic control in the terminal area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1992-01-01
A system for the automated management and control of arrival traffic, referred to as the Center-TRACON Automation System (CTAS), has been designed by the ATC research group at NASA Ames Research Center. In a cooperative program, NASA and the FAA have efforts underway to install and evaluate the system at the Denver and Dallas/Ft. Worth airports. CTAS consists of three types of integrated tools that provide computer-generated intelligence for both Center and TRACON controllers to guide them in managing and controlling arrival traffic efficiently. One tool, the Traffic Management Advisor (TMA), establishes optimized landing sequences and landing times for aircraft arriving in the center airspace several hundred miles from the airport. In the TRACON, TMA resequences missed-approach aircraft and unanticipated arrivals. Another tool, the Descent Advisor (DA), generates clearances for the center controllers to deliver aircraft to the TRACON boundary at the crossing times provided by TMA. In the TRACON, the Final Approach Spacing Tool (FAST) provides heading and speed clearances that produce an accurately spaced flow of aircraft on the final approach course. A database consisting of aircraft performance models, airline-preferred operational procedures, and real-time wind measurements contributes to the effective operation of CTAS. Extensive simulator evaluations of CTAS have demonstrated controller acceptance, delay reductions, and fuel savings.
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.; Hansman, R. John
1996-01-01
An experimental flight simulator study was conducted to examine the mental alerting logic and thresholds used by subjects to issue an alert and execute an avoidance maneuver. Subjects flew a series of autopilot landing approaches with traffic on a closely-spaced parallel approach; during some runs, the traffic would deviate towards the subject, and the subject was to indicate the point when they recognized the potential traffic conflict, and then indicate a direction of flight for an avoidance maneuver. A variety of subjects, including graduate students, general aviation pilots and airline pilots, were tested. Five traffic displays were evaluated, with a moving-map TCAS-type traffic display as a baseline. A side-task created both high and low workload situations. Subjects appeared to use the lateral deviation of the intruder aircraft from its approach path as the criterion for an alert regardless of the display available. However, with displays showing heading and/or trend information, their alerting thresholds were significantly lowered. This type of range-only schema still resulted in many near misses, as a high convergence rate was often established by the time of the subject's alert. Therefore, the properties of the intruder's trajectory had the greatest effect on the resultant near-miss rate; no display system reliably caused alerts timely enough for certain collision avoidance. Subjects' performance dropped significantly on a side-task while they analyzed the need for an alert, showing that alert generation can be a high-workload situation at critical times. No variation was found between subjects with and without piloting experience. These results suggest the design of automatic alerting systems should take into account the range-type alerting schema used by the human, such that the rationale for the automatic alert should be obvious to, and trusted by, the operator. Although careful display design may help generate pilot/automation trust, issues such as user non-conformance to automatically generated commands can remain a possibility.
NASA Technical Reports Server (NTRS)
Fleming, Cody Harrison; Spencer, Melissa; Leveson, Nancy; Wilkinson, Chris
2012-01-01
The generation of minimum operational, safety, performance, and interoperability requirements is an important aspect of safely integrating new NextGen components into the Communication Navigation Surveillance and Air Traffic Management (CNS/ATM) system. These requirements are used as part of the implementation and approval processes. In addition, they provide guidance to determine the levels of design assurance and performance that are needed for each element of the new NextGen procedures, including aircraft, operator, and Air Navigation and Service Provider. Using the enhanced Airborne Traffic Situational Awareness for InTrail Procedure (ATSA-ITP) as an example, this report describes some limitations of the current process used for generating safety requirements and levels of required design assurance. An alternative process is described, as well as the argument for why the alternative can generate more comprehensive requirements and greater safety assurance than the current approach.
Liu, Xiaofeng; Bai, Fang; Ouyang, Sisheng; Wang, Xicheng; Li, Honglin; Jiang, Hualiang
2009-03-31
Conformation generation is a ubiquitous problem in molecular modelling. Many applications require sampling the broad molecular conformational space or perceiving the bioactive conformers to ensure success. Numerous in silico methods have been proposed in an attempt to resolve the problem, ranging from deterministic to non-deterministic and from systematic to stochastic ones. In this work, we describe an efficient conformation sampling method named Cyndi, which is based on a multi-objective evolutionary algorithm (MOEA). The conformational perturbation is subjected to evolutionary operations on the genome encoded with dihedral torsions. Various objectives are designated to render the generated Pareto-optimal conformers energy-favoured as well as evenly scattered across the conformational space. An optional objective concerning the degree of molecular extension is added to achieve geometrically extended or compact conformations, which have been observed to impact molecular bioactivity (J Comput-Aided Mol Des 2002, 16: 105-112). Testing the performance of Cyndi against a test set consisting of 329 small molecules reveals an average minimum RMSD of 0.864 Å to the corresponding bioactive conformations, indicating Cyndi is highly competitive against other conformation generation methods. Meanwhile, the high-speed performance (0.49 +/- 0.18 seconds per molecule) renders Cyndi a practical toolkit for conformational database preparation and facilitates subsequent pharmacophore mapping or rigid docking. A copy of the precompiled executable of Cyndi and the test set molecules in mol2 format are accessible in Additional file 1. On the basis of the MOEA, we present a new, highly efficient conformation generation method, Cyndi, and report the results of validation and performance studies comparing it with four other methods. The results reveal that Cyndi is capable of generating geometrically diverse conformers and outperforms the four other multiple-conformer generators in reproducing the bioactive conformations of the 329 structures. The speed advantage indicates Cyndi is a powerful alternative method for extensive conformational sampling and large-scale conformer database preparation.
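A minimal sketch of the Pareto (non-dominated) selection idea underlying a multi-objective conformer generator; the objectives and numbers below are illustrative, not Cyndi's actual scoring:

```python
import numpy as np

def pareto_front(objectives):
    """Indices of non-dominated rows (all objectives are to be minimized)."""
    obj = np.asarray(objectives, dtype=float)
    keep = []
    for i, row in enumerate(obj):
        # row i is dominated if some other row is <= everywhere and < somewhere
        dominated = np.any(np.all(obj <= row, axis=1) & np.any(obj < row, axis=1))
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical conformers scored on (strain energy, negative diversity); a real
# generator would obtain these from its force field and an RMSD-based spread.
rng = np.random.default_rng(4)
energy = rng.normal(10.0, 2.0, size=50)     # kcal/mol, illustrative
neg_diversity = -rng.random(50)             # minimize negative diversity = maximize diversity
front = pareto_front(np.column_stack([energy, neg_diversity]))
print("Pareto-optimal conformer indices:", front)
```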
Automatic Traffic-Based Internet Control Message Protocol (ICMP) Model Generation for ns-3
2015-12-01
...through visiting the inferred automata; fuzzing of an implementation by generating altered message formats. We tested with 3 versions of Netzob. First... relationships. Afterwards, we used the Automata module to generate state machines using different functions: "generateChainedStateAutomata"... The "generatePTAAutomata" function takes as input several communication sessions, then identifies common paths and merges these into a single automaton.
A fuzzy reinforcement learning approach to power control in wireless transmitters.
Vengerov, David; Bambos, Nicholas; Berenji, Hamid R
2005-08-01
We address the issue of power-controlled shared channel access in wireless networks supporting packetized data traffic. We formulate this problem using the dynamic programming framework and present a new distributed fuzzy reinforcement learning algorithm (ACFRL-2) capable of adequately solving a class of problems to which the power control problem belongs. Our experimental results show that the algorithm converges almost deterministically to a neighborhood of optimal parameter values, as opposed to a very noisy stochastic convergence of earlier algorithms. The main tradeoff facing a transmitter is to balance its current power level with future backlog in the presence of stochastically changing interference. Simulation experiments demonstrate that the ACFRL-2 algorithm achieves significant performance gains over the standard power control approach used in CDMA2000. Such a large improvement is explained by the fact that ACFRL-2 allows transmitters to learn implicit coordination policies, which back off under stressful channel conditions as opposed to engaging in escalating "power wars."
Optimal design of mixed-media packet-switching networks - Routing and capacity assignment
NASA Technical Reports Server (NTRS)
Huynh, D.; Kuo, F. F.; Kobayashi, H.
1977-01-01
This paper considers a mixed-media packet-switched computer communication network which consists of a low-delay terrestrial store-and-forward subnet combined with a low-cost high-bandwidth satellite subnet. We show how to route traffic via ground and/or satellite links by means of static, deterministic procedures and assign capacities to channels subject to a given linear cost such that the network average delay is minimized. Two operational schemes for this network model are investigated: one is a scheme in which the satellite channel is used as a slotted ALOHA channel; the other is a new multiaccess scheme we propose in which whenever a channel collision occurs, retransmission of the involved packets will route through ground links to their destinations. The performance of both schemes is evaluated and compared in terms of cost and average packet delay tradeoffs for some examples. The results offer guidelines for the design and optimal utilization of mixed-media networks.
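For context, minimizing average packet delay under a linear cost constraint leads to the classical square-root capacity assignment (Kleinrock); the expression below is that textbook result for traffic rates λ_i, per-unit channel costs d_i, and total budget D, not necessarily the authors' exact formulation:

```latex
% Minimize  T=\frac{1}{\gamma}\sum_i\frac{\lambda_i}{C_i-\lambda_i}
% subject to the linear cost constraint  \sum_i d_i C_i = D.
C_i \;=\; \lambda_i \;+\; \frac{D-\sum_j d_j\lambda_j}{d_i}\,
          \frac{\sqrt{d_i\lambda_i}}{\sum_j \sqrt{d_j\lambda_j}}
```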
Topology optimization under stochastic stiffness
NASA Astrophysics Data System (ADS)
Asadpoure, Alireza
Topology optimization is a systematic computational tool for optimizing the layout of materials within a domain for engineering design problems. It allows variation of structural boundaries and connectivities. This freedom in the design space often enables discovery of new, high performance designs. However, solutions obtained by performing the optimization in a deterministic setting may be impractical or suboptimal when considering real-world engineering conditions with inherent variabilities including (for example) variabilities in fabrication processes and operating conditions. The aim of this work is to provide a computational methodology for topology optimization in the presence of uncertainties associated with structural stiffness, such as uncertain material properties and/or structural geometry. Existing methods for topology optimization under deterministic conditions are first reviewed. Modifications are then proposed to improve the numerical performance of the so-called Heaviside Projection Method (HPM) in continuum domains. Next, two approaches, perturbation and Polynomial Chaos Expansion (PCE), are proposed to account for uncertainties in the optimization procedure. These approaches are intrusive, allowing tight and efficient coupling of the uncertainty quantification with the optimization sensitivity analysis. The work herein develops a robust topology optimization framework aimed at reducing the sensitivity of optimized solutions to uncertainties. The perturbation-based approach combines deterministic topology optimization with a perturbation method for the quantification of uncertainties. The use of perturbation transforms the problem of topology optimization under uncertainty to an augmented deterministic topology optimization problem. The PCE approach combines the spectral stochastic approach for the representation and propagation of uncertainties with an existing deterministic topology optimization technique. The resulting compact representations for the response quantities allow for efficient and accurate calculation of sensitivities of response statistics with respect to the design variables. The proposed methods are shown to be successful at generating robust optimal topologies. Examples from topology optimization in continuum and discrete domains (truss structures) under uncertainty are presented. It is also shown that proposed methods lead to significant computational savings when compared to Monte Carlo-based optimization which involve multiple formations and inversions of the global stiffness matrix and that results obtained from the proposed method are in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
Evolutionary Concepts for Decentralized Air Traffic Flow Management
NASA Technical Reports Server (NTRS)
Adams, Milton; Kolitz, Stephan; Milner, Joseph; Odoni, Amedeo
1997-01-01
Alternative concepts for modifying the policies and procedures under which the air traffic flow management system operates are described, and an approach to the evaluation of those concepts is discussed. Here, air traffic flow management includes all activities related to the management of the flow of aircraft and related system resources from 'block to block.' The alternative concepts represent stages in the evolution from the current system, in which air traffic management decision making is largely centralized within the FAA, to a more decentralized approach wherein the airlines and other airspace users collaborate in air traffic management decision making with the FAA. The emphasis in the discussion is on a viable medium-term partially decentralized scenario representing a phase of this evolution that is consistent with the decision-making approaches embodied in proposed Free Flight concepts for air traffic management. System-level metrics for analyzing and evaluating the various alternatives are defined, and a simulation testbed developed to generate values for those metrics is described. The fundamental issue of modeling airline behavior in decentralized environments is also raised, and an example of such a model, which deals with the preservation of flight bank integrity in hub airports, is presented.
Multiple curved descending approaches and the air traffic control problem
NASA Technical Reports Server (NTRS)
Hart, S. G.; Mcpherson, D.; Kreifeldt, J.; Wemple, T. E.
1977-01-01
A terminal area air traffic control simulation was designed to study ways of accommodating increased air traffic density. The concepts that were investigated assumed the availability of the microwave landing system and data link and included: (1) multiple curved descending final approaches; (2) parallel runways certified for independent and simultaneous operation under IFR conditions; (3) closer spacing between successive aircraft; and (4) a distributed management system between the air and ground. Three groups, each consisting of three pilots and two air traffic controllers, flew a combined total of 350 approaches. Piloted simulators were supplied with computer-generated traffic situation displays and flight instruments. The controllers were supplied with a terminal area map and digital status information. Pilots and controllers also reported that the distributed management procedure was somewhat safer and more orderly than the centralized management procedure. Flying precision increased as the amount of turn required to intersect the outer marker decreased. Pilots reported that they preferred the alternative of multiple curved descending approaches with wider spacing between aircraft to closer spacing on single, straight-in finals, while controllers preferred the latter option. Both pilots and controllers felt that parallel runways are an acceptable way to accommodate increased traffic density safely and expeditiously.
Web application and database modeling of traffic impact analysis using Google Maps
NASA Astrophysics Data System (ADS)
Yulianto, Budi; Setiono
2017-06-01
Traffic impact analysis (TIA) is a traffic study that aims at identifying the impact of traffic generated by a development or a change in land use. In addition to identifying the traffic impact, TIA is also equipped with mitigation measures to minimize the arising traffic impact. TIA has become increasingly important since it was defined in the act as one of the requirements in the application for a Building Permit. The act has encouraged a number of TIA studies in various cities in Indonesia, including Surakarta. For that reason, it is necessary to study the development of TIA by adopting the concept of Transportation Impact Control (TIC) in the implementation of the TIA standard document and multimodal modelling. This includes TIA standardization for technical guidelines, databases, and inspection by providing TIA checklists, monitoring, and evaluation. The research was undertaken by collecting historical junction data, modelling the data as a relational database, and building a web-based user interface, using the Google Maps libraries, for CRUD (Create, Read, Update and Delete) operations on the TIA data. The result is a system that provides information supporting the improvement and revision of existing TIA documents in a more transparent, reliable and credible way.
NCC Simulation Model: Simulating the operations of the network control center, phase 2
NASA Technical Reports Server (NTRS)
Benjamin, Norman M.; Paul, Arthur S.; Gill, Tepper L.
1992-01-01
The simulation of the network control center (NCC) is in the second phase of development. This phase seeks to further develop the work performed in phase one. Phase one concentrated on the computer systems and interconnecting network. The focus of phase two will be the implementation of the network message dialogues and the resources controlled by the NCC. These resources are requested, initiated, monitored and analyzed via network messages. In the NCC, network messages are presented in the form of packets that are routed across the network. These packets are generated, encoded, decoded and processed by the network host processors that generate and service the message traffic on the network that connects these hosts. As a result, the message traffic is used to characterize the work done by the NCC and the connected network. Phase one of the model development represented the NCC as a network of bi-directional single-server queues and message-generating sources. The generators represented the external segment processors. The server-based queues represented the host processors. The NCC model consists of the internal and external processors which generate message traffic on the network that links these hosts. To fully realize the objective of phase two, it is necessary to identify and model the processes in each internal processor. These processes live in the operating system of the internal host computers and handle tasks such as high-speed message exchanging, ISN and NFE interface, event monitoring, network monitoring, and message logging. Inter-process communication is achieved through the operating system facilities. The overall performance of the host is determined by its ability to service messages generated by both internal and external processors.
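A minimal analytic sketch in the spirit of the phase-one representation, treating each host as a single-server (M/M/1) queue fed by message generators; the rates and host names below are hypothetical:

```python
# Each host processes a Poisson stream of messages; standard M/M/1 formulas give
# utilization, mean residence time and mean backlog per host.
arrival_rate = {"ISN": 40.0, "NFE": 25.0, "monitor": 10.0}   # messages/s per host (assumed)
service_rate = 80.0                                          # messages/s a host can process (assumed)

for host, lam in arrival_rate.items():
    rho = lam / service_rate                 # utilization
    if rho >= 1.0:
        print(f"{host}: unstable (rho = {rho:.2f})")
        continue
    residence = 1.0 / (service_rate - lam)   # mean time in system, seconds
    in_system = rho / (1.0 - rho)            # mean number of messages in system
    print(f"{host}: rho = {rho:.2f}, mean residence = {residence * 1e3:.1f} ms, "
          f"mean messages in system = {in_system:.2f}")
```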
A simplified analytic form for generation of axisymmetric plasma boundaries
Luce, Timothy C.
2017-02-23
An improved method has been formulated for generating analytic boundary shapes as input for axisymmetric MHD equilibria. This method uses the family of superellipses as the basis function, as previously introduced. The improvements are a simplified notation, reduction of the number of simultaneous nonlinear equations to be solved, and the realization that not all combinations of input parameters admit a solution to the nonlinear constraint equations. The method tests for the existence of a self-consistent solution and, when no solution exists, it uses a deterministic method to find a nearby solution. As a result, examples of generation of boundaries, including tests with an equilibrium solver, are given.
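A minimal sketch of the superellipse basis function itself (Luce's method additionally joins quadrant-wise segments and solves nonlinear smoothness constraints, which are omitted here); the shape parameters below are illustrative:

```python
import numpy as np

def superellipse_boundary(R0, Z0, a, b, n, npts=200):
    """Points on the superellipse |(R-R0)/a|^n + |(Z-Z0)/b|^n = 1."""
    theta = np.linspace(0.0, 2.0 * np.pi, npts)
    R = R0 + a * np.sign(np.cos(theta)) * np.abs(np.cos(theta)) ** (2.0 / n)
    Z = Z0 + b * np.sign(np.sin(theta)) * np.abs(np.sin(theta)) ** (2.0 / n)
    return R, Z

# Hypothetical tokamak-like boundary: major radius 1.7 m, minor radius 0.6 m,
# elongation 1.8, with squareness controlled by the exponent n.
R, Z = superellipse_boundary(R0=1.7, Z0=0.0, a=0.6, b=0.6 * 1.8, n=2.5)
print("first three boundary points:",
      list(zip(np.round(R[:3], 3), np.round(Z[:3], 3))))
```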
Bird's-eye view on noise-based logic.
Kish, Laszlo B; Granqvist, Claes G; Horvath, Tamas; Klappenecker, Andreas; Wen, He; Bezrukov, Sergey M
2014-01-01
Noise-based logic is a practically deterministic logic scheme inspired by the randomness of neural spikes and uses a system of uncorrelated stochastic processes and their superposition to represent the logic state. We briefly discuss various questions such as (i) What does practical determinism mean? (ii) Is noise-based logic a Turing machine? (iii) Is there hope to beat (the dreams of) quantum computation by a classical physical noise-based processor, and what are the minimum hardware requirements for that? Finally, (iv) we address the problem of random number generators and show that the common belief that quantum number generators are superior to classical (thermal) noise-based generators is nothing but a myth.
Sonification of network traffic flow for monitoring and situational awareness.
Debashi, Mohamed; Vickers, Paul
2018-01-01
Maintaining situational awareness of what is happening within a computer network is challenging, not only because the behaviour happens within machines, but also because data traffic speeds and volumes are beyond human ability to process. Visualisation techniques are widely used to present information about network traffic dynamics. Although they provide operators with an overall view and specific information about particular traffic or attacks on the network, they often still fail to represent the events in an understandable way. Also, because they require visual attention they are not well suited to continuous monitoring scenarios in which network administrators must carry out other tasks. Here we present SoNSTAR (Sonification of Networks for SiTuational AwaReness), a real-time sonification system for monitoring computer networks to support network administrators' situational awareness. SoNSTAR provides an auditory representation of all the TCP/IP traffic within a network based on the different traffic flows between network hosts. A user study showed that SoNSTAR raises situational awareness levels by enabling operators to understand network behaviour and with the benefit of lower workload demands (as measured by the NASA TLX method) than visual techniques. SoNSTAR identifies network traffic features by inspecting the status flags of TCP/IP packet headers. Combinations of these features define particular traffic events which are mapped to recorded sounds to generate a soundscape that represents the real-time status of the network traffic environment. The sequence, timing, and loudness of the different sounds allow the network to be monitored and anomalous behaviour to be detected without the need to continuously watch a monitor screen.
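A minimal sketch of the flag-combination-to-sound mapping idea; the flag sets and sound file names below are illustrative, not SoNSTAR's actual soundscape:

```python
# Map combinations of TCP header flags to recorded sounds; a real system would
# read packets from a capture interface and schedule audio playback.
FLAG_SOUNDS = {
    frozenset({"SYN"}):        "water_drip.wav",   # new connection attempt
    frozenset({"SYN", "ACK"}): "soft_chime.wav",   # handshake reply
    frozenset({"FIN", "ACK"}): "door_close.wav",   # orderly teardown
    frozenset({"RST"}):        "wood_knock.wav",   # abrupt reset
}

def sound_for_packet(flags):
    """Return the sound file for a set of TCP flags, or None if unmapped."""
    return FLAG_SOUNDS.get(frozenset(flags))

packet_stream = [                                  # hypothetical packet events
    {"t": 0.00, "flags": {"SYN"}},
    {"t": 0.02, "flags": {"SYN", "ACK"}},
    {"t": 0.35, "flags": {"RST"}},
]
for pkt in packet_stream:
    sound = sound_for_packet(pkt["flags"])
    if sound:
        print(f'{pkt["t"]:.2f}s  play {sound}')
```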
Fast and optimized methodology to generate road traffic emission inventories and their uncertainties
NASA Astrophysics Data System (ADS)
Blond, N.; Ho, B. Q.; Clappier, A.
2012-04-01
Road traffic emissions are one of the main sources of air pollution in cities. They are also the main sources of uncertainty in the air quality numerical models used to forecast and define abatement strategies. Until now, the available models for generating road traffic emissions have required substantial effort, money and time. This inhibits decisions to preserve air quality, especially in developing countries where road traffic emissions are changing very fast. In this research, we developed a new model designed to produce road traffic emission inventories quickly. This model, called EMISENS, combines the well-known top-down and bottom-up approaches to force them to be coherent. A Monte Carlo methodology is included for computing emission uncertainties and the uncertainty contribution of each input parameter. This paper presents the EMISENS model and a demonstration of its capabilities through an application over the Strasbourg region (Alsace), France. The same input data as collected for the Circul'air model (a bottom-up approach), which has been applied for many years to forecast and study air pollution by the Alsatian air quality agency ASPA, are used to evaluate the impact of several simplifications that a user could make. These experiments give the possibility to review older methodologies and to evaluate EMISENS results when few input data are available to produce emission inventories, as in developing countries, where assumptions need to be made. We show that the same average fraction of mileage driven with a cold engine can be used for all the cells of the study domain and that one emission factor could replace both cold and hot emission factors.
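A minimal sketch of the Monte Carlo uncertainty propagation idea applied to a toy emission total; the fleet composition, mileages and emission factors below are placeholders, not the EMISENS or Strasbourg inputs:

```python
import numpy as np

# Propagate emission-factor uncertainty through E = sum_v (vehicles * km * EF).
rng = np.random.default_rng(5)
n_draws = 10_000

fleet = {
    #              vehicles, mean km/day, mean g NOx/km, relative EF uncertainty
    "passenger":  (150_000,        25.0,          0.4,   0.30),
    "light_duty":  (20_000,        60.0,          0.9,   0.40),
    "heavy_duty":   (5_000,       120.0,          4.5,   0.50),
}

totals = np.zeros(n_draws)
for n_veh, km, ef_mean, ef_cv in fleet.values():
    ef = rng.normal(ef_mean, ef_cv * ef_mean, size=n_draws).clip(min=0.0)
    totals += n_veh * km * ef / 1e6          # tonnes NOx per day

mean = totals.mean()
lo, hi = np.percentile(totals, [5, 95])
print(f"NOx: {mean:.1f} t/day (90% interval {lo:.1f}-{hi:.1f})")
```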
Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 1: User's guide
NASA Technical Reports Server (NTRS)
Dupnick, E.; Wiggins, D.
1980-01-01
An interactive computer program for automatically generating traffic models for the Space Transportation System (STS) is presented. Information concerning run stream construction, input data, and output data is provided. The flow of the interactive data stream is described. Error messages are specified, along with suggestions for remedial action. In addition, formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.
Mei, Haibo; Poslad, Stefan; Du, Shuang
2017-12-11
Intelligent Transportation Systems (ITSs) can be applied to inform and incentivize travellers to help them make cognizant choices concerning their trip routes and transport modality use for their daily travel, whilst achieving more sustainable societal and transport authority goals. However, in practice, it is challenging for an ITS to enable incentive generation that is context-driven and personalized whilst supporting multi-dimensional travel goals. This is because an ITS has to address the situation where different travellers have different travel preferences and constraints for route and modality, in the face of dynamically varying traffic conditions. Furthermore, personalized incentive generation also needs to dynamically achieve the different travel goals of multiple travellers, whose conduct is a mix of both competitive and cooperative behaviours. To address this challenge, a Rule-based Incentive Framework (RIF) is proposed in this paper that utilizes both decision trees and evolutionary game theory to process travel information and intelligently generate personalized incentives for travellers. The travel information processed includes travellers' mobility patterns, travellers' modality preferences and route traffic volume information. A series of MATLAB simulations of RIF was undertaken to validate it, showing that it is potentially an effective way to incentivize travellers to change travel routes and modalities as an essential smart city service.
Structural Deterministic Safety Factors Selection Criteria and Verification
NASA Technical Reports Server (NTRS)
Verderaime, V.
1992-01-01
Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratios are rooted in the resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability showed that the deterministic method is not reliability sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
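As an aside on how a combined safety factor maps onto reliability, the standard first-order index for normally distributed resistive stress R and applied stress S is shown below (a textbook relation, not necessarily the report's exact derivation):

```latex
\beta \;=\; \frac{\mu_R-\mu_S}{\sqrt{\sigma_R^{2}+\sigma_S^{2}}}
      \;=\; \frac{F_S-1}{\sqrt{F_S^{2}V_R^{2}+V_S^{2}}},
\qquad F_S=\frac{\mu_R}{\mu_S},\quad
       V_R=\frac{\sigma_R}{\mu_R},\quad
       V_S=\frac{\sigma_S}{\mu_S}
```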
Water use, waste generation, and traffic counts at interstate rest areas in Louisiana.
DOT National Transportation Integrated Search
2003-06-30
Surprisingly, little current information for design purposes exists regarding water use and waste generation at interstate rest areas. The Waterways Experiment Station of the U.S. Army Corps of Engineers carried out the last major study in 1974. This...
The nutrient load from food waste generated onboard ships in the Baltic Sea.
Wilewska-Bien, Magda; Granhag, Lena; Andersson, Karin
2016-04-15
The combination of the sensitive characteristics of the Baltic Sea and the intense maritime traffic makes the marine environment vulnerable to anthropogenic influences. The theoretical scenario calculated in this study shows that the food waste generated annually onboard ships in traffic in the Baltic Sea contains about 182 tonnes of nitrogen and 34 tonnes of phosphorus. Today, all food waste generated onboard can be legally discharged into the marine environment at a distance of 12 NM from the nearest land. The annual load of nitrogen contained in the food waste corresponds to 52% of the load of nitrogen from ship-generated sewage. Future regulations for sewage discharge in the Baltic Sea will require significant reduction of the total nitrogen and phosphorus released. The contribution of nutrients from food waste compared to sewage will therefore be relatively larger in the future if food waste can still be legally discharged.
Cost-effectiveness of traffic enforcement: case study from Uganda.
Bishai, D; Asiimwe, B; Abbas, S; Hyder, A A; Bazeyo, W
2008-08-01
In October 2004, the Ugandan Police department deployed enhanced traffic safety patrols on the four major roads to the capital, Kampala. The aim was to assess the costs and potential effectiveness of increasing traffic enforcement in Uganda. Record review and key informant interviews were conducted at 10 police stations along the highways that were patrolled. Monthly data on traffic citations and casualties were reviewed for January 2001 to December 2005; time series (ARIMA) regression was used to assess for a statistically significant change in traffic deaths. Costs were computed from the perspective of the police department in 2005 US dollars. Cost offsets from savings to the health sector were not included. The annual cost of deploying the four squads of traffic patrols (20 officers, four vehicles, equipment, administration) is estimated at $72,000. Since deployment, the number of citations has increased substantially, with a value of $327,311 annually. Monthly crash data pre- and post-intervention show a statistically significant 17% drop in road deaths after the intervention. The average cost-effectiveness of better road safety enforcement in Uganda is $603 per death averted or $27 per life year saved, discounted at 3% (equivalent to 9% of Uganda's $300 GDP per capita). The costs of traffic safety enforcement are low in comparison to the potential number of lives saved and revenue generated. Increasing enforcement of existing traffic safety norms can prove to be an extremely cost-effective public health intervention in low-income countries, even from a government perspective.
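Back-of-the-envelope quantities implied by the reported figures (the divisions below are inferences for illustration, not numbers given by the authors):

```python
# Quantities implied by the reported cost-effectiveness ratios.
annual_cost        = 72_000   # USD per year for the four patrol squads
cost_per_death     = 603      # USD per death averted
cost_per_life_year = 27       # USD per discounted life year (3% rate)

deaths_averted_per_year = annual_cost / cost_per_death          # ~119 deaths/year
life_years_per_death    = cost_per_death / cost_per_life_year   # ~22 discounted years

print(f"implied deaths averted per year: {deaths_averted_per_year:.0f}")
print(f"implied discounted life years per death averted: {life_years_per_death:.1f}")
```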
Encryption key distribution via chaos synchronization
NASA Astrophysics Data System (ADS)
Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; van der Sande, Guy
2017-02-01
We present a novel encryption scheme, wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on either photonic, optoelectronic or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions as defined by the National Institute of Standards test suite. We demonstrate the feasibility of our concept on an electronic delay oscillator circuit and test the robustness against attacks using a state-of-the-art system identification method.
Bell-state generation on remote superconducting qubits with dark photons
NASA Astrophysics Data System (ADS)
Hua, Ming; Tao, Ming-Jie; Alsaedi, Ahmed; Hayat, Tasawar; Wei, Hai-Rui; Deng, Fu-Guo
2018-06-01
We present a scheme to generate the Bell state deterministically on remote transmon qubits coupled to different 1D superconducting resonators connected by a long superconducting transmission line. Using the coherent evolution of the entire system in the all-resonance regime, the transmission line need not be populated with microwave photons, which makes the scheme robust against loss in the long transmission line. This makes the scheme more applicable to distributed quantum computing on superconducting quantum circuits. Besides, the influence of the small anharmonicity of the energy levels of the transmon qubits can be safely ignored.
NASA Technical Reports Server (NTRS)
Lohr, Gary W.; Williams, Daniel M.; Trujillo, Anna C.; Johnson, Edward J.; Domino, David A.
2008-01-01
A concept focusing on wind-dependent departure operations has been developed; the current version of this concept is called Wake Turbulence Mitigation for Departures (WTMD). The concept takes advantage of the fact that crosswinds of sufficient velocity blow wakes generated by "heavy" and B757 category aircraft departing the downwind runway away from the upwind runway. Supervisory air traffic controllers would be responsible for authorization of the procedure. An investigation of the information requirements necessary for supervisors to approve, monitor, and terminate the procedure was conducted. Results clearly indicated that the requisite information is currently available in air traffic control towers and that additional information was not required.
Runway Scheduling for Charlotte Douglas International Airport
NASA Technical Reports Server (NTRS)
Malik, Waqar A.; Lee, Hanbong; Jung, Yoon C.
2016-01-01
This paper describes the runway scheduler that was used in the 2014 SARDA human-in-the-loop simulations for CLT. The algorithm considers multiple runways and computes optimal runway times for departures and arrivals. In this paper, we plan to run additional simulations of the standalone MRS algorithm and compare its performance against a first-come-first-served (FCFS) heuristic in which aircraft are assigned runway slots according to their positions in the FCFS sequence. Several traffic scenarios corresponding to the current-day traffic level and demand profile will be generated. We also plan to examine the effect of increased traffic levels (1.2x and 1.5x) and observe trends in algorithm performance.
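A minimal sketch of the FCFS baseline mentioned above, assuming a single runway and a uniform separation requirement; the separation value and the flight data are hypothetical and the MRS optimizer itself is not reproduced here.

```python
from dataclasses import dataclass

# First-come-first-served (FCFS) runway slot assignment: aircraft are served in
# order of their earliest ready time, with a fixed minimum separation between
# consecutive runway operations. Separation and flight data are hypothetical.

@dataclass
class Flight:
    callsign: str
    ready_time: float   # earliest time (s) the flight can use the runway
    kind: str           # "DEP" or "ARR"

MIN_SEPARATION_S = 90.0  # assumed uniform runway separation requirement

def fcfs_schedule(flights):
    """Assign runway times in order of ready time, enforcing a fixed separation."""
    schedule = []
    last_time = float("-inf")
    for f in sorted(flights, key=lambda f: f.ready_time):
        t = max(f.ready_time, last_time + MIN_SEPARATION_S)
        schedule.append((f.callsign, t))
        last_time = t
    return schedule

demo = [Flight("AAL12", 0, "DEP"), Flight("UAL7", 30, "ARR"), Flight("DAL3", 40, "DEP")]
for callsign, t in fcfs_schedule(demo):
    print(f"{callsign}: runway time {t:.0f} s")
```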
An effective write policy for software coherence schemes
NASA Technical Reports Server (NTRS)
Chen, Yung-Chin; Veidenbaum, Alexander V.
1992-01-01
The authors study the write behavior and evaluate the performance of various write strategies and buffering techniques for a MIN-based multiprocessor system using the simple software coherence scheme. Hit ratios, memory latencies, total execution time, and total write traffic are used as the performance indices. The write-through write-allocate no-fetch cache using a write-back write buffer is shown to have a better performance than both write-through and write-back caches. This type of write buffer is effective in reducing the volume as well as bursts of write traffic. On average, the use of a write-back cache reduces by 60 percent the total write traffic generated by a write-through cache.
Automatic Data Traffic Control on DSM Architecture
NASA Technical Reports Server (NTRS)
Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)
2000-01-01
We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods that users can employ to improve data traffic in their codes. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.
ROSE: the road simulation environment
NASA Astrophysics Data System (ADS)
Liatsis, Panos; Mitronikas, Panogiotis
1997-05-01
Evaluation of advanced sensing systems for autonomous vehicle navigation (AVN) is currently carried out off-line with prerecorded image sequences taken by physically attaching the sensors to the ego-vehicle. The data collection process is cumbersome and costly, as well as highly restricted to specific road environments and weather conditions. This work proposes the use of scientific animation for modeling and representing real-world traffic scenes and aims to produce an efficient, reliable and cost-effective concept evaluation suite for AVN sensing algorithms. ROSE is organized in a modular fashion consisting of the route generator, the journey generator, the sequence description generator and the renderer. The application was developed in MATLAB and POV-Ray was selected as the rendering module. User-friendly graphical user interfaces have been designed to allow easy selection of animation parameters and monitoring of the generation process. The system, in its current form, allows the generation of various traffic scenarios, providing for an adequate number of static/dynamic objects, road types and environmental conditions. Initial tests on the robustness of various image processing algorithms to varying lighting and weather conditions have already been carried out.
NASA Astrophysics Data System (ADS)
Ispas, N.; Năstăsoiu, M.
2016-08-01
Reducing occupant injuries in cars involved in traffic accidents is a main target of today's car designers. Known as active or passive safety, many technological solutions have been developed over time to improve occupant safety. In the real world, traffic accidents often involve cars of different generations embodying different historical safety solutions. The main aim of this paper is to quantify the influence on driver chest loads when cars of the same or of different generations are involved in side crashes. Both same-generation and different-generation cars were used for the study. Another goal of the paper was to study the time evolution of the loads on the drivers' chests in both cars involved in each crash. The paper's experimental results were obtained with the support of DSD, Dr. Steffan Datentechnik GmbH - Linz, Austria. The described tests were performed in the full test facility of DSD Linz during the "Easter 2015 PC-Crash Seminar". In all crashes, results were obtained from dummies placed in both the impacting and the impacted car. The novelty of the paper is the comparison of the data sets from the drivers (dummies) of the two cars involved in each of the six experimental crashes. Another novelty of this paper is the possibility to analyse the influence of historical structural solutions on deformation and loads in traffic accidents. The paper's conclusions can be used in the future for car passive safety improvement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Latanision, R.M.
1990-12-01
Electrochemical corrosion is pervasive in virtually all engineering systems and in virtually all industrial circumstances. Although engineers now understand how to design systems to minimize corrosion in many instances, many fundamental questions remain poorly understood and, therefore, the development of corrosion control strategies is based more on empiricism than on a deep understanding of the processes by which metals corrode in electrolytes. Fluctuations in potential, or current, in electrochemical systems have been observed for many years. To date, all investigations of this phenomenon have utilized non-deterministic analyses. In this work it is proposed to study electrochemical noise from a deterministic viewpoint by comparison of experimental parameters, such as first and second order moments (non-deterministic), with computer simulation of corrosion at metal surfaces. In this way it is proposed to analyze the origins of these fluctuations and to elucidate the relationship between these fluctuations and kinetic parameters associated with metal dissolution and cathodic reduction reactions. This research program addresses in essence two areas of interest: (a) computer modeling of corrosion processes in order to study the electrochemical processes on an atomistic scale, and (b) experimental investigations of fluctuations in electrochemical systems and correlation of experimental results with computer modeling. In effect, the noise generated by mathematical modeling will be analyzed and compared to experimental noise in electrochemical systems. 1 fig.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bianchini, G.; Burgio, N.; Carta, M.
The GUINEVERE experiment (Generation of Uninterrupted Intense Neutrons at the lead Venus Reactor) is an experimental program in support of ADS technology presently carried out at SCK-CEN in Mol (Belgium). In the experiment a modified lay-out of the original thermal VENUS critical facility is coupled to an accelerator, built by the French body CNRS in Grenoble, working in both continuous and pulsed mode and delivering 14 MeV neutrons by bombardment of deuterons on a tritium target. The modified lay-out of the facility consists of a fast subcritical core made of 30% U-235 enriched metallic uranium in a lead matrix. Several off-line and on-line reactivity measurement techniques will be investigated during the experimental campaign. This report is focused on the simulation by deterministic (ERANOS French code) and Monte Carlo (MCNPX US code) calculations of three reactivity measurement techniques, Slope (α-fitting), Area-ratio and Source-jerk, applied to a GUINEVERE subcritical configuration (namely SC1). The inferred reactivity, in dollar units, by the Area-ratio method shows an overall agreement between the deterministic and Monte Carlo computational approaches, whereas the MCNPX Source-jerk results are affected by large uncertainties and allow only partial conclusions about the comparison. Finally, no particular spatial dependence of the results is observed in the case of the GUINEVERE SC1 subcritical configuration. (authors)
Impact of Periodic Unsteadiness on Performance and Heat Load in Axial Flow Turbomachines
NASA Technical Reports Server (NTRS)
Sharma, Om P.; Stetson, Gary M.; Daniels, William A.; Greitzer, Edward M.; Blair, Michael F.; Dring, Robert P.
1997-01-01
Results of an analytical and experimental investigation, directed at understanding the impact of periodic unsteadiness on the time-averaged flows in axial flow turbomachines, are presented. Analysis of available experimental data from a large-scale rotating rig (LSRR), a low-speed rig, shows that in the time-averaged axisymmetric equations the magnitude of the terms representing the effect of periodic unsteadiness (deterministic stresses) is as large as or larger than that of the terms due to random unsteadiness (turbulence). Numerical experiments, conducted to highlight the physical mechanisms associated with the migration of combustor-generated hot streaks in turbine rotors, indicated that the effect can be simulated by accounting for deterministic-stress-like terms in the time-averaged mass and energy conservation equations. The experimental portion of this program shows that the aerodynamic loss for the second stator in a 1-1/2 stage turbine is influenced by the axial spacing between the second stator leading edge and the rotor trailing edge. However, the axial spacing has little impact on the heat transfer coefficient. These performance changes are believed to be associated with the change in deterministic stress at the inlet to the second stator. Data were also acquired to quantify the impact of indexing the first stator relative to the second stator. For the range of parameters examined, this effect was found to be of the same order as the effect of axial spacing.
Chaos-order transition in foraging behavior of ants.
Li, Lixiang; Peng, Haipeng; Kurths, Jürgen; Yang, Yixian; Schellnhuber, Hans Joachim
2014-06-10
The study of the foraging behavior of group animals (especially ants) is of practical ecological importance, but it also contributes to the development of widely applicable optimization problem-solving techniques. Biologists have discovered that single ants exhibit low-dimensional deterministic-chaotic activities. However, the influences of the nest, ants' physical abilities, and ants' knowledge (or experience) on foraging behavior have received relatively little attention in studies of the collective behavior of ants. This paper provides new insights into basic mechanisms of effective foraging for social insects or group animals that have a home. We propose that the whole foraging process of ants is controlled by three successive strategies: hunting, homing, and path building. A mathematical model is developed to study this complex scheme. We show that the transition from chaotic to periodic regimes observed in our model results from an optimization scheme for group animals with a home. According to our investigation, the behavior of such insects is not represented by random but rather deterministic walks (as generated by deterministic dynamical systems, e.g., by maps) in a random environment: the animals use their intelligence and experience to guide them. The more knowledge an ant has, the higher its foraging efficiency is. When young insects join the collective to forage with old and middle-aged ants, it benefits the whole colony in the long run. The resulting strategy can even be optimal.
NASA Technical Reports Server (NTRS)
Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.
2004-01-01
A next generation reusable launch vehicle (RLV) will require thermally efficient and light-weight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature and pressure dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure were varied. The results from the structural sizing study show that using combined deterministic and reliability optimization techniques can obtain alternate and lighter designs than the designs obtained from deterministic optimization methods alone.
Godt, J.W.; Baum, R.L.; Savage, W.Z.; Salciarini, D.; Schulz, W.H.; Harp, E.L.
2008-01-01
Application of transient deterministic shallow landslide models over broad regions for hazard and susceptibility assessments requires information on rainfall, topography and the distribution and properties of hillside materials. We survey techniques for generating the spatial and temporal input data for such models and present an example using a transient deterministic model that combines an analytic solution to assess the pore-pressure response to rainfall infiltration with an infinite-slope stability calculation. Pore pressures and factors of safety are computed on a cell-by-cell basis and can be displayed or manipulated in a grid-based GIS. Input data are high-resolution (1.8 m) topographic information derived from LiDAR data and simple descriptions of initial pore-pressure distribution and boundary conditions for a study area north of Seattle, Washington. Rainfall information is taken from a previously defined empirical rainfall intensity-duration threshold, and material strength and hydraulic properties were measured both in the field and in the laboratory. Results are tested by comparison with a shallow landslide inventory. Comparison of results with those from static infinite-slope stability analyses assuming fixed water-table heights shows that the spatial prediction of shallow landslide susceptibility is improved using the transient analyses; moreover, results can be depicted in terms of the rainfall intensity and duration known to trigger shallow landslides in the study area.
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied on each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts and processing time. Two sets of regions-of-interests (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Social Noise: Generating Random Numbers from Twitter Streams
NASA Astrophysics Data System (ADS)
Fernández, Norberto; Quintas, Fernando; Sánchez, Luis; Arias, Jesús
2015-12-01
Due to the multiple applications of random numbers in computer systems (cryptography, online gambling, computer simulation, etc.), it is important to have mechanisms to generate these numbers. True Random Number Generators (TRNGs) are commonly used for this purpose. TRNGs rely on non-deterministic sources to generate randomness. Physical processes (such as noise in semiconductors, quantum phenomena, etc.) play this role in state-of-the-art TRNGs. In this paper, we depart from previous work and explore the possibility of defining social TRNGs using the stream of public messages of the microblogging service Twitter as a randomness source. Thus, we define two TRNGs based on Twitter stream information and evaluate them using the National Institute of Standards and Technology (NIST) statistical test suite. The results of the evaluation confirm the feasibility of the proposed approach.
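An illustrative sketch (not the paper's exact construction) of deriving bits from a public message stream: hash each message together with its timestamp, take the digest bits, and apply von Neumann debiasing. The output quality still depends on the entropy of the source, which is why such generators are evaluated with the NIST test suite; `message_stream` below is a hypothetical iterable of (timestamp, text) pairs.

```python
import hashlib
from itertools import islice

# Candidate random bits from a stream of public messages: hash each
# (timestamp, text) pair, emit the digest bits, then debias with the
# classic von Neumann procedure.

def raw_bits(message_stream):
    for timestamp, text in message_stream:
        digest = hashlib.sha256(f"{timestamp}|{text}".encode("utf-8")).digest()
        for byte in digest:
            for shift in range(8):
                yield (byte >> shift) & 1

def von_neumann(bits):
    """Debias a bit stream: 01 -> 0, 10 -> 1, discard 00 and 11."""
    it = iter(bits)
    for a, b in zip(it, it):
        if a != b:
            yield a

demo_stream = [(i, f"example public message {i}") for i in range(100)]  # stand-in data
key = list(islice(von_neumann(raw_bits(demo_stream)), 128))
print("".join(map(str, key)))
```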
DOE Office of Scientific and Technical Information (OSTI.GOV)
Da Cruz, D. F.; Rochman, D.; Koning, A. J.
2012-07-01
This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in ²³⁵U and ²³⁸U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO₂ fuel at 4.8% enrichment has been selected. The Total Monte Carlo method has been applied using the deterministic transport code DRAGON. This code allows the generation of the few-group nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study are from the JEFF3.1 evaluation, and the nuclear data files for ²³⁸U and ²³⁵U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all ²³⁸U and ²³⁵U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations
NASA Technical Reports Server (NTRS)
Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy
2011-01-01
This paper reviews three Next Generation Air Transportation System (NextGen) based high-fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En Route high-altitude environment utilizing high-altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits, including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.
Multi-resolution model-based traffic sign detection and tracking
NASA Astrophysics Data System (ADS)
Marinas, Javier; Salgado, Luis; Camplani, Massimo
2012-06-01
In this paper we propose an innovative approach to traffic sign detection using a computer vision algorithm under real-time operation constraints, establishing intelligent strategies to simplify the algorithm complexity as much as possible and to speed up the process. Firstly, a set of candidates is generated according to a color segmentation stage, followed by a region analysis strategy, where spatial characteristics of previously detected objects are taken into account. Finally, temporal coherence is introduced by means of a tracking scheme, performed using a Kalman filter for each potential candidate. Taking time constraints into consideration, efficiency is achieved in two ways: on the one side, a multi-resolution strategy is adopted for segmentation, where global operations are applied only to low-resolution images, increasing the resolution to the maximum only when a potential road sign is being tracked. On the other side, we take advantage of the expected spacing between traffic signs. Namely, the tracking of objects of interest allows the generation of inhibition areas, which are those where no new traffic signs are expected to appear due to the existence of a traffic sign in the neighborhood. The proposed solution has been tested with real sequences in both urban areas and highways, and proved to achieve higher computational efficiency, especially as a result of the multi-resolution approach.
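A minimal constant-velocity Kalman filter for a single tracked sign candidate in image coordinates, illustrating the per-candidate tracking stage described above; the state layout, noise covariances, and synthetic detections are assumptions, not the paper's tuned values.

```python
import numpy as np

# Constant-velocity Kalman filter for one tracked sign candidate.
# State is [x, y, vx, vy] in pixels; only position is measured.

dt = 1.0                                    # one frame between detections
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)   # measurement model (position only)
Q = 0.01 * np.eye(4)                        # process noise (assumed tuning)
R = 4.0 * np.eye(2)                         # measurement noise in pixels^2 (assumed)

x = np.array([100.0, 50.0, 0.0, 0.0])       # initial state from the first detection
P = 100.0 * np.eye(4)                       # initial uncertainty

def kf_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the new detection z = [u, v]
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

for z in ([102, 51], [104, 52], [106, 53]):   # synthetic detections drifting right
    x, P = kf_step(x, P, np.array(z, dtype=float))
    print("estimated position:", x[:2], "velocity:", x[2:])
```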
Sahu, Manoranjan; Hu, Shaohua; Ryan, Patrick H; Le Masters, Grace; Grinshpun, Sergey A; Chow, Judith C; Biswas, Pratim
2011-06-01
Exposure to traffic-related pollution during childhood has been associated with asthma exacerbation and asthma incidence. The objective of the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS) is to determine if the development of allergic and respiratory disease is associated with exposure to diesel engine exhaust particles. A detailed receptor model analysis was undertaken by applying positive matrix factorization (PMF) and UNMIX receptor models to two PM₂.₅ data sets: one consisting of two carbon fractions and the other of eight temperature-resolved carbon fractions. Based on the source profiles resolved from the analyses, markers of traffic-related air pollution were estimated: the elemental carbon attributed to traffic (ECAT) and the elemental carbon attributed to diesel vehicle emission (ECAD). Application of UNMIX to the two data sets generated four source factors: combustion-related sulfate, traffic, metal processing and soil/crustal. The PMF application generated six source factors from the two carbon fractions and seven factors from the eight temperature-resolved carbon fractions. The source factors (with source contribution estimates by mass concentration in parentheses) are: combustion sulfate (46.8%), vegetative burning (15.8%), secondary sulfate (12.9%), diesel vehicle emission (10.9%), metal processing (7.5%), gasoline vehicle emission (5.6%) and soil/crustal (0.7%). Diesel and gasoline vehicle emission sources were separated using eight temperature-resolved organic and elemental carbon fractions. Application of PMF to both data sets also differentiated the sulfate-rich source from the vegetative burning source, which are combined in a single factor by UNMIX modeling. Calculated ECAT and ECAD values at different locations indicated that traffic source impacts depend on factors such as traffic volumes, meteorological parameters, and the mode of vehicle operation apart from the proximity of the sites to highways. The difference between ECAT and ECAD, however, was less than one standard deviation. Thus, a cost-benefit consideration should be used when deciding on the benefits of an eight- or two-carbon approach. Published by Elsevier B.V.
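A simplified, hedged stand-in for the receptor modeling described above: ordinary non-negative matrix factorization (NMF) of a synthetic samples-by-species matrix into source contributions and source profiles. True PMF additionally weights each data point by its measurement uncertainty, which plain NMF does not; the data below are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

# Factor a non-negative samples-by-species concentration matrix X into
# G (per-sample source contributions) and F (source profiles), analogous in
# spirit to the PMF/UNMIX factor analysis described in the abstract.

rng = np.random.default_rng(0)
n_samples, n_species, n_sources = 200, 10, 4

true_profiles = rng.random((n_sources, n_species))                 # source chemical profiles
true_contrib = rng.gamma(shape=2.0, scale=1.0, size=(n_samples, n_sources))
X = true_contrib @ true_profiles + 0.01 * rng.random((n_samples, n_species))

model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000, random_state=0)
contributions = model.fit_transform(X)     # G: per-sample source contributions
profiles = model.components_               # F: source profiles (species signatures)

# Fractional mass attributed to each resolved factor, analogous to the
# percentage source contribution estimates reported in the abstract.
per_factor_mass = [(contributions[:, [k]] @ profiles[[k], :]).sum() for k in range(n_sources)]
shares = np.array(per_factor_mass) / np.sum(per_factor_mass)
print("factor mass shares:", np.round(shares, 3))
```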
NASA Technical Reports Server (NTRS)
Lohr, Gary W.; Williams, Daniel M.; Trujillo, Anna C.
2008-01-01
Closely Spaced Parallel Runway (CSPR) configurations are capacity limited for departures due to the requirement to apply wake vortex separation standards from traffic departing on the adjacent parallel runway. To mitigate the effects of this constraint, a concept focusing on wind-dependent departure operations has been developed, known as the Wake Turbulence Mitigation for Departures (WTMD). This concept takes advantage of the fact that crosswinds of sufficient velocity blow wakes generated by aircraft departing from the downwind runway away from the upwind runway. Consequently, under certain conditions, wake separations on the upwind runway would not be required based on wakes generated by aircraft on the downwind runway, as is currently the case. It follows that information requirements, and sources for this information, would need to be determined for airport traffic control tower (ATCT) supervisory personnel who would be charged with decisions regarding use of the procedure. To determine the information requirements, data were collected from ATCT supervisors and controller-in-charge qualified individuals at Lambert-St. Louis International Airport (STL) and George Bush Houston Intercontinental Airport (IAH). STL and IAH were chosen as data collection sites based on the implementation of a WTMD prototype system, operating in shadow mode, at these locations. The 17 total subjects (STL: 5, IAH: 12) represented a broad base of air traffic experience. Results indicated that the following information was required to support the conduct of WTMD operations: current and forecast weather information, current and forecast traffic demand and traffic flow restrictions, and WTMD system status information and alerting. Subjects further indicated that the requisite information is currently available in the tower cab with the exception of the WTMD status and alerting. Subjects were given a demonstration of a display supporting the prototype system and unanimously stated that the WTMD status information they felt important was represented. Overwhelmingly, subjects felt that approving, monitoring and terminating the WTMD procedure could be integrated into their supervisory workload.
23 CFR 661.57 - How is a list of deficient bridges to be generated?
Code of Federal Regulations, 2011 CFR
2011-04-01
... 23 Highways 1 2011-04-01 2011-04-01 false How is a list of deficient bridges to be generated? 661... AND TRAFFIC OPERATIONS INDIAN RESERVATION ROAD BRIDGE PROGRAM § 661.57 How is a list of deficient bridges to be generated? (a) In consultation with the BIA, a list of deficient BIA IRR bridges will be...
23 CFR 661.57 - How is a list of deficient bridges to be generated?
Code of Federal Regulations, 2010 CFR
2010-04-01
... 23 Highways 1 2010-04-01 2010-04-01 false How is a list of deficient bridges to be generated? 661... AND TRAFFIC OPERATIONS INDIAN RESERVATION ROAD BRIDGE PROGRAM § 661.57 How is a list of deficient bridges to be generated? (a) In consultation with the BIA, a list of deficient BIA IRR bridges will be...
NASA Astrophysics Data System (ADS)
Wang, Zhanyong; Lu, Feng; He, Hong-di; Lu, Qing-Chang; Wang, Dongsheng; Peng, Zhong-Ren
2015-03-01
At road intersections, vehicles frequently stop with idling engines during the red-light period and speed up rapidly in the green-light period, which generates higher velocity fluctuation and thus higher emission rates. Additionally, frequent changes of wind direction further add to the highly variable dispersion of pollutants at the street scale. It is, therefore, very difficult to estimate the distribution of pollutant concentrations using conventional deterministic causal models. For this reason, a hybrid model combining a wavelet neural network and a genetic algorithm (GA-WNN) is proposed for predicting 5-min series of carbon monoxide (CO) and fine particulate matter (PM2.5) concentrations in proximity to an intersection. The proposed model is examined using measured data under two situations. As the measured pollutant concentrations are found to depend on the distance to the intersection, the model is evaluated at three locations, i.e., 110 m, 330 m and 500 m from the intersection. Because pollutant concentrations vary differently over time, the model is also evaluated separately for peak and off-peak traffic periods. Additionally, the proposed model, together with a back-propagation neural network (BPNN), is examined with the measured data in these situations. The proposed model is found to perform better in predictability and precision for both CO and PM2.5 than the BPNN does, implying that the hybrid model can be an effective tool to improve the accuracy of estimating pollutant distribution patterns at intersections. These findings demonstrate the potential of the proposed model to forecast the distribution pattern of air pollution in real time in proximity to road intersections.
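For concreteness, a sketch of the kind of back-propagation baseline (BPNN) the abstract compares against, predicting the next 5-min CO value from lagged values; the data are synthetic and the hybrid GA-WNN itself (wavelet activations tuned by a genetic algorithm) is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

# Back-propagation baseline: predict the next 5-min CO concentration from the
# previous six lags of a synthetic series (288 intervals per day).

rng = np.random.default_rng(42)
t = np.arange(2000)
co = 1.5 + 0.5 * np.sin(2 * np.pi * t / 288) + 0.1 * rng.standard_normal(len(t))

lags = 6
X = np.column_stack([co[i:len(co) - lags + i] for i in range(lags)])  # lagged inputs
y = co[lags:]                                                         # next value

split = int(0.8 * len(y))
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("MAE on held-out 5-min CO series:", round(mean_absolute_error(y[split:], pred), 3))
```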
NASA Astrophysics Data System (ADS)
Parhad, Ashutosh
Intelligent transportation systems use in-pavement inductive loop sensors to collect real-time traffic data. This method is very expensive in terms of installation and maintenance. Our research is focused on developing advanced algorithms capable of generating high amounts of energy that can charge a battery. This electromechanical energy conversion is an optimal way of energy scavenging that makes use of piezoelectric sensors. The power generated is sufficient to run the vehicle detection module that has several sensors embedded together. To achieve these goals, we have developed a simulation module using software such as LabVIEW and Multisim. The simulation module recreates a practical scenario that takes into consideration vehicle weight, speed, wheel width and traffic frequency.
NASA Astrophysics Data System (ADS)
Moncoulon, D.; Labat, D.; Ardon, J.; Onfroy, T.; Leblois, E.; Poulard, C.; Aji, S.; Rémy, A.; Quantin, A.
2013-07-01
The analysis of flood exposure at a national scale for the French insurance market must combine the generation of a probabilistic event set of all possible, but not yet observed, flood situations with hazard and damage modeling. In this study, hazard and damage models are calibrated on a 1995-2012 historical event set, both for hazard results (river flow, flooded areas) and loss estimations. Thus, uncertainties in the deterministic estimation of a single event loss are known before simulating a probabilistic event set. To take into account at least 90% of the insured flood losses, the probabilistic event set must combine river overflow (small and large catchments) with the surface runoff due to heavy rainfall on the slopes of the watershed. Indeed, internal studies of the CCR claims database have shown that approximately 45% of the insured flood losses are located inside the floodplains and 45% outside; the remaining 10% are due to sea-surge floods and groundwater rise. In this approach, two independent probabilistic methods are combined to create a single flood loss distribution: generation of fictive river flows based on the historical records of the river gauge network, and generation of fictive rain fields on small catchments, calibrated on the 1958-2010 Météo-France rain database SAFRAN. All the events in the probabilistic event sets are simulated with the deterministic model. This hazard and damage distribution is used to simulate the flood losses at the national scale for an insurance company (MACIF) and to generate flood areas associated with hazard return periods. The flood maps cover river overflow and surface water runoff. Validation of these maps is conducted by comparison with address-located claim data on a small catchment (the downstream Argens).
A model of jam formation in congested traffic
NASA Astrophysics Data System (ADS)
Bunzarova, N. Zh; Pesheva, N. C.; Priezzhev, V. B.; Brankov, J. G.
2017-12-01
We study a model of irreversible jam formation in congested vehicular traffic on an open segment of a single-lane road. The vehicles obey a stochastic discrete-time dynamics which is a limiting case of the generalized Totally Asymmetric Simple Exclusion Process. Its characteristic features are: (a) the existing clusters of jammed cars cannot break into parts; (b) when the leading vehicle of a cluster hops to the right, the whole cluster follows it deterministically, and (c) any two clusters of vehicles, occupying consecutive positions on the chain, may become nearest neighbors and merge irreversibly into a single cluster. The above dynamics was used in a one-dimensional model of irreversible aggregation by Bunzarova and Pesheva [Phys. Rev. E 95, 052105 (2017)]. The model has three stationary non-equilibrium phases, depending on the probabilities of injection (α), ejection (β), and hopping (p) of particles: a many-particle one, MP, a completely jammed phase CF, and a mixed MP+CF phase. An exact expression for the stationary probability P(1) of a completely jammed configuration in the mixed MP+CF phase is obtained. The gap distribution between neighboring clusters of jammed cars at large lengths L of the road is studied. Three regimes of evolution of the width of a single gap are found: (i) growing gaps with length of the order O(L) when β > p; (ii) shrinking gaps with length of the order O(1) when β < p; and (iii) critical gaps at β = p, of the order O(L^(1/2)). These results are supported by extensive Monte Carlo calculations.
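A simplified Monte Carlo sketch of dynamics in the spirit of rules (a)-(c): clusters hop as a whole with probability p, never split in the bulk, and merge irreversibly when they touch. The treatment of the open boundaries here is a simplification and the parameter values are illustrative, so this is not a faithful reproduction of the exact model analyzed in the abstract.

```python
import numpy as np

# Open chain of L sites. Clusters (maximal runs of cars) move as a whole with
# probability p into an empty site; at the right boundary the head exits with
# probability beta and the rest of its cluster follows; a car is injected at the
# left boundary with probability alpha. Adjacent clusters merge irreversibly.

def clusters(occ):
    """Return [(start, end)] index pairs of maximal runs of occupied sites."""
    runs, start = [], None
    for i, s in enumerate(occ):
        if s and start is None:
            start = i
        if not s and start is not None:
            runs.append((start, i - 1)); start = None
    if start is not None:
        runs.append((start, len(occ) - 1))
    return runs

def step(occ, alpha, beta, p, rng):
    L = len(occ)
    for start, end in reversed(clusters(occ)):        # rightmost cluster first
        if end == L - 1:                              # head at the open right boundary
            if rng.random() < beta:                   # head exits; the rest follows it
                occ[start] = False
        elif not occ[end + 1] and rng.random() < p:   # whole cluster hops right
            occ[end + 1], occ[start] = True, False
    if not occ[0] and rng.random() < alpha:           # injection at the left boundary
        occ[0] = True
    return occ

rng = np.random.default_rng(0)
L, alpha, beta, p = 200, 0.4, 0.6, 0.5                # illustrative parameters
occ = np.zeros(L, dtype=bool)
for _ in range(20000):
    occ = step(occ, alpha, beta, p, rng)
print("stationary density estimate:", occ.mean(), "| number of clusters:", len(clusters(occ)))
```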
Frankel, A.
2009-01-01
Broadband (0.1-20 Hz) synthetic seismograms for finite-fault sources were produced for a model where stress drop is constant with seismic moment to see if they can match the magnitude dependence and distance decay of response spectral amplitudes found in the Next Generation Attenuation (NGA) relations recently developed from strong-motion data of crustal earthquakes in tectonically active regions. The broadband synthetics were constructed for earthquakes of M 5.5, 6.5, and 7.5 by combining deterministic synthetics for plane-layered models at low frequencies with stochastic synthetics at high frequencies. The stochastic portion used a source model where the Brune stress drop of 100 bars is constant with seismic moment. The deterministic synthetics were calculated using an average slip velocity, and hence, dynamic stress drop, on the fault that is uniform with magnitude. One novel aspect of this procedure is that the transition frequency between the deterministic and stochastic portions varied with magnitude, so that the transition frequency is inversely related to the rise time of slip on the fault. The spectral accelerations at 0.2, 1.0, and 3.0 sec periods from the synthetics generally agreed with those from the set of NGA relations for M 5.5-7.5 for distances of 2-100 km. At distances of 100-200 km some of the NGA relations for 0.2 sec spectral acceleration were substantially larger than the values of the synthetics for M 7.5 and M 6.5 earthquakes because these relations do not have a term accounting for Q. At 3 and 5 sec periods, the synthetics for M 7.5 earthquakes generally had larger spectral accelerations than the NGA relations, although there was large scatter in the results from the synthetics. The synthetics showed a sag in response spectra at close-in distances for M 5.5 between 0.3 and 0.7 sec that is not predicted from the NGA relations.
Impact of traffic-related air pollution on health.
Jakubiak-Lasocka, J; Lasocki, J; Siekmeier, R; Chłopek, Z
2015-01-01
Road transport contributes significantly to air quality problems through vehicle emissions, which have various detrimental impacts on public health and the environment. The aim of this study was to assess the impact of traffic-related air pollution on the health of Warsaw citizens, following the basics of the Health Impact Assessment (HIA) method, and to evaluate its social cost. PM10 was chosen as an indicator of traffic-related air pollution. Exposure-response functions between air pollution and health impacts were employed. The value of statistical life (VSL) approach was used for the estimation of the cost of mortality attributable to traffic-related air pollution. Costs of hospitalizations and restricted activity days were assessed based on the cost of illness (COI) method. According to the calculations, about 827 Warsaw citizens die each year as a result of traffic-related air pollution. Also, about 566 and 250 hospital admissions due to cardiovascular and respiratory diseases, respectively, and more than 128,453 restricted activity days can be attributed to traffic emissions. From the social perspective, these losses generate a cost of 1,604 million PLN (1 EUR ≈ 4.2 PLN). This cost is very high and, therefore, more attention should be paid to integrated environmental health policy.
Traffic flow behavior at un-signalized intersection with crossings pedestrians
NASA Astrophysics Data System (ADS)
Khallouk, A.; Echab, H.; Ez-Zahraouy, H.; Lakouari, N.
2018-02-01
Mixed traffic flows composed of crossing pedestrians and vehicles are widespread in cities. To study the characteristics of this interfering traffic flow, we develop a pedestrian-vehicle cellular automata model to represent the interaction behaviors at a simple crossroad. By varying the fundamental parameters (i.e., the injection rates α1, α2, the extraction rate β and the pedestrian arrival rate αP), simulations are carried out. The vehicular traffic flux is calculated in terms of these rates. The effect of the crosswalk can be regarded as a dynamic impurity. The system phase diagrams in the (α1, αP) plane are built. It is found that the phase diagrams consist essentially of four phases, namely Free Flow, Congested, Maximal Current and Gridlock. The value of the Maximal Current phase depends on the extraction rate β, while the Gridlock phase is reached only when the pedestrian generation rate is higher than a critical value. Furthermore, the effects of vehicles changing lane (Pch1, Pch2) and of the location of the crosswalk XP on the dynamic characteristics of vehicle flow are investigated. It is found that the traffic situation in the system is slightly improved if the crosswalk location XP is far from the intersection. However, when Pch1 and Pch2 increase, the traffic becomes congested and the Gridlock phase enlarges.
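A one-lane toy reduction of the situation described above, treating the crosswalk as a cell that is blocked for a fixed crossing time whenever a pedestrian arrives; the full model couples two roads, lane changing, and an extraction rate at the intersection, none of which are reproduced here, and all parameter values are illustrative.

```python
import numpy as np

# Single-lane cellular automaton with an open entrance (rate alpha), an open exit
# (rate beta), and a crosswalk cell xp that pedestrians (arrival rate alpha_p)
# block for a fixed crossing time. The crosswalk acts as a dynamic impurity.

def run(alpha, alpha_p, beta, L=100, xp=50, crossing_time=3, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    road = np.zeros(L, dtype=bool)
    block_timer = 0          # remaining steps the crosswalk is occupied by pedestrians
    passed = 0               # vehicles that left the segment (for the flux estimate)
    for _ in range(steps):
        if block_timer == 0 and rng.random() < alpha_p:
            block_timer = crossing_time
        # exit
        if road[L - 1] and rng.random() < beta:
            road[L - 1] = False
            passed += 1
        # bulk: deterministic forward hop into empty cells, respecting the crosswalk
        for i in range(L - 2, -1, -1):
            target_blocked = (i + 1 == xp and block_timer > 0)
            if road[i] and not road[i + 1] and not target_blocked:
                road[i + 1], road[i] = True, False
        # entrance
        if not road[0] and rng.random() < alpha:
            road[0] = True
        if block_timer > 0:
            block_timer -= 1
    return passed / steps

for ap in (0.0, 0.2, 0.5):
    print(f"pedestrian rate {ap}: vehicle flux ~ {run(alpha=0.5, alpha_p=ap, beta=0.8):.3f}")
```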
Traffic-aware energy saving scheme with modularization supporting in TWDM-PON
NASA Astrophysics Data System (ADS)
Xiong, Yu; Sun, Peng; Liu, Chuanbo; Guan, Jianjun
2017-01-01
Time and wavelength division multiplexed passive optical network (TWDM-PON) is considered to be a primary solution for next-generation passive optical network stage 2 (NG-PON2). Due to the multi-wavelength transmission feature of TWDM-PON, some of the transmitters/receivers at the optical line terminal (OLT) can be shut down to reduce energy consumption. Therefore, a novel traffic-aware energy saving scheme with modularization supporting is proposed. By establishing a modular energy consumption model of the OLT, the wavelength transmitters/receivers at the OLT can be switched on or shut down adaptively based on the sensed network traffic load, so that the energy consumption of the OLT is effectively reduced. Furthermore, by exploiting optical network unit (ONU) modularization, each module of an ONU can be switched to sleep or active mode independently in order to reduce the ONU's energy consumption. At the same time, the polling sequence of the ONUs can be changed dynamically based on the sensed packet arrival times. In order to guarantee the delay performance of network traffic, a sub-cycle division strategy is designed to transmit real-time traffic preferentially. Finally, simulation results verify that the proposed scheme is able to reduce the energy consumption of the network while maintaining traffic delay performance.
Masek, Pavel; Masek, Jan; Frantik, Petr; Fujdiak, Radek; Ometov, Aleksandr; Hosek, Jiri; Andreev, Sergey; Mlynek, Petr; Misurec, Jiri
2016-11-08
The unprecedented growth of today's cities together with increased population mobility are fueling the avalanche in the numbers of vehicles on the roads. This development led to the new challenges for the traffic management, including the mitigation of road congestion, accidents, and air pollution. Over the last decade, researchers have been focusing their efforts on leveraging the recent advances in sensing, communications, and dynamic adaptive technologies to prepare the deployed road traffic management systems (TMS) for resolving these important challenges in future smart cities. However, the existing solutions may still be insufficient to construct a reliable and secure TMS that is capable of handling the anticipated influx of the population and vehicles in urban areas. Along these lines, this work systematically outlines a perspective on a novel modular environment for traffic modeling, which allows to recreate the examined road networks in their full resemblance. Our developed solution is targeted to incorporate the progress in the Internet of Things (IoT) technologies, where low-power, embedded devices integrate as part of a next-generation TMS. To mimic the real traffic conditions, we recreated and evaluated a practical traffic scenario built after a complex road intersection within a large European city.
Deterministic quantum dense coding networks
NASA Astrophysics Data System (ADS)
Roy, Saptarshi; Chanda, Titas; Das, Tamoghna; Sen(De), Aditi; Sen, Ujjwal
2018-07-01
We consider the scenario of deterministic classical information transmission between multiple senders and a single receiver, when they a priori share a multipartite quantum state - an attempt towards building a deterministic dense coding network. Specifically, we prove that in the case of two or three senders and a single receiver, generalized Greenberger-Horne-Zeilinger (gGHZ) states are not beneficial for sending classical information deterministically beyond the classical limit, except when the shared state is the GHZ state itself. On the other hand, three- and four-qubit generalized W (gW) states with specific parameters as well as the four-qubit Dicke states can provide a quantum advantage for sending information in deterministic dense coding. Interestingly, however, numerical simulations in the three-qubit scenario reveal that the percentage of states from the GHZ class that allow deterministic dense coding is higher than that of states from the W class.
Preliminary Benefits Assessment of Traffic Aware Strategic Aircrew Requests (TASAR)
NASA Technical Reports Server (NTRS)
Henderson, Jeff; Idris, Husni; Wing, David J.
2012-01-01
While en route, aircrews submit trajectory change requests to air traffic control (ATC) to better meet their objectives, including reduced delays, reduced fuel burn, and passenger comfort. Aircrew requests are currently made with limited to no information on surrounding traffic. Consequently, these requests are uninformed about a key ATC objective, ensuring traffic separation, and are therefore less likely to be accepted than requests that are informed by surrounding traffic and avoid creating conflicts. This paper studies the benefits of providing aircrews with on-board decision support to generate optimized trajectory requests that are probed and cleared of known separation violations prior to issuing the request to ATC. These informed requests are referred to as Traffic Aware Strategic Aircrew Requests (TASAR) and leverage traffic surveillance information available through Automatic Dependent Surveillance-Broadcast (ADS-B) In capability. Preliminary fast-time simulation results show increased benefits with longer stage lengths, since beneficial trajectory changes can be applied over a longer distance. Also, larger benefits were experienced between large hub airports as compared to other airport sizes. On average, an aircraft equipped with TASAR reduced its travel time by about one to four minutes per operation and fuel burn by about 50 to 550 lbs per operation, depending on the objective of the aircrew (time, fuel, or a weighted combination of time and fuel), class of airspace user, and aircraft type. These preliminary results are based on analysis of approximately one week of traffic in July 2012, and additional analysis is planned on a larger data set to confirm these initial findings.
Concepts and algorithms for terminal-area traffic management
NASA Technical Reports Server (NTRS)
Erzberger, H.; Chapel, J. D.
1984-01-01
The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.
NASA Technical Reports Server (NTRS)
Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)
1995-01-01
A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedure both air and ground in response to advancing automation in both domains. Additional information is contained in the original extended abstract.
The Effect of Background Traffic Packet Size to VoIP Speech Quality
NASA Astrophysics Data System (ADS)
Triyason, Tuul; Kanthamanon, Prasert; Warasup, Kittipong; Yamsaengsung, Siam; Supattatham, Montri
VoIP is gaining acceptance in the corporate world, especially in small and medium-sized businesses that want to save costs to gain an advantage over their competitors. Good voice quality is one of the challenging tasks in a deployment plan because VoIP voice quality is affected by packet loss and jitter delay. In this paper, we study the effect of background traffic packet size on voice quality. The background traffic was generated by the Bricks software and the speech quality was assessed by MOS. The obtained results show an interesting relationship between voice quality and the number of TCP packets and their size. For the same amount of data, smaller packets affect voice quality more than larger packets.
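To make the loss-quality relationship concrete, a hedged sketch using the ITU-T E-model style R-factor to MOS conversion; the codec impairment parameters (Ie, Bpl) and the simplified handling of delay are assumed example values, not the measurement setup used in the paper.

```python
# Relate packet loss to estimated speech quality via a simplified E-model:
# a loss-dependent equipment impairment reduces the R-factor, which is then
# mapped to MOS by the standard E-model polynomial.

def r_factor(packet_loss_pct, ie=0.0, bpl=10.0, delay_impairment=0.0):
    """Simplified R: base 93.2 minus delay and loss-dependent equipment impairment."""
    ie_eff = ie + (95.0 - ie) * packet_loss_pct / (packet_loss_pct + bpl)
    return 93.2 - delay_impairment - ie_eff

def mos_from_r(r):
    """Standard E-model mapping from R-factor to estimated MOS."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1.0 + 0.035 * r + 7e-6 * r * (r - 60.0) * (100.0 - r)

for loss in (0.0, 1.0, 3.0, 5.0, 10.0):
    print(f"packet loss {loss:4.1f}% -> MOS ~ {mos_from_r(r_factor(loss)):.2f}")
```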
Intelsat TDMA and its implementation in Australia
NASA Astrophysics Data System (ADS)
Howe, Stuart
The developmental history and characteristics of the Intelsat digital communications network are surveyed, with a focus on the current implementation status in Australia. Topics addressed include the fundamental principles and advantages of a TDMA system, the hardware required for a TDMA traffic station, and the Intelsat Indian Ocean Primary satellite network. Detailed consideration is given to TDMA test equipment (burst-mode link analyzers, PCM DSI test sets, burst-power meters, burst negators, and reference burst generators), terminal integration with terrestrial and RF links (TWT sharing and equalization), equipment compatibility, and traffic. Plans call for 950 TDMA circuits by the end of 1990, representing about 36.5 percent of the traffic between Australia and Europe, Africa, India and Pakistan, and the Middle East.
Generation of an arbitrary concatenated Greenberger-Horne-Zeilinger state with single photons
NASA Astrophysics Data System (ADS)
Chen, Shan-Shan; Zhou, Lan; Sheng, Yu-Bo
2017-02-01
The concatenated Greenberger-Horne-Zeilinger (C-GHZ) state is a new kind of logic-qubit entangled state, which may have extensive applications in future quantum communication. In this letter, we propose a protocol for constructing an arbitrary C-GHZ state with single photons. We exploit the cross-Kerr nonlinearity for this purpose. This protocol has some advantages over previous protocols. First, it only requires two kinds of cross-Kerr nonlinearities to generate single phase shifts ±θ. Second, it is not necessary to use sophisticated m-photon Toffoli gates. Third, this protocol is deterministic and can be used to generate an arbitrary C-GHZ state. This protocol may be useful in future quantum information processing based on the C-GHZ state.
Minimalist design of a robust real-time quantum random number generator
NASA Astrophysics Data System (ADS)
Kravtsov, K. S.; Radchenko, I. V.; Kulik, S. P.; Molotkov, S. N.
2015-08-01
We present a simple and robust construction of a real-time quantum random number generator (QRNG). Our minimalist approach ensures stable operation of the device as well as its simple and straightforward hardware implementation as a stand-alone module. As a source of randomness the device uses measurements of time intervals between clicks of a single-photon detector. The obtained raw sequence is then filtered and processed by a deterministic randomness extractor, which is realized as a look-up table. This enables high speed on-the-fly processing without the need of extensive computations. The overall performance of the device is around 1 random bit per detector click, resulting in 1.2 Mbit/s generation rate in our implementation.
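A hedged sketch of one way to turn detector click times into unbiased bits: for i.i.d. exponentially distributed inter-click intervals, comparing consecutive pairs of digitized intervals gives unbiased output while discarding ties. The paper's actual extractor is a precomputed look-up table achieving about one bit per click; the synthetic click times below stand in for real detector data.

```python
import numpy as np

# Compare consecutive digitized inter-click intervals: a > b gives 1, a < b gives 0,
# ties are discarded. By symmetry of i.i.d. intervals the output is unbiased.

rng = np.random.default_rng(7)
clock_ns = 1.0                                               # timing resolution (assumed)
intervals_ns = rng.exponential(scale=500.0, size=100_000)    # synthetic inter-click times
ticks = np.floor(intervals_ns / clock_ns).astype(np.int64)   # digitized intervals

a, b = ticks[0::2], ticks[1::2]
mask = a != b                                                # discard ties
bits = (a[mask] > b[mask]).astype(np.uint8)

print("bits per detector click:", len(bits) / len(ticks))
print("bias (fraction of ones):", bits.mean())
```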
2013-03-01
[Fragmentary record] The surviving text describes an analysis of mid-term NextGen decision support tools (DSTs) for the airport traffic control tower (ATCT) cab, including sequencing and taxi routing with conformance monitoring, and their impact on tower cab operational activities and sub-activities down to the keystroke or interface level, on controller roles, and on the aptitudes required of controllers.
Stochastic Optimization for Unit Commitment-A Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Qipeng P.; Wang, Jianhui; Liu, Andrew L.
2015-07-01
Optimization models have been widely used in the power industry to aid the decision-making process of scheduling and dispatching electric power generation resources, a process known as unit commitment (UC). Since UC's birth, there have been two major waves of revolution in UC research and real-life practice. The first wave has made mixed integer programming stand out from the early solution and modeling approaches for deterministic UC, such as priority list, dynamic programming, and Lagrangian relaxation. With the high penetration of renewable energy, increasing deregulation of the electricity industry, and growing demands on system reliability, the next wave is focused on transitioning from traditional deterministic approaches to stochastic optimization for unit commitment. Since the literature has grown rapidly in the past several years, this paper reviews the works that have contributed to the modeling and computational aspects of stochastic optimization (SO) based UC. Relevant lines of future research are also discussed to help transform research advances into real-world applications.
Origins of Chaos in Autonomous Boolean Networks
NASA Astrophysics Data System (ADS)
Socolar, Joshua; Cavalcante, Hugo; Gauthier, Daniel; Zhang, Rui
2010-03-01
Networks with nodes consisting of ideal Boolean logic gates are known to display either steady states, periodic behavior, or an ultraviolet catastrophe where the number of logic-transition events circulating in the network per unit time grows as a power-law. In an experiment, non-ideal behavior of the logic gates prevents the ultraviolet catastrophe and may lead to deterministic chaos. We identify certain non-ideal features of real logic gates that enable chaos in experimental networks. We find that short-pulse rejection and the asymmetry between the logic states tend to engender periodic behavior. On the other hand, a memory effect termed ``degradation'' can generate chaos. Our results strongly suggest that deterministic chaos can be expected in a large class of experimental Boolean-like networks. Such devices may find application in a variety of technologies requiring fast complex waveforms or flat power spectra. The non-ideal effects identified here also have implications for the statistics of attractors in large complex networks.
Deterministic secure quantum communication using a single d-level system
Jiang, Dong; Chen, Yuanyuan; Gu, Xuemei; Xie, Ling; Chen, Lijun
2017-01-01
Deterministic secure quantum communication (DSQC) can transmit secret messages between two parties without first generating a shared secret key. Compared with quantum key distribution (QKD), DSQC avoids the waste of qubits arising from basis reconciliation and thus reaches higher efficiency. In this paper, based on data block transmission and order rearrangement technologies, we propose a DSQC protocol. It utilizes a set of single d-level systems as message carriers, which are used to directly encode the secret message in one communication process. Theoretical analysis shows that these employed technologies guarantee the security, and the use of a higher dimensional quantum system makes our protocol achieve higher security and efficiency. Since only quantum memory is required for implementation, our protocol is feasible with current technologies. Furthermore, Trojan horse attack (THA) is taken into account in our protocol. We give a THA model and show that THA significantly increases the multi-photon rate and can thus be detected. PMID:28327557
Hardware-efficient Bell state preparation using Quantum Zeno Dynamics in superconducting circuits
NASA Astrophysics Data System (ADS)
Flurin, Emmanuel; Blok, Machiel; Hacohen-Gourgy, Shay; Martin, Leigh S.; Livingston, William P.; Dove, Allison; Siddiqi, Irfan
By performing a continuous joint measurement on a two-qubit system, we restrict the qubit evolution to a chosen subspace of the total Hilbert space. This extension of the quantum Zeno effect, called Quantum Zeno Dynamics, has already been explored in various physical systems such as superconducting cavities, single Rydberg atoms, atomic ensembles and Bose-Einstein condensates. In this experiment, two superconducting qubits are strongly dispersively coupled to a high-Q cavity (χ >> κ), allowing the doubly excited state | 11 〉 to be selectively monitored. The Quantum Zeno Dynamics in the complementary subspace enables us to coherently prepare a Bell state. As opposed to dissipation engineering schemes, we emphasize that our protocol is deterministic, does not rely on direct coupling between qubits, and functions using only single-qubit controls and cavity readout. Such Quantum Zeno Dynamics can be generalized to larger Hilbert spaces, enabling deterministic generation of many-body entangled states, and thus realizes a decoherence-free subspace allowing alternative noise-protection schemes.
Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.
Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J
2016-02-01
It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
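The Bandt-Pompe symbolization and the forbidden-pattern count are straightforward to implement; the sketch below is a minimal version for a regularly sampled series. The embedding order D and the logistic-map test signal are choices made for this example, not the paper's settings.

```python
# Minimal sketch: Bandt-Pompe ordinal patterns of order D, and a count of the
# patterns that never occur ("forbidden patterns") in a given series.
import random
from itertools import permutations
from math import factorial

def ordinal_pattern(window):
    """Permutation that sorts the window (ties broken by position)."""
    return tuple(sorted(range(len(window)), key=lambda i: (window[i], i)))

def forbidden_pattern_count(series, D=4):
    seen = set()
    for i in range(len(series) - D + 1):
        seen.add(ordinal_pattern(series[i:i + D]))
    return factorial(D) - len(seen)

if __name__ == "__main__":
    # Deterministic (logistic map) vs. white-noise series of equal length.
    x, logistic = 0.4, []
    for _ in range(5000):
        x = 4.0 * x * (1.0 - x)
        logistic.append(x)
    noise = [random.random() for _ in range(5000)]
    print("logistic:", forbidden_pattern_count(logistic), "forbidden patterns")
    print("noise:   ", forbidden_pattern_count(noise), "forbidden patterns")
```

Run on a sufficiently long noise series the count tends to zero, while the deterministic series retains forbidden patterns, which is the signature exploited above.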
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involves development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
Traffic Dimensioning and Performance Modeling of 4G LTE Networks
ERIC Educational Resources Information Center
Ouyang, Ye
2011-01-01
Rapid changes in mobile techniques have always been evolutionary, and the deployment of 4G Long Term Evolution (LTE) networks will be the same. It will be another transition from Third Generation (3G) to Fourth Generation (4G) over a period of several years, as is the case still with the transition from Second Generation (2G) to 3G. As a result,…
Advances in Statistical and Deterministic Modeling of Wind-Driven Seas
2011-09-30
Related publications include: Zakharov, V., "Scales of nonlinear relaxation and balance of wind-driven seas," Geophysical Research Abstracts, Vol. 13, EGU2011-2042, EGU General Assembly, 2011; Dyachenko, A., "On canonical equation for water waves," EGU General Assembly 2011, Vienna, Austria, 3-8 April 2011; and a 1987 J. Geophys. Res. paper (92, 4971-5029) on scattering and equilibrium ranges in wind-generated waves with application to spectrometry.
Defense Horizons. Number 38, January 2004. Dirty Bombs: The Threat Revisited
2004-01-01
The excerpts address radioactive sources of concern, including plutonium-238 (238Pu), americium-241 (241Am), and californium-252 (252Cf); deterministic injuries among the types of radiation damage; "megasources" such as Russian radioisotope thermal generators (RTGs) and Gamma-Kolos seed irradiators; the most likely routes for terrorist acquisition; targets such as a facility or a business or residential district rather than open space; and more efficient RDDs relying on other means of dissemination.
Recirculating Air Filtration Significantly Reduces Exposure to Airborne Nanoparticles
Pui, David Y.H.; Qi, Chaolong; Stanley, Nick; Oberdörster, Günter; Maynard, Andrew
2008-01-01
Background Airborne nanoparticles from vehicle emissions have been associated with adverse effects in people with pulmonary and cardiovascular disease, and toxicologic studies have shown that nanoparticles can be more hazardous than their larger-scale counterparts. Recirculating air filtration in automobiles and houses may provide a low-cost solution to reducing exposures in many cases, thus reducing possible health risks. Objectives We investigated the effectiveness of recirculating air filtration on reducing exposure to incidental and intentionally produced airborne nanoparticles under two scenarios while driving in traffic, and while generating nanomaterials using gas-phase synthesis. Methods We tested the recirculating air filtration in two commercial vehicles when driving in traffic, as well as in a nonventilation room with a nanoparticle generator, simulating a nanomaterial production facility. We also measured the time-resolved aerosol size distribution during the in-car recirculation to investigate how recirculating air filtration affects particles of different sizes. We developed a recirculation model to describe the aerosol concentration change during recirculation. Results The use of inexpensive, low-efficiency filters in recirculation systems is shown to reduce nanoparticle concentrations to below levels found in a typical office within 3 min while driving through heavy traffic, and within 20 min in a simulated nanomaterial production facility. Conclusions Development and application of this technology could lead to significant reductions in airborne nanoparticle exposure, reducing possible risks to health and providing solutions for generating nanomaterials safely. PMID:18629306
Mei, Haibo; Poslad, Stefan; Du, Shuang
2017-01-01
Intelligent Transportation Systems (ITSs) can be applied to inform and incentivize travellers to help them make cognizant choices concerning their trip routes and transport modality use for their daily travel whilst achieving more sustainable societal and transport authority goals. However, in practice, it is challenging for an ITS to enable incentive generation that is context-driven and personalized, whilst supporting multi-dimensional travel goals. This is because an ITS has to address the situation where different travellers have different travel preferences and constraints for route and modality, in the face of dynamically-varying traffic conditions. Furthermore, personalized incentive generation also needs to dynamically achieve different travel goals from multiple travellers, in the face of their conduct being a mix of competitive and cooperative behaviours. To address this challenge, a Rule-based Incentive Framework (RIF) is proposed in this paper that utilizes both decision tree and evolutionary game theory to process travel information and intelligently generate personalized incentives for travellers. The travel information processed includes travellers’ mobile patterns, travellers’ modality preferences and route traffic volume information. A series of MATLAB simulations was undertaken to validate RIF and show that it is potentially an effective way to incentivize travellers to change travel routes and modalities as an essential smart city service. PMID:29232907
Li, Dongfang; Lu, Zhaojun; Zou, Xuecheng; Liu, Zhenglin
2015-01-01
Random number generators (RNG) play an important role in many sensor network systems and applications, such as those requiring secure and robust communications. In this paper, we develop a high-security and high-throughput hardware true random number generator, called PUFKEY, which consists of two kinds of physical unclonable function (PUF) elements. Combined with a conditioning algorithm, true random seeds are extracted from the noise on the start-up pattern of SRAM memories. These true random seeds contain full entropy. Then, the true random seeds are used as the input for a non-deterministic hardware RNG to generate a stream of true random bits with a throughput as high as 803 Mbps. The experimental results show that the bitstream generated by the proposed PUFKEY can pass all standard national institute of standards and technology (NIST) randomness tests and is resilient to a wide range of security attacks. PMID:26501283
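The conditioning algorithm itself is not specified in the abstract; the sketch below only illustrates the general notion of compressing a noisy SRAM start-up pattern into a fixed-length seed, using SHA-256 as a stand-in conditioning function and os.urandom as a stand-in for reading uninitialized SRAM. It is not the PUFKEY pipeline.

```python
# Hedged illustration: condition a raw, partially-random start-up pattern into
# a 256-bit seed. SHA-256 and the simulated SRAM read are assumptions here.
import hashlib
import os

def sram_startup_pattern(n_bytes=4096):
    """Stand-in for reading an uninitialized SRAM block on a real device."""
    return os.urandom(n_bytes)

def condition_seed(raw: bytes) -> bytes:
    """Compress the raw pattern into a fixed-length seed."""
    return hashlib.sha256(raw).digest()

if __name__ == "__main__":
    seed = condition_seed(sram_startup_pattern())
    print("seed:", seed.hex())
```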
FY16 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2016-09-30
The goal of the NEAMS neutronics effort is to develop a neutronics toolkit for use on sodium-cooled fast reactors (SFRs) which can be extended to other reactor types. The neutronics toolkit includes the high-fidelity deterministic neutron transport code PROTEUS and many supporting tools such as a cross section generation code MC2-3, a cross section library generation code, alternative cross section generation tools, mesh generation and conversion utilities, and an automated regression test tool. The FY16 effort for NEAMS neutronics focused on supporting the release of the SHARP toolkit and existing and new users, continuing to develop PROTEUS functions necessary for performance improvement as well as the SHARP release, verifying PROTEUS against available existing benchmark problems, and developing new benchmark problems as needed. The FY16 research effort was focused on further updates of PROTEUS-SN and PROTEUS-MOCEX and cross section generation capabilities as needed.
LINEBACkER: Bio-inspired Data Reduction Toward Real Time Network Traffic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teuton, Jeremy R.; Peterson, Elena S.; Nordwall, Douglas J.
One essential component of resilient cyber applications is the ability to detect adversaries and protect systems with the same flexibility adversaries will use to achieve their goals. Current detection techniques do not enable this degree of flexibility because most existing applications are built using exact or regular-expression matching to libraries of rule sets. Further, network traffic defies traditional cyber security approaches that focus on limiting access based on the use of passwords and examination of lists of installed or downloaded programs. These approaches do not readily apply to network traffic occurring beyond the access control point, and when the data in question are combined control and payload data of ever increasing speed and volume. Manual analysis of network traffic is not normally possible because of the magnitude of the data that is being exchanged and the length of time that this analysis takes. At the same time, using an exact matching scheme to identify malicious traffic in real time often fails because the lists against which such searches must operate grow too large. In this work, we introduce an alternative method for cyber network detection based on similarity-measuring algorithms for gene sequence analysis. These methods are ideal because they were designed to identify similar but nonidentical sequences. We demonstrate that our method is generally applicable to the problem of network traffic analysis by illustrating its use in two different areas both based on different attributes of network traffic. Our approach provides a logical framework for organizing large collections of network data, prioritizing traffic of interest to human analysts, and makes it possible to discover traffic signatures without the bias introduced by expert-directed signature generation. Pattern recognition on reduced representations of network traffic offers a fast, efficient, and more robust way to detect anomalies.
Analysis of the build-up of semi and non volatile organic compounds on urban roads.
Mahbub, Parvez; Ayoko, Godwin A; Goonetilleke, Ashantha; Egodawatta, Prasanna
2011-04-01
Vehicular traffic in urban areas may adversely affect urban water quality through the build-up of traffic generated semi and non volatile organic compounds (SVOCs and NVOCs) on road surfaces. The characterisation of the build-up processes is the key to developing mitigation measures for the removal of such pollutants from urban stormwater. An in-depth analysis of the build-up of SVOCs and NVOCs was undertaken in the Gold Coast region in Australia. Principal Component Analysis (PCA) and Multicriteria Decision tools such as PROMETHEE and GAIA were employed to understand the SVOC and NVOC build-up under combined traffic scenarios of low, moderate, and high traffic in different land uses. It was found that congestion in the commercial areas and use of lubricants and motor oils in the industrial areas were the main sources of SVOCs and NVOCs on urban roads, respectively. The contribution from residential areas to the build-up of such pollutants was hardly noticeable. It was also revealed through this investigation that the target SVOCs and NVOCs were mainly attached to particulate fractions of 75-300 μm whilst the redistribution of coarse fractions due to vehicle activity mainly occurred in the >300 μm size range. Lastly, under combined traffic scenario, moderate traffic with average daily traffic ranging from 2300 to 5900 and average congestion of 0.47 were found to dominate SVOC and NVOC build-up on roads. Copyright © 2011 Elsevier Ltd. All rights reserved.
Framework based on stochastic L-Systems for modeling IP traffic with multifractal behavior
NASA Astrophysics Data System (ADS)
Salvador, Paulo S.; Nogueira, Antonio; Valadas, Rui
2003-08-01
In a previous work we have introduced a multifractal traffic model based on so-called stochastic L-Systems, which were introduced by biologist A. Lindenmayer as a method to model plant growth. L-Systems are string rewriting techniques, characterized by an alphabet, an axiom (initial string) and a set of production rules. In this paper, we propose a novel traffic model, and an associated parameter fitting procedure, which describes jointly the packet arrival and the packet size processes. The packet arrival process is modeled through a L-System, where the alphabet elements are packet arrival rates. The packet size process is modeled through a set of discrete distributions (of packet sizes), one for each arrival rate. In this way the model is able to capture correlations between arrivals and sizes. We applied the model to measured traffic data: the well-known pOct Bellcore, a trace of aggregate WAN traffic and two traces of specific applications (Kazaa and Operation Flashing Point). We assess the multifractality of these traces using Linear Multiscale Diagrams. The suitability of the traffic model is evaluated by comparing the empirical and fitted probability mass and autocovariance functions; we also compare the packet loss ratio and average packet delay obtained with the measured traces and with traces generated from the fitted model. Our results show that our L-System based traffic model can achieve very good fitting performance in terms of first and second order statistics and queuing behavior.
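To make the modeling idea concrete, the sketch below shows a toy stochastic L-System whose alphabet symbols are packet arrival rates, each paired with a discrete packet-size distribution. The rules, rates and size distributions are invented for illustration and are not the fitted parameters of the Bellcore or WAN traces.

```python
# Hypothetical example of the modeling idea: a stochastic L-System over arrival
# rates, plus a per-rate packet-size distribution. All numbers are invented.
import random

RULES = {  # each symbol rewrites to one of several strings, chosen at random
    "low":  [(0.7, ["low", "low"]), (0.3, ["low", "high"])],
    "high": [(0.6, ["high", "high"]), (0.4, ["high", "low"])],
}
RATE = {"low": 100.0, "high": 1000.0}               # packets per second
SIZE_PMF = {"low": {64: 0.2, 576: 0.5, 1500: 0.3},  # packet-size distributions
            "high": {64: 0.6, 576: 0.3, 1500: 0.1}}

def choose(weighted):
    r, acc = random.random(), 0.0
    for p, item in weighted:
        acc += p
        if r <= acc:
            return item
    return weighted[-1][1]

def expand(axiom, iterations):
    s = list(axiom)
    for _ in range(iterations):
        s = [sym for old in s for sym in choose(RULES[old])]
    return s

if __name__ == "__main__":
    symbols = expand(["low"], 6)                     # 2**6 rate symbols
    sizes = [random.choices(list(SIZE_PMF[s]), weights=SIZE_PMF[s].values())[0]
             for s in symbols]
    print(len(symbols), "intervals, mean rate",
          sum(RATE[s] for s in symbols) / len(symbols), "pkt/s, mean size",
          sum(sizes) / len(sizes), "bytes")
```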
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Peng; Barajas-Solano, David A.; Constantinescu, Emil
Wind and solar power generators are commonly described by a system of stochastic ordinary differential equations (SODEs) where random input parameters represent uncertainty in wind and solar energy. The existing methods for SODEs are mostly limited to delta-correlated random parameters (white noise). Here we use the Probability Density Function (PDF) method for deriving a closed-form deterministic partial differential equation (PDE) for the joint probability density function of the SODEs describing a power generator with time-correlated power input. The resulting PDE is solved numerically. A good agreement with Monte Carlo Simulations shows accuracy of the PDF method.
Generation and control of Greenberger-Horne-Zeilinger entanglement in superconducting circuits.
Wei, L F; Liu, Yu-xi; Nori, Franco
2006-06-23
Going beyond the entanglement of microscopic objects (such as photons, spins, and ions), here we propose an efficient approach to produce and control the quantum entanglement of three macroscopic coupled superconducting qubits. By conditionally rotating, one by one, selected Josephson-charge qubits, we show that their Greenberger-Horne-Zeilinger (GHZ) entangled states can be deterministically generated. The existence of GHZ correlations between these qubits could be experimentally demonstrated by effective single-qubit operations followed by high-fidelity single-shot readouts. The possibility of using the prepared GHZ correlations to test the macroscopic conflict between the noncommutativity of quantum mechanics and the commutativity of classical physics is also discussed.
Encryption key distribution via chaos synchronization
Keuninckx, Lars; Soriano, Miguel C.; Fischer, Ingo; Mirasso, Claudio R.; Nguimdo, Romain M.; Van der Sande, Guy
2017-01-01
We present a novel encryption scheme, wherein an encryption key is generated by two distant complex nonlinear units, forced into synchronization by a chaotic driver. The concept is sufficiently generic to be implemented on either photonic, optoelectronic or electronic platforms. The method for generating the key bitstream from the chaotic signals is reconfigurable. Although derived from a deterministic process, the obtained bit series fulfill the randomness conditions as defined by the National Institute of Standards test suite. We demonstrate the feasibility of our concept on an electronic delay oscillator circuit and test the robustness against attacks using a state-of-the-art system identification method. PMID:28233876
The relationship between stochastic and deterministic quasi-steady state approximations.
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R
2015-11-23
The quasi steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi steady-state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of QSSA, and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using their deterministic counterparts providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
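A minimal way to see the comparison is to simulate the same reduced model, with a non-elementary Hill-function rate, both as an ODE and with Gillespie's algorithm using that rate as a propensity. The sketch below does this for a toy negative-feedback birth-death model with invented parameters; it is not one of the paper's case studies.

```python
# Toy illustration: a reduced model with a Hill-function production rate,
# integrated deterministically and simulated with Gillespie's SSA using the
# same non-elementary rate as a propensity. Parameters are invented.
import random

BETA, K, N_HILL, GAMMA = 20.0, 15.0, 2.0, 1.0

def hill_rate(x):
    return BETA * K ** N_HILL / (K ** N_HILL + x ** N_HILL)   # negative feedback

def ode_steady_state(x0=0.0, dt=0.001, t_end=50.0):
    x = x0
    for _ in range(int(t_end / dt)):
        x += (hill_rate(x) - GAMMA * x) * dt
    return x

def ssa_time_average(t_end=2000.0, seed=2):
    random.seed(seed)
    t, x, area = 0.0, 0, 0.0
    while t < t_end:
        a_birth, a_death = hill_rate(x), GAMMA * x
        a_total = a_birth + a_death
        tau = random.expovariate(a_total)
        area += x * min(tau, t_end - t)
        t += tau
        if t >= t_end:
            break
        x += 1 if random.random() < a_birth / a_total else -1
    return area / t_end

if __name__ == "__main__":
    print("deterministic QSSA steady state: %.2f" % ode_steady_state())
    print("stochastic QSSA time average:    %.2f" % ssa_time_average())
```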
On the Generation and Use of TCP Acknowledgments
NASA Technical Reports Server (NTRS)
Allman, Mark
1998-01-01
This paper presents a simulation study of various TCP acknowledgment generation and utilization techniques. We investigate the standard version of TCP and the two standard acknowledgment strategies employed by receivers: those that acknowledge each incoming segment and those that implement delayed acknowledgments. We show the delayed acknowledgment mechanism hurts TCP performance, especially during slow start. Next we examine three alternate mechanisms for generating and using acknowledgments designed to mitigate the negative impact of delayed acknowledgments. The first method is to generate delayed ACKs only when the sender is not using the slow start algorithm. The second mechanism, called byte counting, allows TCP senders to increase the amount of data being injected into the network based on the amount of data acknowledged rather than on the number of acknowledgments received. The last mechanism is a limited form of byte counting. Each of these mechanisms is evaluated in a simulated network with no competing traffic, as well as a dynamic environment with a varying amount of competing traffic. We study the costs and benefits of the alternate mechanisms when compared to the standard algorithm with delayed ACKs.
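The effect of the three acknowledgment strategies on slow start can be sketched with a back-of-the-envelope model of congestion-window growth per round-trip time; this is not the paper's simulator, and losses, RTT variation and competing traffic are ignored.

```python
# Back-of-the-envelope sketch: congestion-window growth per RTT during
# unconstrained slow start, counted in segments, under per-segment ACKs,
# delayed ACKs, and delayed ACKs with byte counting.
def slow_start(rtts, segments_per_ack, increase_per_ack):
    """Return cwnd (in segments) after each RTT."""
    cwnd, history = 1, []
    for _ in range(rtts):
        sent = cwnd
        acks = max(1, sent // segments_per_ack)
        cwnd += acks * increase_per_ack(sent, acks)
        history.append(cwnd)
    return history

if __name__ == "__main__":
    print("per-segment ACKs:", slow_start(6, 1, lambda sent, acks: 1))
    print("delayed ACKs:    ", slow_start(6, 2, lambda sent, acks: 1))
    print("byte counting:   ", slow_start(6, 2, lambda sent, acks: sent // acks))
```

Under these rules, per-segment ACKs and byte counting both double the window every RTT, while plain delayed ACKs grow it by only about 1.5x per RTT, which is the slow-start penalty discussed above.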
NASA Technical Reports Server (NTRS)
Wong, Gregory L.; Denery, Dallas (Technical Monitor)
2000-01-01
The Dynamic Planner (DP) has been designed, implemented, and integrated into the Center-TRACON Automation System (CTAS) to assist Traffic Management Coordinators (TMCs), in real time, with the task of planning and scheduling arrival traffic approximately 35 to 200 nautical miles from the destination airport. The TMC may input to the DP a series of current and future scheduling constraints that reflect the operation and environmental conditions of the airspace. Under these constraints, the DP uses flight plans, track updates, and Estimated Time of Arrival (ETA) predictions to calculate optimal runway assignments and arrival schedules that help ensure an orderly, efficient, and conflict-free flow of traffic into the terminal area. These runway assignments and schedules can be shown directly to controllers or they can be used by other CTAS tools to generate advisories to the controllers. Additionally, the TMC and controllers may override the decisions made by the DP for tactical considerations. The DP will adapt its computations to accommodate these manual inputs.
Simulator evaluation of the final approach spacing tool
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.
1990-01-01
The design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course is described. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arrivals as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a 4-D trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST was implemented on a high performance workstation. It can be operated as a stand-alone in the Terminal Radar Approach Control (TRACON) Facility or as an element of a system integrated with automation tools in the Air Route Traffic Control Center (ARTCC). FAST was evaluated by experienced TRACON controllers in a real-time air traffic control simulation. Simulation results show that FAST significantly reduced controller workload and demonstrated a potential for an increase in landing rate.
Generic Airspace Concepts and Research
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2010-01-01
The purpose of this study was to evaluate methods for reducing the training and memorization required to manage air traffic in mid-term, Next Generation Air Transportation System (NextGen) airspace. We contrasted the performance of controllers using a sector information display and NextGen automation tools while working with familiar and unfamiliar sectors. The airspace included five sectors from Oakland and Salt Lake City Centers configured as a "generic center" called "West High Center." The Controller Information Tool was used to present essential information for managing these sectors. The Multi Aircraft Control System air traffic control simulator provided data link and conflict detection and resolution. There were five experienced air traffic controller participants. Each was familiar with one or two of the five sectors, but not the others. The participants rotated through all five sectors during the ten data collection runs. The results addressing workload, traffic management, and safety, as well as controller and observer comments, supported the generic sector concept. The unfamiliar sectors were comparable to the familiar sectors on all relevant measures.
Transforming GIS data into functional road models for large-scale traffic simulation.
Wilkie, David; Sewall, Jason; Lin, Ming C
2012-06-01
There exists a vast amount of geographic information system (GIS) data that model road networks around the world as polylines with attributes. In this form, the data are insufficient for applications such as simulation and 3D visualization-tools which will grow in power and demand as sensor data become more pervasive and as governments try to optimize their existing physical infrastructure. In this paper, we propose an efficient method for enhancing a road map from a GIS database to create a geometrically and topologically consistent 3D model to be used in real-time traffic simulation, interactive visualization of virtual worlds, and autonomous vehicle navigation. The resulting representation provides important road features for traffic simulations, including ramps, highways, overpasses, legal merge zones, and intersections with arbitrary states, and it is independent of the simulation methodologies. We test the 3D models of road networks generated by our algorithm on real-time traffic simulation using both macroscopic and microscopic techniques.
Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David
2015-07-01
Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc. The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. This also provides validation of the control options identified from the probabilistic assessment. Thus, the thermal-hydraulics analyses are used to validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
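As a rough illustration of the deterministic multi-attribute step, the sketch below scores hypothetical control options with a weighted utility over attributes such as margin, stability, cost and time to complete; the attribute names, weights and scores are invented for the example and are not taken from the report.

```python
# Hypothetical sketch: rank candidate control options by a weighted utility
# over several attributes. All names, weights and scores are invented.
def utility(option, weights):
    return sum(weights[attr] * value for attr, value in option["scores"].items())

if __name__ == "__main__":
    weights = {"margin": 0.4, "stability": 0.3, "cost": 0.15, "time": 0.15}
    options = [
        {"name": "reduce power 10%", "scores": {"margin": 0.9, "stability": 0.8,
                                                "cost": 0.4, "time": 0.7}},
        {"name": "trip one pump",    "scores": {"margin": 0.95, "stability": 0.5,
                                                "cost": 0.2, "time": 0.9}},
        {"name": "no action",        "scores": {"margin": 0.3, "stability": 0.9,
                                                "cost": 1.0, "time": 1.0}},
    ]
    for opt in sorted(options, key=lambda o: utility(o, weights), reverse=True):
        print("%-18s %.2f" % (opt["name"], utility(opt, weights)))
```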
Compartmental and Spatial Rule-Based Modeling with Virtual Cell.
Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M
2017-10-03
In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2017-03-01
Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function that is developed based on a bivariate joint distribution between the observations and the simulations in the historical period. The transfer function is used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of the precipitation ensemble forecast. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. mixed type meta-Gaussian distribution. Monthly precipitation from Climate Forecast System Reanalysis (CFS) and gridded observation from Parameter-Elevation Relationships on Independent Slopes Model (PRISM) have been employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized in order to evaluate the outputs from the proposed technique. Distribution of seasonal precipitation for the generated ensemble from the copula-based technique is compared to the observation and raw forecasts for three sub-basins located in the Western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecast, however, the COP-EPP demonstrates considerable improvement in the ensemble forecast in both deterministic and probabilistic verification, in particular in characterizing the extreme events in wet seasons.
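The following sketch illustrates the general copula-conditioning idea on synthetic data, using a plain Gaussian copula with empirical marginals, which is simpler than the Bayesian COP-EPP formulation: given historical observation/forecast pairs and a new raw forecast, it draws a post-processed ensemble.

```python
# Hedged sketch of copula-based post-processing (not the paper's COP-EPP):
# condition a Gaussian copula on the new forecast and back-transform samples
# through the empirical observation quantiles. Data below are synthetic.
import numpy as np
from scipy.stats import norm, rankdata

def to_normal_scores(x):
    """Empirical probability integral transform followed by the normal quantile."""
    u = rankdata(x) / (len(x) + 1.0)
    return norm.ppf(u)

def copula_ensemble(obs_hist, fcst_hist, new_fcst, n_members=20, seed=0):
    rng = np.random.default_rng(seed)
    z_obs, z_fc = to_normal_scores(obs_hist), to_normal_scores(fcst_hist)
    rho = np.corrcoef(z_obs, z_fc)[0, 1]
    # Transform the new forecast with the historical forecast marginal.
    u_new = (np.searchsorted(np.sort(fcst_hist), new_fcst) + 0.5) / (len(fcst_hist) + 1.0)
    z_new = norm.ppf(u_new)
    # Conditional Gaussian copula: Z_obs | Z_fc ~ N(rho * z_new, 1 - rho**2).
    z_samp = rng.normal(rho * z_new, np.sqrt(1.0 - rho ** 2), size=n_members)
    # Back-transform through the empirical observation quantiles.
    return np.quantile(obs_hist, norm.cdf(z_samp))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.gamma(2.0, 20.0, 600)                  # synthetic monthly precip (mm)
    forecast = truth * rng.lognormal(0.0, 0.3, 600)    # biased, noisy "model"
    print(np.round(copula_ensemble(truth, forecast, new_fcst=80.0), 1))
```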
NASA Astrophysics Data System (ADS)
Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.
2015-11-01
Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
Zytoon, Mohamed A
2016-05-13
As traffic and other environmental noise-generating activities grow in the Kingdom of Saudi Arabia (KSA), adverse health and other impacts are expected to develop. Managing this problem involves many actions, of which noise mapping has proven to be a helpful approach. The objective of the current study was to test the adequacy of the data available in KSA municipalities for generating urban noise maps and to verify the applicability of available environmental noise mapping and noise annoyance models for KSA. Therefore, noise maps were produced for Al-Fayha District in Jeddah City, KSA using commercially available noise mapping software and applying the French national computation method "NMPB" for traffic noise. Most of the data required for traffic noise prediction and annoyance analysis were available, either in the Municipality GIS department or in other governmental authorities. The predicted noise levels during the three time periods, i.e., daytime, evening, and nighttime, were found to be higher than the maximum recommended levels established in KSA environmental noise standards. Annoyance analysis revealed that high percentages of the District inhabitants were highly annoyed, depending on the type of planning zone and period of interest. These results reflect the urgent need to consider environmental noise reduction in KSA national plans. The accuracy of the predicted noise levels and the availability of most of the necessary data should encourage further studies on the use of noise mapping as part of noise reduction plans.
Nonlinear unitary quantum collapse model with self-generated noise
NASA Astrophysics Data System (ADS)
Geszti, Tamás
2018-04-01
Collapse models including some external noise of unknown origin are routinely used to describe phenomena on the quantum-classical border; in particular, quantum measurement. Although containing nonlinear dynamics and thereby exposed to the possibility of superluminal signaling in individual events, such models are widely accepted on the basis of fully reproducing the non-signaling statistical predictions of quantum mechanics. Here we present a deterministic nonlinear model without any external noise, in which randomness—instead of being universally present—emerges in the measurement process, from deterministic irregular dynamics of the detectors. The treatment is based on a minimally nonlinear von Neumann equation for a Stern–Gerlach or Bell-type measuring setup, containing coordinate and momentum operators in a self-adjoint skew-symmetric, split scalar product structure over the configuration space. The microscopic states of the detectors act as a nonlocal set of hidden parameters, controlling individual outcomes. The model is shown to display pumping of weights between setup-defined basis states, with a single winner randomly selected and the rest collapsing to zero. Environmental decoherence has no role in the scenario. Through stochastic modelling, based on Pearle’s ‘gambler’s ruin’ scheme, outcome probabilities are shown to obey Born’s rule under a no-drift or ‘fair-game’ condition. This fully reproduces quantum statistical predictions, implying that the proposed non-linear deterministic model satisfies the non-signaling requirement. Our treatment is still vulnerable to hidden signaling in individual events, which remains to be handled by future research.
NASA Astrophysics Data System (ADS)
KIM, Jong Woon; LEE, Young-Ouk
2017-09-01
As computing power gets better and better, computer codes that use a deterministic method seem to be less useful than those using the Monte Carlo method. In addition, users do not like to think about space, angle, and energy discretization for deterministic codes. However, a deterministic method is still powerful in that we can obtain a solution of the flux throughout the problem, particularly when particles can barely penetrate, such as in a deep penetration problem with small detection volumes. Recently, a new state-of-the-art discrete-ordinates code, ATTILA, was developed and has been widely used in several applications. ATTILA provides the capabilities to solve geometrically complex 3-D transport problems by using an unstructured tetrahedral mesh. Since 2009, we have been developing our own code by benchmarking ATTILA. AETIUS is a discrete ordinates code that uses an unstructured tetrahedral mesh, as ATTILA does. For pre- and post-processing, Gmsh is used to generate an unstructured tetrahedral mesh by importing a CAD file (*.step) and to visualize the calculation results of AETIUS. Using a CAD tool, the geometry can be modeled very easily. In this paper, we give a brief overview of AETIUS and provide numerical results from both AETIUS and a Monte Carlo code, MCNP5, for a deep penetration problem with small detection volumes. The results demonstrate the effectiveness and efficiency of AETIUS for such calculations.
NASA Astrophysics Data System (ADS)
Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Walko, R. L.
2006-03-01
This paper reports preliminary results for a Limited area model Ensemble Prediction System (LEPS), based on RAMS (Regional Atmospheric Modelling System), for eight case studies of moderate-intense precipitation over Calabria, the southernmost tip of the Italian peninsula. LEPS aims to transfer the benefits of a probabilistic forecast from global to regional scales in countries where local orographic forcing is a key factor to force convection. To accomplish this task and to limit computational time in an operational implementation of LEPS, we perform a cluster analysis of ECMWF-EPS runs. Starting from the 51 members that form the ECMWF-EPS we generate five clusters. For each cluster a representative member is selected and used to provide initial and dynamic boundary conditions to RAMS, whose integrations generate LEPS. RAMS runs have 12-km horizontal resolution. To analyze the impact of enhanced horizontal resolution on quantitative precipitation forecasts, LEPS forecasts are compared to a full Brute Force (BF) ensemble. This ensemble is based on RAMS, has 36 km horizontal resolution and is generated by 51 members, nested in each ECMWF-EPS member. LEPS and BF results are compared subjectively and by objective scores. Subjective analysis is based on precipitation and probability maps of case studies whereas objective analysis is made by deterministic and probabilistic scores. Scores and maps are calculated by comparing ensemble precipitation forecasts against reports from the Calabria regional raingauge network. Results show that LEPS provided better rainfall predictions than BF for all case studies selected. This strongly suggests the importance of the enhanced horizontal resolution, compared to ensemble population, for Calabria for these cases. To further explore the impact of local physiographic features on QPF (Quantitative Precipitation Forecasting), LEPS results are also compared with a 6-km horizontal resolution deterministic forecast. Due to local and mesoscale forcing, the high resolution forecast (Hi-Res) has better performance compared to the ensemble mean for rainfall thresholds larger than 10mm but it tends to overestimate precipitation for lower amounts. This yields larger false alarms that have a detrimental effect on objective scores for lower thresholds. To exploit the advantages of a probabilistic forecast compared to a deterministic one, the relation between the ECMWF-EPS 700 hPa geopotential height spread and LEPS performance is analyzed. Results are promising even if additional studies are required.
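The clustering step can be illustrated as follows: cluster the ensemble members on some field (here a synthetic grid standing in for, e.g., geopotential height) and take the member nearest each cluster centroid as the representative that provides boundary conditions to a limited-area run. The use of k-means and the choice of field are assumptions made for this example, not necessarily the paper's procedure.

```python
# Illustrative sketch: cluster 51 ensemble members and pick, per cluster, the
# member closest to the cluster mean as its representative.
import numpy as np
from sklearn.cluster import KMeans

def representative_members(members, n_clusters=5, seed=0):
    """members: array of shape (51, ny, nx); returns indices of representatives."""
    flat = members.reshape(len(members), -1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(flat)
    reps = []
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        centroid = km.cluster_centers_[c]
        reps.append(idx[np.argmin(np.linalg.norm(flat[idx] - centroid, axis=1))])
    return reps

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_eps = rng.normal(size=(51, 40, 60))   # 51 members on a synthetic 40x60 grid
    print("representative member indices:", representative_members(fake_eps))
```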
De Roos, Anneclaire J; Koehoorn, Mieke; Tamburic, Lillian; Davies, Hugh W; Brauer, Michael
2014-10-01
The risk of rheumatoid arthritis (RA) has been associated with living near traffic; however, there is evidence suggesting that air pollution may not be responsible for this association. Noise, another traffic-generated exposure, has not been studied as a risk factor for RA. We investigated proximity to traffic, ambient air pollution, and community noise in relation to RA in the Vancouver and Victoria regions of British Columbia, Canada. Cases and controls were identified in a cohort of adults that was assembled using health insurance registration records. Incident RA cases from 1999 through 2002 were identified by diagnostic codes in combination with prescriptions and type of physician (e.g., rheumatologist). Controls were matched to RA cases by age and sex. Environmental exposures were assigned to each member of the study population by their residential postal code(s). We estimated relative risks using conditional logistic regression, with additional adjustment for median income at the postal code. RA incidence was increased with proximity to traffic, with an odds ratio (OR) of 1.37 (95% CI: 1.11, 1.68) for residence ≤ 50 m from a highway compared with residence > 150 m away. We found no association with traffic-related exposures such as PM2.5, nitrogen oxides, or noise. Ground-level ozone, which was highest in suburban areas, was associated with an increased risk of RA (OR = 1.26; 95% CI: 1.18, 1.36 per interquartile range increase). Our study confirms a previously observed association of RA risk with proximity to traffic and suggests that neither noise levels nor traffic-related air pollutants are responsible for this relationship. Additional investigation of neighborhood and individual correlates of residence near roadways may provide new insight into risk factors for RA.
Deterministic Walks with Choice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beeler, Katy E.; Berenhaut, Kenneth S.; Cooper, Joshua N.
2014-01-10
This paper studies deterministic movement over toroidal grids, integrating local information, bounded memory and choice at individual nodes. The research is motivated by recent work on deterministic random walks, and applications in multi-agent systems. Several results regarding passing tokens through toroidal grids are discussed, as well as some open questions.
Toxicity of inhaled traffic related particulate matter
NASA Astrophysics Data System (ADS)
Gerlofs-Nijland, Miriam E.; Campbell, Arezoo; Miller, Mark R.; Newby, David E.; Cassee, Flemming R.
2009-02-01
Traffic generated ultrafine particulates may play a major role in the development of adverse health effects. However, little is known about harmful effects caused by recurring exposure. We hypothesized that repeated exposure to particulate matter results in adverse pulmonary and systemic toxic effects. Exposure to diesel engine exhaust resulted in signs of oxidative stress in the lung, impaired coagulation, and changes in the immune system. Pro-inflammatory cytokine levels were decreased in some regions of the brain but increased in the striatum implying that exposure to diesel engine exhaust may selectively aggravate neurological impairment. Data from these three studies suggest that exposure to traffic related PM can mediate changes in the vasculature and brain of healthy rats. To what extent these changes may contribute to chronic neurodegenerative or vascular diseases is at present unclear.
NASA System-Level Design, Analysis and Simulation Tools Research on NextGen
NASA Technical Reports Server (NTRS)
Bardina, Jorge
2011-01-01
A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.
Initial Evaluation of a Conflict Detection Tool in the Terminal Area
NASA Technical Reports Server (NTRS)
Verma Savita Arora; Tang, Huabin; Ballinger, Deborah S.; Kozon, Thomas E.; Farrahi, Amir Hossein
2012-01-01
Despite the recent economic recession and its adverse impact on air travel, the Federal Aviation Administration (FAA) continues to forecast an increase in air traffic demand that may see traffic double or triple by the year 2025. Increases in air traffic will burden the air traffic management system, and higher levels of safety and efficiency will be required. The air traffic controller's primary task is to ensure separation between aircraft in their airspace and keep the skies safe. As air traffic is forecasted to increase in volume and complexity [1], there is an increased likelihood of conflicts between aircraft, which adds risk and inefficiency to air traffic management and increases controller workload. To attenuate these factors, recent ATM research has shown that air and ground-based automation tools could reduce controller workload, especially if the automation is focused on conflict detection and resolution. Conflict Alert is a short time horizon conflict detection tool deployed in the Terminal Radar Approach Control (TRACON), which has limited utility due to the high number of false alerts generated and its use of dead reckoning to predict loss of separation between aircraft. Terminal Tactical Separation Assurance Flight Environment (T-TSAFE) is a short time horizon conflict detection tool that uses both flight intent and dead reckoning to detect conflicts. Results of a fast time simulation experiment indicated that T-TSAFE provided a more effective alert lead-time and generated fewer false alerts than Conflict Alert [2]. TSAFE was previously tested in a Human-In-The-Loop (HITL) simulation study that focused on the en route phase of flight [3]. The current study tested the T-TSAFE tool in an HITL simulation study, focusing on the terminal environment with current day operations. The study identified procedures, roles, responsibilities, information requirements and usability, with the help of TRACON controllers who participated in the experiment. Metrics such as lead alert time, alert response time, workload, situation awareness and other measures were statistically analyzed. These metrics were examined from an overall perspective, and comparisons between conditions (altitude resolutions via keyboard entry vs. ADS-B entry) and controller positions (two final approach sectors and two feeder sectors) were also examined. Results of these analyses and controller feedback provided evidence of T-TSAFE's potential promise as a useful air traffic controller tool. Heuristic analysis also provided information on ways in which the T-TSAFE tool can be improved. Details of analyses results will be presented in the full paper.
Review of modelling air pollution from traffic at street-level - The state of the science.
Forehead, H; Huynh, N
2018-06-13
Traffic emissions are a complex and variable cocktail of toxic chemicals. They are the major source of atmospheric pollution in the parts of cities where people live, commute and work. Reducing exposure requires information about the distribution and nature of emissions. Spatially and temporally detailed data are required, because both the rate of production and the composition of emissions vary significantly with time of day and with local changes in wind, traffic composition and flow. Increasing computer processing power means that models can accept highly detailed inputs of fleet, fuels and road networks. The state of the science models can simulate the behaviour and emissions of all the individual vehicles on a road network, with resolution of a second and tens of metres. The chemistry of the simulated emissions is also highly resolved, due to consideration of multiple engine processes, fuel evaporation and tyre wear. Good results can be achieved with both commercially available and open source models. The extent of a simulation is usually limited by processing capacity; the accuracy by the quality of traffic data. Recent studies have generated real time, detailed emissions data by using inputs from novel traffic sensing technologies and data from intelligent traffic systems (ITS). Increasingly, detailed pollution data is being combined with spatially resolved demographic or epidemiological data for targeted risk analyses. Copyright © 2018 Elsevier Ltd. All rights reserved.
Effects of modeling errors on trajectory predictions in air traffic control automation
NASA Technical Reports Server (NTRS)
Jackson, Michael R. C.; Zhao, Yiyuan; Slattery, Rhonda
1996-01-01
Air traffic control automation synthesizes aircraft trajectories for the generation of advisories. Trajectory computation employs models of aircraft performance and weather conditions. In contrast, actual trajectories are flown by real aircraft under actual conditions. Since synthetic trajectories are used in landing scheduling and conflict probing, it is very important to understand the differences between computed and actual trajectories. This paper examines the effects of aircraft modeling errors on the accuracy of trajectory predictions in air traffic control automation. Three-dimensional point-mass aircraft equations of motion are assumed to generate the actual aircraft flight paths. Modeling errors are described as uncertain parameters or uncertain input functions. Pilot or autopilot feedback actions are expressed as equality constraints that satisfy control objectives. A typical trajectory is defined by a series of flight segments, each with its own control objectives, together with conditions that define segment transitions. A constrained linearization approach is used to analyze trajectory differences caused by various modeling errors: a linear time-varying system describes the trajectory errors, with expressions that transfer the trajectory errors across moving segment transitions. A numerical example is presented for a complete commercial aircraft descent trajectory consisting of several flight segments.
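The abstract takes three-dimensional point-mass equations of motion as the ground truth for flown trajectories. The sketch below is a minimal, hedged example of forward-integrating such a point-mass model over one flight segment; wind, thrust/drag modeling, and the paper's constrained linearization are deliberately omitted, and all parameter values are invented.

```python
import math

def propagate_point_mass(state, controls, dt=1.0, steps=600):
    """Forward-integrate a simplified 3-D point-mass model.

    state    = dict with x, y (m), h (m), v (m/s), psi (heading, rad),
               gamma (flight-path angle, rad)
    controls = dict with along-track acceleration a (m/s^2),
               heading rate psi_dot (rad/s), commanded gamma (rad)
    Wind and detailed aerodynamics are deliberately omitted; this is only a
    schematic of the trajectory-synthesis step, not an ATC automation model.
    """
    s = dict(state)
    for _ in range(steps):
        s["x"] += s["v"] * math.cos(s["gamma"]) * math.cos(s["psi"]) * dt
        s["y"] += s["v"] * math.cos(s["gamma"]) * math.sin(s["psi"]) * dt
        s["h"] += s["v"] * math.sin(s["gamma"]) * dt
        s["v"] += controls["a"] * dt
        s["psi"] += controls["psi_dot"] * dt
        s["gamma"] = controls["gamma"]
    return s

# Example: a 600 s decelerating descent segment at constant heading.
final = propagate_point_mass(
    {"x": 0.0, "y": 0.0, "h": 10000.0, "v": 220.0, "psi": 0.0,
     "gamma": math.radians(-3.0)},
    {"a": -0.05, "psi_dot": 0.0, "gamma": math.radians(-3.0)})
print(round(final["h"]), round(final["v"]))
```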
3D Traffic Scene Understanding From Movable Platforms.
Geiger, Andreas; Lauer, Martin; Wojek, Christian; Stiller, Christoph; Urtasun, Raquel
2014-05-01
In this paper, we present a novel probabilistic generative model for multi-object traffic scene understanding from movable platforms which reasons jointly about the 3D scene layout as well as the location and orientation of objects in the scene. In particular, the scene topology, geometry, and traffic activities are inferred from short video sequences. Inspired by the impressive driving capabilities of humans, our model does not rely on GPS, lidar, or map knowledge. Instead, it takes advantage of a diverse set of visual cues in the form of vehicle tracklets, vanishing points, semantic scene labels, scene flow, and occupancy grids. For each of these cues, we propose likelihood functions that are integrated into a probabilistic generative model. We learn all model parameters from training data using contrastive divergence. Experiments conducted on videos of 113 representative intersections show that our approach successfully infers the correct layout in a variety of very challenging scenarios. To evaluate the importance of each feature cue, experiments using different feature combinations are conducted. Furthermore, we show how by employing context derived from the proposed method we are able to improve over the state-of-the-art in terms of object detection and object orientation estimation in challenging and cluttered urban environments.
Latency of TCP applications over the ATM-WAN using the GFR service category
NASA Astrophysics Data System (ADS)
Chen, Kuo-Hsien; Siliquini, John F.; Budrikis, Zigmantas
1998-10-01
The GFR service category has been proposed for data services in ATM networks. Since users are ultimately interested in data services that provide high efficiency and low latency, it is important to study the latency performance of data traffic using the GFR service category in an ATM network. Much of today's data traffic uses the TCP/IP protocol suite, and in this paper we study through simulation the latency of TCP applications running over a wide-area ATM network with the GFR service category, using a realistic TCP traffic model. We find that during congestion periods the reserved bandwidth in GFR can improve the latency performance of TCP applications. However, owing to the segment-generation dynamics of TCP 'Slow Start', we show that a large proportion of TCP segments are discarded under network congestion even when the reserved bandwidth equals the average rate at which user data are generated. As a result, a user experiences worse than expected latency performance when the network is congested. We also examine the effects of segment size on the latency performance of TCP applications using the GFR service category.
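The 'Slow Start' dynamics blamed for the segment discards can be illustrated with a toy sketch: the congestion window doubles every round-trip time, so instantaneous bursts quickly exceed any smooth reserved rate. This is a schematic of the TCP mechanism only, not the paper's ATM/GFR simulation.

```python
def slow_start_burst(mss_bytes=1460, init_cwnd_segments=1, rtts=6):
    """Illustrate how TCP slow start doubles the congestion window each RTT,
    producing back-to-back bursts well above any smooth 'average' sending rate.
    Purely schematic: no loss, no receiver window, no ssthresh."""
    cwnd = init_cwnd_segments
    for rtt in range(rtts):
        print(f"RTT {rtt}: {cwnd} segments ({cwnd * mss_bytes} bytes) sent back-to-back")
        cwnd *= 2  # each ACKed segment grows cwnd by one segment, doubling it per RTT

slow_start_burst()
```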
Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates
Melechko, Anatoli V. [Oak Ridge, TN]; McKnight, Timothy E.; Guillorn, Michael A.; Ilic, Bojan [Ithaca, NY]; Merkulov, Vladimir I. [Knoxville, TN]; Doktycz, Mitchel J. [Knoxville, TN]; Lowndes, Douglas H. [Knoxville, TN]; Simpson, Michael L. [Knoxville, TN]
2011-05-17
Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. A method includes depositing a catalyst particle on a surface of a substrate to define a deterministically located position; growing an aligned elongated nanostructure on the substrate, an end of the aligned elongated nanostructure coupled to the substrate at the deterministically located position; coating the aligned elongated nanostructure with a conduit material; removing a portion of the conduit material to expose the catalyst particle; removing the catalyst particle; and removing the elongated nanostructure to define a nanoconduit.
A deterministic particle method for one-dimensional reaction-diffusion equations
NASA Technical Reports Server (NTRS)
Mascagni, Michael
1995-01-01
We derive a deterministic particle method for the solution of nonlinear reaction-diffusion equations in one spatial dimension. This deterministic method is an analog of a Monte Carlo method for the solution of these problems that has been previously investigated by the author. The deterministic method leads to a system of ordinary differential equations for the positions of suitably defined particles. We then consider explicit and implicit time-stepping methods for this system of ordinary differential equations, and we study Picard and Newton iterations for the solution of the implicit system. Next we solve the system numerically and study the discretization error both analytically and numerically. Numerical computation shows that this deterministic method is automatically adaptive to large gradients in the solution.
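As a hedged sketch of the general technique (not the paper's reaction-diffusion velocity field), the snippet below advances a system of particle-position ODEs with backward Euler and resolves the implicit step by Picard (fixed-point) iteration; the right-hand side f is a stand-in.

```python
import numpy as np

def f(x):
    """Stand-in particle velocity field (NOT the paper's reaction-diffusion
    field): particles drift toward the origin and weakly repel each other."""
    drift = -x
    sep = x[:, None] - x[None, :]
    repulse = np.sum(np.sign(sep) * np.exp(-np.abs(sep)), axis=1) / len(x)
    return drift + repulse

def explicit_euler_step(x, dt):
    """Explicit (forward Euler) update of the particle positions."""
    return x + dt * f(x)

def implicit_euler_step(x, dt, picard_iters=20):
    """Backward Euler: solve y = x + dt * f(y) by Picard (fixed-point) iteration."""
    y = x.copy()
    for _ in range(picard_iters):
        y = x + dt * f(y)
    return y

x = np.linspace(-1.0, 1.0, 21)      # initial particle positions
for _ in range(50):
    x = implicit_euler_step(x, dt=0.05)
print(np.round(x, 3))
```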
NASA Technical Reports Server (NTRS)
Barmore, Bryan; Johnson, Edward; Wing, David J.; Barhydt, Richard
2003-01-01
A human-in-the-loop experiment was performed at the NASA Langley Research Center to study the feasibility of Distributed Air/Ground Traffic Management (DAG-TM) autonomous aircraft operations in highly constrained airspace. The airspace was constrained by a pair of special use airspace (SUA) regions on either side of the pilot's planned route. The available airspace was further varied by changing the lateral separation standard between 3 nm and 5 nm. The pilot had to maneuver through the corridor between the SUAs, avoid other traffic and meet flow management constraints. Traffic flow management (TFM) constraints were imposed as a required time of arrival and crossing altitude at an en route fix. This is a follow-up study to work presented at the 4th USA/Europe Air Traffic Management R&D Seminar in December 2001. Nearly all of the pilots were able to meet their TFM constraints while maintaining adequate separation from other traffic. In only 3 out of 59 runs were the pilots unable to meet their required time of arrival. Two loss-of-separation cases are studied, and it is found that pilots need conflict prevention information presented in a clearer manner. No degradation of performance or safety was seen between the wide and narrow corridors. Although this was not a thorough study of the consequences of reducing the en route lateral separation, nothing was found that would refute the feasibility of reducing the separation requirement from 5 nm to 3 nm. The creation of additional, second-generation conflicts is also investigated. Two resolution methods were offered to the pilots: strategic and tactical. The strategic method is a closed-loop alteration of the Flight Management System (FMS) active route that considers other traffic as well as TFM constraints. The tactical resolutions are short-term maneuvers that leave avoiding other traffic conflicts and meeting the TFM constraints to the pilot. Pilots who made use of the strategic tools avoided additional conflicts, whereas those making tactical maneuvers often caused additional conflicts. Many of these second-generation conflicts could be avoided by improved conflict prevention tools that clearly present to the pilot which maneuver choices will result in a conflict-free path. These results, together with previously reported studies, continue to support the feasibility of autonomous aircraft operations.
Chaos without nonlinear dynamics.
Corron, Ned J; Hayes, Scott T; Pethel, Shawn D; Blakely, Jonathan N
2006-07-14
A linear, second-order filter driven by randomly polarized pulses is shown to generate a waveform that is chaotic under time reversal. That is, the filter output exhibits determinism and a positive Lyapunov exponent when viewed backward in time. The filter is demonstrated experimentally using a passive electronic circuit, and the resulting waveform exhibits a Lorenz-like butterfly structure. This phenomenon suggests that chaos may be connected to physical theories whose underlying framework is not that of a traditional deterministic nonlinear dynamical system.
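A minimal numerical sketch of the idea, under assumed parameter values rather than those of the published circuit: a damped linear second-order filter is driven by a train of randomly polarized unit pulses, one per natural period, and the resulting waveform is recorded. The time-reversal and Lyapunov-exponent analysis of the paper is not reproduced here.

```python
import numpy as np

def pulse_filtered_waveform(n_pulses=200, beta=0.1, omega=2 * np.pi, dt=0.01):
    """Drive a damped linear second-order filter
        x'' + 2*beta*x' + (omega^2 + beta^2)*x = s(t)
    with randomly polarized (+1/-1) unit pulses, one per natural period.
    Parameter values are illustrative, not those of the published circuit."""
    rng = np.random.default_rng(0)
    steps_per_pulse = int(round(2 * np.pi / omega / dt))
    x, v = 0.0, 0.0
    out = []
    for _ in range(n_pulses):
        s = rng.choice([-1.0, 1.0])          # random pulse polarity
        for _ in range(steps_per_pulse):     # simple semi-implicit integration
            a = s - 2 * beta * v - (omega**2 + beta**2) * x
            v += a * dt
            x += v * dt
            out.append(x)
    return np.array(out)

w = pulse_filtered_waveform()
print(w.shape, float(w.min()), float(w.max()))
```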
Practical implementation of multilevel quantum cryptography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulik, S. P.; Maslennikov, G. A.; Moreva, E. V.
2006-05-15
The physical principles of a quantum key distribution protocol using four-level optical systems are discussed. Quantum information is encoded into polarization states created by frequency-nondegenerate spontaneous parametric down-conversion in collinear geometry. In the scheme under analysis, the required nonorthogonal states are generated in a single nonlinear crystal. All states in the selected basis are measured deterministically. The results of initial experiments on transformation of the basis polarization states of a four-level optical system are discussed.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
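The snippet below is not RAVEN code and does not use its API; it is a generic, hedged illustration (via scikit-learn) of the post-processing step the report describes, namely clustering a database of sampled scenario outcomes to recognize patterns.

```python
import numpy as np
from sklearn.cluster import KMeans   # generic library, not the RAVEN API

# Hypothetical scenario database: each row is one sampled run,
# columns are summary figures of merit (e.g., peak value, time of peak).
rng = np.random.default_rng(42)
scenarios = np.vstack([
    rng.normal([1.0, 10.0], 0.1, size=(500, 2)),   # nominal-like runs
    rng.normal([3.0, 4.0], 0.3, size=(50, 2)),     # off-nominal cluster
])

# Cluster the outcomes and summarize each group.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scenarios)
for k in range(2):
    print(f"cluster {k}: {np.sum(labels == k)} scenarios, "
          f"centroid {scenarios[labels == k].mean(axis=0).round(2)}")
```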
Implicitly Coordinated Detect and Avoid Capability for Safe Autonomous Operation of Small UAS
NASA Technical Reports Server (NTRS)
Balachandran, Swee; Munoz, Cesar A.; Consiglio, Maria C.
2017-01-01
As the airspace becomes increasingly shared by autonomous small Unmanned Aerial Systems (UAS), there is a pressing need for coordination strategies so that aircraft can safely and independently maneuver around obstacles, geofences, and traffic aircraft. Explicitly coordinating resolution strategies for small UAS would require additional components, such as a reliable vehicle-to-vehicle communication infrastructure and standardized protocols for information exchange, that could significantly increase the cost of deploying small UAS in a shared airspace. This paper explores a novel approach that enables multiple aircraft to implicitly coordinate their resolution maneuvers. By requiring all aircraft to execute the proposed approach deterministically, it is possible for all of them to implicitly agree on the region of airspace each will be occupying in a given time interval. The proposed approach lends itself to the construction of a suitable feedback mechanism that enables the real-time execution of an implicitly conflict-free path in a closed-loop manner while dealing with uncertainties in aircraft speed. If a network infrastructure is available, the proposed approach can also exploit the benefits of explicit information.
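A toy sketch of the implicit-coordination idea under stated assumptions: if every vehicle derives occupancy from a shared traffic picture with the same deterministic planner, independent computations of who occupies which airspace cells agree without any plan exchange. The grid, horizon, and straight-line "planner" are invented placeholders, not the paper's algorithm.

```python
def planned_cells(state, horizon=5):
    """Toy deterministic 'planner': from a shared state estimate, every vehicle
    derives the same sequence of grid cells an aircraft will occupy.
    The grid, horizon, and straight-line plan are illustrative assumptions."""
    (x, y), (vx, vy) = state
    return [(round(x + vx * t), round(y + vy * t)) for t in range(horizon)]

# Shared (e.g., sensed or broadcast) traffic picture.
traffic = {
    "UAS-1": ((0.0, 0.0), (1.0, 0.0)),
    "UAS-2": ((4.0, 3.0), (0.0, -1.0)),
}

# Each vehicle independently computes everyone's occupancy. The results are
# identical, so the vehicles implicitly agree on who owns which cells
# without exchanging plans.
view_from_1 = {name: planned_cells(s) for name, s in traffic.items()}
view_from_2 = {name: planned_cells(s) for name, s in traffic.items()}
assert view_from_1 == view_from_2
print(view_from_1)
```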
A Statistical Approach For Modeling Tropical Cyclones. Synthetic Hurricanes Generator Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqualini, Donatella
This manuscript briefly describes a statistical approach to generate synthetic tropical cyclone tracks to be used in risk evaluations. The Synthetic Hurricane Generator (SynHurG) model allows modeling hurricane risk in the United States, supporting decision makers and the implementation of adaptation strategies to extreme weather. In the literature there are mainly two approaches to model hurricane hazard for risk prediction: deterministic-statistical approaches, where the storm's key physical parameters are calculated using complex physical climate models and the tracks are usually determined statistically from historical data; and statistical approaches, where both variables and tracks are estimated stochastically using historical records. SynHurG falls in the second category, adopting a purely stochastic approach.
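Purely as an illustration of the stochastic-track idea (not SynHurG's actual model or fitted statistics), the sketch below generates a synthetic track as a persistent random walk from a sampled genesis point; every numeric value is a made-up placeholder rather than an estimate from the historical record.

```python
import numpy as np

def synthetic_track(n_steps=40, seed=None):
    """Toy stochastic track generator: a persistent random walk in lat/lon.
    The genesis region and step statistics are made-up placeholders, not
    values estimated from historical records as a real generator would use."""
    rng = np.random.default_rng(seed)
    lat, lon = rng.uniform(10.0, 20.0), rng.uniform(-60.0, -40.0)  # genesis point
    dlat, dlon = 0.15, -0.45        # initial 6-hourly motion (deg): WNW drift
    track = [(round(lat, 2), round(lon, 2))]
    for _ in range(n_steps):
        # persistence plus a random perturbation of the 6-hourly displacement
        dlat = 0.9 * dlat + rng.normal(0.03, 0.05)
        dlon = 0.9 * dlon + rng.normal(0.00, 0.05)
        lat, lon = lat + dlat, lon + dlon
        track.append((round(lat, 2), round(lon, 2)))
    return track

print(synthetic_track(seed=1)[:5])
```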
The Collaborative Seismic Earth Model: Generation 1
NASA Astrophysics Data System (ADS)
Fichtner, Andreas; van Herwaarden, Dirk-Philip; Afanasiev, Michael; Simutė, Saulė; Krischer, Lion; Çubuk-Sabuncu, Yeşim; Taymaz, Tuncay; Colli, Lorenzo; Saygin, Erdinc; Villaseñor, Antonio; Trampert, Jeannot; Cupillard, Paul; Bunge, Hans-Peter; Igel, Heiner
2018-05-01
We present a general concept for evolutionary, collaborative, multiscale inversion of geophysical data, specifically applied to the construction of a first-generation Collaborative Seismic Earth Model. This is intended to address the limited resources of individual researchers and the often limited use of previously accumulated knowledge. Model evolution rests on a Bayesian updating scheme, simplified into a deterministic method that honors today's computational restrictions. The scheme is able to harness distributed human and computing power. It furthermore handles conflicting updates, as well as variable parameterizations of different model refinements or different inversion techniques. The first-generation Collaborative Seismic Earth Model comprises 12 refinements from full seismic waveform inversion, ranging from regional crustal- to continental-scale models. A global full-waveform inversion ensures that regional refinements translate into whole-Earth structure.
[Reduction of automobile traffic: urgent health promotion policy].
Tapia Granados, J A
1998-03-01
During the last few decades, traffic injuries have become one of the leading causes of death and disability in the world. In urban areas, traffic congestion, noise, and emissions from motor vehicles produce subjective disturbances and detectable pathological effects. More than one billion people are exposed to harmful levels of environmental pollution. Because its combustion engine generates carbon dioxide (CO2), the automobile is one of the chief sources of the gases that are causing the greenhouse effect. The latter has already caused a rise in the average ambient temperature, and over the next decades it will predictably cause significant climatic changes whose consequences, though uncertain, are likely to be harmful and possibly catastrophic. Aside from the greenhouse effect, the relentless growth of parking zones, traffic, and the roadway infrastructure in urban and rural areas is currently one of the leading causes of environmental degradation. Urban development, which is nearly always "planned" around traffic instead of people, leads to a significant deterioration in the quality of life, while it also destroys the social fabric. Unlike the private automobile, public transportation, bicycles, and walking help reduce pollution, congestion, and traffic volume, as well as the morbidity and mortality resulting from injuries and ailments related to pollution. Non-automobile transportation also encourages physical activity--with its positive effect on general health--and helps reduce the greenhouse effect. Reducing traffic volume and increasing the use of alternative means of transportation thus constitute an integrated health promotion policy, which should become an inherent part of the movement for the promotion of healthy cities and of transportation and economic policies in general.
Assessment, analysis and appraisal of road traffic noise pollution in Rourkela city, India.
Goswami, Shreerup; Swain, Bijay Kumar; Panda, Santosh Kumar
2013-09-01
The problem of road traffic noise pollution has become a concern for both the public and policy makers. Noise levels were assessed in 12 different squares of Rourkela city during specified times (7-10 a.m., 11 a.m.-2 p.m., 3-6 p.m., 7-10 p.m., 10 p.m.-12 midnight and 4-6 a.m.). Noise descriptors such as Leq, traffic noise index (TNI), noise pollution level (NPL), noise climate, Lday, Levening, Lnight and Lden were assessed to reveal the extent of noise pollution due to heavy traffic in this city. The equivalent noise levels of all 12 squares were found to be well beyond the permissible limits (70 dB during the day and 55 dB at night). Appallingly, even the minimum Leq and NPL values exceeded 82 dB and 96 dB during the day and 69 dB and 91 dB at night, respectively. Lden values of the investigated squares ranged from 83.4 to 86.1 dB, above even the daytime permissible limit for traffic noise. A prediction model was used in the present study to predict the noise pollution level rather than Leq. Comparison of the predicted with the actual measured data demonstrated that the model has the ability to calibrate the multicomponent traffic noise and yields reliable results close to those obtained by direct measurement. Lastly, it is inferred that the dimension of traffic-generated noise pollution in Rourkela is critical.
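For reference, Lden can be computed from the period levels Lday, Levening and Lnight using the usual day-evening-night weighting (+5 dB evening and +10 dB night penalties over a 24-hour day); the sketch below shows the calculation with invented example values, not the Rourkela measurements.

```python
import math

def l_den(l_day, l_evening, l_night, day_h=12, evening_h=4, night_h=8):
    """Day-evening-night level: energy average of the period levels with
    +5 dB evening and +10 dB night penalties over a 24-hour day."""
    total = (day_h     * 10 ** (l_day / 10)
           + evening_h * 10 ** ((l_evening + 5) / 10)
           + night_h   * 10 ** ((l_night + 10) / 10))
    return 10 * math.log10(total / (day_h + evening_h + night_h))

# Invented example values in dB, not the measured Rourkela data.
print(round(l_den(78.0, 76.0, 72.0), 1))
```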
State criminal justice telecommunications (STACOM). Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Fielding, J. E.; Frewing, H. K.; Lee, J. J.; Leflang, W. G.; Reilly, N. B.
1977-01-01
Techniques for identifying user requirements and network designs for statewide criminal justice networks are discussed. Topics covered include methods for determining the data required; data collection and survey; data organization procedures; and methods for forecasting network traffic volumes. The network design techniques developed center on a computerized topology program which enables the user to generate least-cost network topologies that satisfy network traffic requirements, response time requirements and other specified functional requirements. The developed techniques were applied in Texas and Ohio, and results of these studies are presented.
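The report does not spell out the topology program's algorithm; as a generic, hedged illustration of least-cost topology generation, the sketch below grows a minimum-cost spanning tree over candidate links with Prim's algorithm, ignoring the traffic-volume and response-time constraints a real design tool would also enforce. The city names and distance-based cost are hypothetical.

```python
import heapq

def least_cost_tree(nodes, link_cost):
    """Prim's algorithm: grow a minimum-cost spanning tree over candidate links.
    A real design tool (such as the STACOM topology program) would additionally
    check traffic-volume and response-time constraints before accepting links."""
    start = nodes[0]
    in_tree, edges, frontier = {start}, [], []
    for n in nodes[1:]:
        heapq.heappush(frontier, (link_cost(start, n), start, n))
    while frontier and len(in_tree) < len(nodes):
        cost, a, b = heapq.heappop(frontier)
        if b in in_tree:
            continue                       # skip links into already-connected sites
        in_tree.add(b)
        edges.append((a, b, cost))
        for n in nodes:
            if n not in in_tree:
                heapq.heappush(frontier, (link_cost(b, n), b, n))
    return edges

# Hypothetical site coordinates; link cost proportional to distance.
sites = {"Austin": (0, 0), "Dallas": (2, 5), "Houston": (4, -1), "El Paso": (-9, 2)}
cost = lambda a, b: round(((sites[a][0] - sites[b][0])**2
                           + (sites[a][1] - sites[b][1])**2) ** 0.5, 2)
print(least_cost_tree(list(sites), cost))
```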