Colored Traveling Salesman Problem.
Li, Jun; Zhou, MengChu; Sun, Qirui; Dai, Xianzhong; Yu, Xiaolong
2015-11-01
The multiple traveling salesman problem (MTSP) is an important combinatorial optimization problem. It has been widely and successfully applied to practical cases in which multiple traveling individuals (salesmen) share a common workspace (city set). However, it cannot represent some application problems where multiple traveling individuals not only have their own exclusive tasks but also share a group of tasks with each other. This work proposes a new MTSP called the colored traveling salesman problem (CTSP) for handling such cases. Two types of city groups are defined: groups of exclusive cities, each of a single color, for individual salesmen to visit, and a group of shared cities of multiple colors that all salesmen may visit. Evidence shows that CTSP is NP-hard and that the multidepot MTSP and multiple single traveling salesman problems are its special cases. We present a genetic algorithm (GA) with dual-chromosome coding for CTSP and analyze the corresponding solution space. GA is then improved by incorporating greedy, hill-climbing (HC), and simulated annealing (SA) operations to achieve better performance. Experiments reveal the limitation of the exact solution method and compare the performance of the presented GAs. The results suggest that SAGA achieves the best solution quality and that HCGA offers a good tradeoff between solution quality and computing time. PMID:25494521
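The dual-chromosome coding described above can be sketched as follows. The city/color data and the function names are illustrative assumptions for a minimal example, not the authors' exact representation or operators:

```python
import random

# Hypothetical CTSP instance: cities 0-2 exclusive to salesman 0 (one color),
# cities 3-5 exclusive to salesman 1, and cities 6-7 shared (both colors).
color = {0: {0}, 1: {0}, 2: {0}, 3: {1}, 4: {1}, 5: {1}, 6: {0, 1}, 7: {0, 1}}

def random_individual(n_cities):
    """Dual-chromosome coding: one chromosome is a city permutation,
    the other assigns each city a salesman drawn from its color set."""
    perm = random.sample(range(n_cities), n_cities)
    assign = [random.choice(sorted(color[c])) for c in range(n_cities)]
    return perm, assign

def decode(perm, assign, n_salesmen):
    """Split the permutation into one visiting route per salesman."""
    routes = [[] for _ in range(n_salesmen)]
    for city in perm:
        routes[assign[city]].append(city)
    return routes

def feasible(assign):
    """Color constraint: each city's salesman must carry a matching color."""
    return all(assign[c] in color[c] for c in color)
```

Because assignments are drawn from each city's own color set, individuals are feasible by construction; crossover and mutation operators would act on the two chromosomes separately.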
Traveling Salesman Problem: A Foveating Pyramid Model
ERIC Educational Resources Information Center
Pizlo, Zygmunt; Stefanov, Emil; Saalweachter, John; Li, Zheng; Haxhimusa, Yll; Kropatsch, Walter G.
2006-01-01
We tested human performance on the Euclidean Traveling Salesman Problem using problems with 6-50 cities. Results confirmed our earlier findings that: (a) the time of solving a problem is proportional to the number of cities, and (b) the solution error grows very slowly with the number of cities. We formulated a new version of a pyramid model. The…
Neural Network Solves "Traveling-Salesman" Problem
NASA Technical Reports Server (NTRS)
Thakoor, Anilkumar P.; Moopenn, Alexander W.
1990-01-01
Experimental electronic neural network solves "traveling-salesman" problem. Plans round trip of minimum distance among N cities, visiting every city once and only once (without backtracking). This problem is paradigm of many problems of global optimization (e.g., routing or allocation of resources) occurring in industry, business, and government. Applied to large number of cities (or resources), circuits of this kind expected to solve problem faster and more cheaply.
Diffusive behavior of a greedy traveling salesman
NASA Astrophysics Data System (ADS)
Lipowski, Adam; Lipowska, Dorota
2011-06-01
Using Monte Carlo simulations we examine the diffusive properties of the greedy algorithm in the d-dimensional traveling salesman problem. Our results show that for d=3 and 4 the average squared distance from the origin
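The greedy construction the abstract studies can be sketched minimally as a nearest-neighbor tour, assuming Euclidean coordinates:

```python
import math

def greedy_tour(cities):
    """Nearest-neighbor construction: from the current city, always hop
    to the closest unvisited city, then implicitly return to the start."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def tour_length(cities, tour):
    """Total length of the closed tour (includes the return leg)."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))
```

The simulations in the paper track the salesman's displacement under this rule; the sketch only shows the tour-construction step itself.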
Adapting the traveling salesman problem to an adiabatic quantum computer
NASA Astrophysics Data System (ADS)
Warren, Richard H.
2013-04-01
We show how to guide a quantum computer to select an optimal tour for the traveling salesman. This is significant because it opens a rapid solution method for the wide range of applications of the traveling salesman problem, which include vehicle routing, job sequencing and data clustering.
The cost-constrained traveling salesman problem
Sokkappa, P.R.
1990-10-01
The Cost-Constrained Traveling Salesman Problem (CCTSP) is a variant of the well-known Traveling Salesman Problem (TSP). In the TSP, the goal is to find a tour of a given set of cities such that the total cost of the tour is minimized. In the CCTSP, each city is given a value, and a fixed cost-constraint is specified. The objective is to find a subtour of the cities that achieves maximum value without exceeding the cost-constraint. Thus, unlike the TSP, the CCTSP requires both selection and sequencing. As a consequence, most results for the TSP cannot be extended to the CCTSP. We show that the CCTSP is NP-hard and that no K-approximation algorithm or fully polynomial approximation scheme exists, unless P = NP. We also show that several special cases are polynomially solvable. Algorithms for the CCTSP, which outperform previous methods, are developed in three areas: upper bounding methods, exact algorithms, and heuristics. We found that a bounding strategy based on the knapsack problem performs better, both in speed and in the quality of the bounds, than methods based on the assignment problem. Likewise, we found that a branch-and-bound approach using the knapsack bound was superior to a method based on a common branch-and-bound method for the TSP. In our study of heuristic algorithms, we found that, when selecting nodes for inclusion in the subtour, it is important to consider the "neighborhood" of the nodes. A node with low value that brings the subtour near many other nodes may be more desirable than an isolated node of high value. We found two types of repetition to be desirable: repetitions based on randomization in the subtour building process, and repetitions encouraging the inclusion of different subsets of the nodes. By varying the number and type of repetitions, we can adjust the computation time required by our method to obtain algorithms that outperform previous methods.
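The selection-plus-sequencing flavor of the CCTSP can be illustrated with a greedy value-per-cost insertion heuristic. This sketch is an assumption for illustration only, not one of the paper's algorithms:

```python
def tour_cost(tour, dist):
    """Cost of the closed subtour over a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def greedy_cctsp(dist, values, budget, start=0):
    """Repeatedly insert the city with the best value per unit of added
    tour cost, at its cheapest insertion point, while staying in budget."""
    tour = [start]
    remaining = set(range(len(values))) - {start}
    while remaining:
        best = None  # (ratio, city, position)
        cost = tour_cost(tour, dist)
        for c in remaining:
            for pos in range(1, len(tour) + 1):
                cand = tour[:pos] + [c] + tour[pos:]
                extra = tour_cost(cand, dist) - cost
                if cost + extra <= budget:
                    ratio = values[c] / (extra + 1e-12)
                    if best is None or ratio > best[0]:
                        best = (ratio, c, pos)
        if best is None:
            break  # no affordable city remains
        _, c, pos = best
        tour = tour[:pos] + [c] + tour[pos:]
        remaining.remove(c)
    return tour
```

Note how the heuristic must simultaneously decide which cities to keep (selection) and where to place them (sequencing), the coupling the abstract highlights.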
The traveling salesman problem: a hierarchical model.
Graham, S M; Joshi, A; Pizlo, Z
2000-10-01
Our review of prior literature on spatial information processing in perception, attention, and memory indicates that these cognitive functions involve similar mechanisms based on a hierarchical architecture. The present study extends the application of hierarchical models to the area of problem solving. First, we report results of an experiment in which human subjects were tested on a Euclidean traveling salesman problem (TSP) with 6 to 30 cities. The subjects' solutions were either optimal or near-optimal in length and were produced in a time that was, on average, a linear function of the number of cities. Next, the performance of the subjects is compared with that of five representative artificial intelligence and operations research algorithms that produce approximate solutions for Euclidean problems. None of these algorithms was found to be an adequate psychological model. Finally, we present a new algorithm for solving the TSP, which is based on a hierarchical pyramid architecture. The performance of this new algorithm is quite similar to the performance of the subjects. PMID:11126941
Development of the PEBLebl Traveling Salesman Problem Computerized Testbed
ERIC Educational Resources Information Center
Mueller, Shane T.; Perelman, Brandon S.; Tan, Yin Yin; Thanasuan, Kejkaew
2015-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem that requires finding the shortest path through a set of points ("cities") that returns to the starting point. Because humans provide heuristic near-optimal solutions to Euclidean versions of the problem, it has sometimes been used to investigate human visual…
The ordered clustered travelling salesman problem: a hybrid genetic algorithm.
Ahmed, Zakir Hussain
2014-01-01
The ordered clustered travelling salesman problem is a variation of the usual travelling salesman problem in which a set of vertices (except the starting vertex) of the network is divided into some prespecified clusters. The objective is to find the least cost Hamiltonian tour in which vertices of any cluster are visited contiguously and the clusters are visited in the prespecified order. The problem is NP-hard, and it arises in practical transportation and sequencing problems. This paper develops a hybrid genetic algorithm using sequential constructive crossover, 2-opt search, and a local search for obtaining heuristic solution to the problem. The efficiency of the algorithm has been examined against two existing algorithms for some asymmetric and symmetric TSPLIB instances of various sizes. The computational results show that the proposed algorithm is very effective in terms of solution quality and computational time. Finally, we present solution to some more symmetric TSPLIB instances. PMID:24701148
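The 2-opt search used as a component of the hybrid GA above can be sketched as a generic 2-opt pass over a distance matrix; this is the standard move, not the paper's exact implementation:

```python
def two_opt(tour, dist):
    """2-opt local search: keep reversing a tour segment whenever the
    reversal shortens the closed tour, until no improving move remains."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # skip j = n-1 when i = 0: those two edges are adjacent via wraparound
            for j in range(i + 2, n - (i == 0)):
                a, b = tour[i], tour[(i + 1) % n]
                c, d = tour[j], tour[(j + 1) % n]
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```

In the clustered variant, the same move would additionally have to preserve cluster contiguity and ordering, which the sketch does not enforce.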
A quantum heuristic algorithm for the traveling salesman problem
NASA Astrophysics Data System (ADS)
Bang, Jeongho; Ryu, Junghee; Lee, Changhyoup; Yoo, Seokwon; Lim, James; Lee, Jinhyoung
2012-12-01
We propose a quantum heuristic algorithm to solve the traveling salesman problem by generalizing the Grover search. Sufficient conditions are derived to greatly enhance the probability of finding the tours with the cheapest costs, raising it almost to unity. These conditions are characterized by the statistical properties of tour costs and are shown to be automatically satisfied in the large-number limit of cities. In particular, for a continuous distribution of the tours along the cost, we show that the quantum heuristic algorithm exhibits a quadratic speedup compared to its classical counterpart.
Simulated annealing with probabilistic analysis for solving traveling salesman problems
NASA Astrophysics Data System (ADS)
Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process of recrystallization of metals; its efficiency is therefore highly affected by the annealing schedule. In this paper, we present an empirical study to provide a comparable annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and we propose the best annealing schedule found, based on the Post Hoc test. SA was tested on seven selected benchmark problems of the symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically alongside benchmark solutions and a simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides good solution quality.
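A minimal SA sketch for the symmetric Euclidean TSP, assuming a geometric cooling schedule and segment-reversal moves; the parameter values are placeholders, not the schedule the study tunes:

```python
import math
import random

def sa_tsp(cities, t0=10.0, alpha=0.995, steps=20000, seed=1):
    """Simulated annealing for the symmetric Euclidean TSP: propose a
    random segment reversal, accept by the Metropolis rule, and cool the
    temperature geometrically (temp *= alpha each step)."""
    rng = random.Random(seed)
    n = len(cities)
    def length(t):
        return sum(math.dist(cities[t[i]], cities[t[(i + 1) % n]])
                   for i in range(n))
    tour = list(range(n))
    cur = length(tour)
    best, best_len = tour[:], cur
    temp = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        cand_len = length(cand)
        # Metropolis acceptance: always take improvements, sometimes worse
        if cand_len < cur or rng.random() < math.exp((cur - cand_len) / temp):
            tour, cur = cand, cand_len
            if cur < best_len:
                best, best_len = tour[:], cur
        temp *= alpha
    return best, best_len
```

The annealing schedule here is exactly the knob the abstract studies: the initial temperature t0, the cooling factor alpha, and the step budget together determine solution quality.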
Solving large scale traveling salesman problems by chaotic neurodynamics.
Hasegawa, Mikio; Ikeguchi, Tohru; Aihara, Kazuyuki
2002-03-01
We propose a novel approach for solving large scale traveling salesman problems (TSPs) by chaotic dynamics. First, we realize the tabu search on a neural network, by utilizing the refractory effects as the tabu effects. Then, we extend it to a chaotic neural network version. We propose two types of chaotic searching methods, which are based on two different tabu searches. While the first one requires neurons of the order of n^2 for an n-city TSP, the second one requires only n neurons. Moreover, an automatic parameter tuning method of our chaotic neural network is presented for easy application to various problems. Last, we show that our method with n neurons is applicable to large TSPs such as an 85,900-city problem and exhibits better performance than the conventional stochastic searches and the tabu searches. PMID:12022514
Modified reactive tabu search for the symmetric traveling salesman problems
NASA Astrophysics Data System (ADS)
Lim, Yai-Fung; Hong, Pei-Yee; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Reactive tabu search (RTS) is an improved variant of tabu search (TS) that dynamically adjusts the tabu list size based on how the search is performing, thereby avoiding a disadvantage of TS, namely the need to tune the tabu list size. In this paper, we propose a modified RTS approach for solving symmetric traveling salesman problems (TSP). The tabu list size of the proposed algorithm depends on the number of iterations for which solutions fail to override the aspiration level, so as to achieve a good balance between diversification and intensification. The proposed algorithm was tested on seven chosen benchmark problems of the symmetric TSP. Its performance is compared with that of TS by using empirical testing, benchmark solutions, and a simple probabilistic analysis in order to validate solution quality. The computational results and comparisons show that the proposed algorithm provides better solution quality than TS.
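A toy sketch of the reactive idea: a 2-opt tabu search whose tenure grows when previously seen tours reappear (diversify) and shrinks otherwise (intensify). The reactive rule, aspiration criterion, and all parameters here are illustrative assumptions; the paper's exact rule differs:

```python
import math

def reactive_tabu_tsp(cities, iters=60):
    """Toy reactive tabu search for the symmetric TSP: full 2-opt
    neighborhood, moves tabu for `tenure` iterations, aspiration for a
    new best tour, and a tenure adapted by revisit detection."""
    n = len(cities)
    dist = [[math.dist(p, q) for q in cities] for p in cities]
    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))
    def apply_move(t, i, j):
        return t[:i] + t[i:j + 1][::-1] + t[j + 1:]
    tour = list(range(n))
    best, best_len = tour[:], length(tour)
    tabu, tenure, seen = {}, 1, set()
    for it in range(iters):
        key = tuple(tour)
        # reactive adjustment: revisiting a tour means we need more tabu
        tenure = min(tenure + 1, n) if key in seen else max(tenure - 1, 1)
        seen.add(key)
        cand_move, cand_len = None, float("inf")
        for i in range(n - 1):
            for j in range(i + 1, n):
                l = length(apply_move(tour, i, j))
                tabu_active = tabu.get((i, j), -1) >= it
                if tabu_active and l >= best_len:  # aspiration criterion
                    continue
                if l < cand_len:
                    cand_move, cand_len = (i, j), l
        if cand_move is None:
            break
        tour = apply_move(tour, *cand_move)
        tabu[cand_move] = it + tenure
        if cand_len < best_len:
            best, best_len = tour[:], cand_len
    return best, best_len
```

Unlike SA, the search always takes the best non-tabu neighbor, even when it is worse than the current tour; the tabu list is what prevents immediate cycling.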
Experience with two parallel programs solving the traveling salesman problem
Mohan, J.
1983-01-01
The traveling salesman problem is solved on CM*, a multiprocessor system, using two parallel search programs based on the branch and bound algorithm of Little, Murty, Sweeny and Karel. One of these programs is synchronous and has a master-slave process structure, while the other is asynchronous and has an egalitarian structure. The absolute execution times and the speedups of the two programs differ significantly. Their execution times differ because of the difference in their process structure. Their speedups differ because they require different amounts of computation to solve the same problem. This difference in the amount of computation is explained by their different heuristic granularities. The difference between the speedup of the asynchronous second program and linear speedup is attributed to processors idling owing to resource contention.
Water flow algorithm decision support tool for travelling salesman problem
NASA Astrophysics Data System (ADS)
Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd
2016-08-01
This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP), which helps researchers working in this area obtain better results from a proposed algorithm. A study was conducted using the Rapid Application Development (RAD) model as the methodology, which includes requirement planning, user design, construction, and cutover. A Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm in this study and is evaluated for effectiveness against TSP cases. The DST was evaluated through usability testing covering system use, quality of information, quality of interface, and overall satisfaction. This evaluation determines whether the tool can assist users in deciding whether to solve TSP problems with the proposed algorithm. Statistical results show the tool's ability to help researchers conduct experiments on the WFA with improved TSP initialization.
Solving the Traveling Salesman's Problem Using the African Buffalo Optimization
Odili, Julius Beneoluchi; Mohmad Kahar, Mohd Nizam
2016-01-01
This paper proposes the African Buffalo Optimization (ABO), a new metaheuristic algorithm derived from careful observation of the African buffalos, a species of wild cows, in the African forests and savannahs. This animal displays uncommon intelligence, strategic organizational skills, and exceptional navigational ingenuity in its traversal of the African landscape in search of food. The African Buffalo Optimization builds a mathematical model from the behavior of this animal and uses the model to solve 33 benchmark symmetric Traveling Salesman's Problem instances and six difficult asymmetric instances from the TSPLIB. This study shows that buffalos are able to ensure excellent exploration and exploitation of the search space through regular communication, cooperation, and good memory of their previous personal exploits, as well as by tapping the herd's collective exploits. The results obtained by using the ABO to solve these TSP cases were benchmarked against the results obtained by using other popular algorithms. The results obtained using the African Buffalo Optimization algorithm are very competitive. PMID:26880872
A Simple Algorithm for the Metric Traveling Salesman Problem
NASA Technical Reports Server (NTRS)
Grimm, M. J.
1984-01-01
An algorithm was designed for a wire-list net-sort problem; a branch-and-bound algorithm for the metric traveling salesman problem is presented for this purpose. The algorithm is a best-bound-first recursive descent in which the bound is based on the triangle inequality. The bounded subsets are defined by the relative order of the first K of the N cities (i.e., a K-city subtour). When K equals N, the bound is the length of the tour. The algorithm is implemented as a one-page subroutine written in the C programming language for the VAX 11/750. Average execution times for randomly selected planar points using the Euclidean metric are 0.01, 0.05, 0.42, and 3.13 seconds for ten, fifteen, twenty, and twenty-five cities, respectively. Maximum execution times over a hundred cases are less than eleven times the averages. The speed of the algorithm is due to an initial ordering algorithm that is an O(N^2) operation. The algorithm also solves the related problem where the tour does not return to the starting city and the starting and/or ending cities may be specified. It is possible to extend the algorithm to solve a nonsymmetric problem satisfying the triangle inequality.
List-Based Simulated Annealing Algorithm for Traveling Salesman Problem
Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen
2016-01-01
Simulated annealing (SA) algorithm is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor for its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust on a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms. PMID:27034650
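One plausible reading of the list-based cooling schedule is sketched below: the maximum temperature in the list drives Metropolis acceptance, and it is then replaced by a statistic of the worse moves actually accepted. The exact list-update rule is an assumption, not necessarily the authors' scheme:

```python
import math

def lbsa_step(tour, length, temps, rng, inner=100):
    """One cooling step of a list-based SA sketch: use max(temps) for
    Metropolis acceptance over `inner` proposals, then replace that
    maximum with the average 'equivalent temperature' of the accepted
    worse moves, so the list adapts to the solution-space topology."""
    t = max(temps)
    cur = length(tour)
    bad_temps = []
    for _ in range(inner):
        i, j = sorted(rng.sample(range(len(tour)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        delta = length(cand) - cur
        if delta <= 0:
            tour, cur = cand, cur + delta
            continue
        r = rng.random()
        if r > 0.0 and r < math.exp(-delta / t):
            # temperature at which this move would be accepted with prob r
            bad_temps.append(-delta / math.log(r))
            tour, cur = cand, cur + delta
    if bad_temps:
        # each recorded temperature is below t, so the list only cools
        temps[temps.index(t)] = sum(bad_temps) / len(bad_temps)
    return tour, cur
```

The appeal of this scheme is that no explicit cooling rate is specified: the decrease of temperature emerges from the costs of the moves the search actually encounters.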
Fast marching methods for the continuous traveling salesman problem
Andrews, J.; Sethian, J.A.
2008-12-01
We consider a problem in which we are given a domain, a cost function which depends on position at each point in the domain, and a subset of points ('cities') in the domain. The goal is to determine the cheapest closed path that visits each city in the domain once. This can be thought of as a version of the Traveling Salesman Problem, in which an underlying known metric determines the cost of moving through each point of the domain, but in which the actual shortest path between cities is unknown at the outset. We describe algorithms for both a heuristic and an optimal solution to this problem. The worst-case complexity of the heuristic algorithm is O(M · N log N), where M is the number of cities and N the size of the computational mesh used to approximate the solutions to the shortest-path problems. The average runtime of the heuristic algorithm is linear in the number of cities and O(N log N) in the size N of the mesh.
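The two-stage structure (compute city-to-city costs through the weighted domain, then build a tour on those costs) can be sketched with Dijkstra on a discrete grid standing in for the fast marching eikonal solver; the grid discretization and function names are illustrative assumptions:

```python
import heapq

def grid_costs(weight, src):
    """Dijkstra on a 4-connected grid: stepping into a cell costs that
    cell's weight.  A discrete stand-in for the fast marching solver."""
    rows, cols = len(weight), len(weight[0])
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + weight[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist

def city_tour(weight, cities):
    """Nearest-neighbor tour over the mesh-derived city-to-city costs."""
    cost = {a: grid_costs(weight, a) for a in cities}
    tour, todo = [cities[0]], set(cities[1:])
    while todo:
        nxt = min(todo, key=lambda b: cost[tour[-1]][b])
        todo.remove(nxt)
        tour.append(nxt)
    return tour
```

One shortest-path computation per city gives the M·(mesh solve) term of the complexity bound; the tour-building heuristic then works entirely on the resulting cost table.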
An Analysis of the Fitness Landscape of Travelling Salesman Problem.
Tayarani-N, Mohammad-H; Prügel-Bennett, Adam
2016-01-01
The fitness landscape of the travelling salesman problem is investigated for 11 different types of the problem. The types differ in how the distances between cities are generated. Many different properties of the landscape are studied. The properties chosen are all potentially relevant to choosing an appropriate search algorithm. The analysis includes a scaling study of the time to reach a local optimum, the number of local optima, the expected probability of reaching a local optimum as a function of its fitness, the expected fitness found by local search and the best fitness, the probability of reaching a global optimum, the distance between the local optima and the global optimum, the expected fitness as a function of the distance from an optimum, their basins of attraction and a principal component analysis of the local optima. The principal component analysis shows the correlation of the local optima in the component space. We show how the properties of the principal components of the local optima change from one problem type to another. PMID:26066806
Stability of Solutions to Classes of Traveling Salesman Problems.
Niendorf, Moritz; Kabamba, Pierre T; Girard, Anouck R
2016-04-01
By performing stability analysis on an optimal tour for problems belonging to classes of the traveling salesman problem (TSP), this paper derives margins of optimality for a solution with respect to disturbances in the problem data. Specifically, we consider the asymmetric sequence-dependent TSP, where the sequence dependence is driven by the dynamics of a stack. This is a generalization of the symmetric non-sequence-dependent version of the TSP. Furthermore, we also consider the symmetric sequence-dependent variant and the asymmetric non-sequence-dependent variant. Amongst others, these problems have applications in logistics and unmanned aircraft mission planning. Changing external conditions such as traffic or weather may alter task costs, which can render an initially optimal itinerary suboptimal. Instead of optimizing the itinerary every time task costs change, stability criteria allow for fast evaluation of whether itineraries remain optimal. This paper develops a method to compute stability regions for the best tour in a set of tours for the symmetric TSP and extends the results to the asymmetric problem as well as their sequence-dependent counterparts. As the TSP is NP-hard, heuristic methods are frequently used to solve it. The presented approach is also applicable to analyze stability regions for a tour obtained through application of the k-opt heuristic with respect to the k-neighborhood. A dimensionless criticality metric for edges is proposed, such that a high criticality of an edge indicates that the optimal tour is more susceptible to cost changes in that edge. Multiple examples demonstrate the application of the developed stability computation method as well as the edge criticality measure that facilitates an intuitive assessment of instances of the TSP. PMID:25910270
Use of a Colony of Cooperating Agents and MAPLE To Solve the Traveling Salesman Problem.
ERIC Educational Resources Information Center
Guerrieri, Bruno
This paper reviews an approach for finding optimal solutions to the traveling salesman problem, a well-known problem in combinatorial optimization, and describes implementing the approach using the MAPLE computer algebra system. The method employed in this approach to the problem is similar to the way ant colonies manage to establish shortest…
Extended Tabu Search on Fuzzy Traveling Salesman Problem in Multi-criteria Analysis
NASA Astrophysics Data System (ADS)
Zheng, Yujun
The paper proposes an extended tabu search algorithm for the traveling salesman problem (TSP) with fuzzy edge weights. The algorithm considers three important fuzzy ranking criteria including expected value, optimistic value and pessimistic value, and performs a three-stage search towards the Pareto front, involving a preferred criterion at each stage. Simulations demonstrate that our approach can produce a set of near optimal solutions for fuzzy TSP instances with up to 750 uniformly randomly generated nodes.
An argument for abandoning the travelling salesman problem as a neural-network benchmark.
Smith, K
1996-01-01
In this paper, a distinction is drawn between research which assesses the suitability of the Hopfield network for solving the travelling salesman problem (TSP) and research which attempts to determine the effectiveness of the Hopfield network as an optimization technique. It is argued that the TSP is generally misused as a benchmark for the latter goal, with the existence of an alternative linear formulation giving rise to unreasonable comparisons. PMID:18263553
Electronic neural network for solving traveling salesman and similar global optimization problems
NASA Technical Reports Server (NTRS)
Thakoor, Anilkumar P. (Inventor); Moopenn, Alexander W. (Inventor); Duong, Tuan A. (Inventor); Eberhardt, Silvio P. (Inventor)
1993-01-01
This invention is a novel high-speed neural network based processor for solving the 'traveling salesman' and other global optimization problems. It comprises a novel hybrid architecture employing a binary synaptic array whose embodiment incorporates the fixed rules of the problem, such as the number of cities to be visited. The array is prompted by analog voltages representing variables such as distances. The processor incorporates two interconnected feedback networks, each of which solves part of the problem independently and simultaneously, yet which exchange information dynamically.
An analogue approach to the travelling salesman problem using an elastic net method
NASA Astrophysics Data System (ADS)
Durbin, Richard; Willshaw, David
1987-04-01
The travelling salesman problem [1] is a classical problem in the field of combinatorial optimization, concerned with efficient methods for maximizing or minimizing a function of many independent variables. Given the positions of N cities, which in the simplest case lie in the plane, what is the shortest closed tour in which each city can be visited once? We describe how a parallel analogue algorithm, derived from a formal model [2,3] for the establishment of topographically ordered projections in the brain [4-10], can be applied to the travelling salesman problem [1,11,12]. Using an iterative procedure, a circular closed path is gradually elongated non-uniformly until it eventually passes sufficiently near to all the cities to define a tour. This produces shorter tour lengths than another recent parallel analogue algorithm [13], scales well with the size of the problem, and is naturally extendable to a large class of optimization problems involving topographic mappings between geometrical structures [14].
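One iteration of a Durbin-Willshaw-style elastic net update can be sketched as follows: each path point is pulled toward the cities (weighted by a Gaussian of scale K, normalized per city) and toward its ring neighbours, with K annealed downward between iterations. Parameter values and function names are illustrative assumptions:

```python
import math

def elastic_net_step(cities, path, K, alpha=0.2, beta=2.0):
    """One elastic-net iteration: Gaussian-weighted attraction of path
    points to cities plus a ring-tension term; K is annealed outside."""
    m = len(path)
    # w[i][j]: normalized attraction of city i to path point j
    w = []
    for x, y in cities:
        phi = [math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * K * K))
               for px, py in path]
        s = sum(phi) or 1.0  # guard against underflow at tiny K
        w.append([p / s for p in phi])
    new_path = []
    for j, (px, py) in enumerate(path):
        fx = sum(w[i][j] * (cities[i][0] - px) for i in range(len(cities)))
        fy = sum(w[i][j] * (cities[i][1] - py) for i in range(len(cities)))
        lx, ly = path[(j - 1) % m]
        rx, ry = path[(j + 1) % m]
        # discrete second difference keeps the ring short and smooth
        new_path.append((px + alpha * fx + beta * K * (rx - 2 * px + lx),
                         py + alpha * fy + beta * K * (ry - 2 * py + ly)))
    return new_path
```

Iterating this step while gradually shrinking K elongates the initial small ring until every city has captured a nearby path point, which then defines the tour order.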
Large neighborhood search for the double traveling salesman problem with multiple stacks
Bent, Russell W; Van Hentenryck, Pascal
2009-01-01
This paper considers a complex real-life short-haul/long-haul pickup and delivery application. The problem can be modeled as a double traveling salesman problem (TSP) in which the pickups and the deliveries happen in the first and second TSPs, respectively. Moreover, the application features multiple stacks in which the items must be stored, and the pickups and deliveries must take place in reverse (LIFO) order for each stack. The goal is to minimize the total travel time satisfying these constraints. This paper presents a large neighborhood search (LNS) algorithm which improves the best-known results on 65% of the available instances and is always within 2% of the best-known solutions.
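The LIFO stack constraint on the two tours can be checked directly: the pickup tour pushes each item onto its stack, and the delivery tour must always remove the current top of that item's stack. Names here are illustrative:

```python
def lifo_feasible(pickup_order, delivery_order, stack_of):
    """Check the LIFO constraint of the double TSP with multiple stacks:
    every delivery must pop the top of the item's assigned stack."""
    stacks = {}
    for item in pickup_order:
        stacks.setdefault(stack_of[item], []).append(item)
    for item in delivery_order:
        s = stacks[stack_of[item]]
        if not s or s[-1] != item:
            return False  # item is buried below later pickups
        s.pop()
    return True
```

An LNS move that reorders either tour (or reassigns items to stacks) only needs this check to stay within the feasible region.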
Code of Federal Regulations, 2012 CFR
2012-04-01
... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Agent-driver or commission-driver, full-time... commission-driver, full-time life insurance salesman, home worker, or traveling or city salesman. (a) General... work performed must not be a single transaction. Part-time and regular seasonal work may be...
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Agent-driver or commission-driver, full-time... commission-driver, full-time life insurance salesman, home worker, or traveling or city salesman. (a) General... work performed must not be a single transaction. Part-time and regular seasonal work may be...
Code of Federal Regulations, 2013 CFR
2013-04-01
... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Agent-driver or commission-driver, full-time... commission-driver, full-time life insurance salesman, home worker, or traveling or city salesman. (a) General... work performed must not be a single transaction. Part-time and regular seasonal work may be...
Code of Federal Regulations, 2011 CFR
2011-04-01
... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Agent-driver or commission-driver, full-time... commission-driver, full-time life insurance salesman, home worker, or traveling or city salesman. (a) General... work performed must not be a single transaction. Part-time and regular seasonal work may be...
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Agent-driver or commission-driver, full-time... commission-driver, full-time life insurance salesman, home worker, or traveling or city salesman. (a) General... work performed must not be a single transaction. Part-time and regular seasonal work may be...
Fault-tolerance of a neural network solving the traveling salesman problem
NASA Technical Reports Server (NTRS)
Protzel, P.; Palumbo, D.; Arras, M.
1989-01-01
This study presents the results of a fault-injection experiment that simulates a neural network solving the Traveling Salesman Problem (TSP). The network is based on a modified version of Hopfield's and Tank's original method. We define a performance characteristic for the TSP that allows an overall assessment of the solution quality for different city distributions and problem sizes. Five different 10-, 20-, and 30-city cases are used for the injection of up to 13 simultaneous stuck-at-0 and stuck-at-1 faults. The results of more than 4000 simulation runs show the extreme fault tolerance of the network, especially with respect to stuck-at-0 faults. One possible explanation for this overall surprising result is the redundancy of the problem representation.
Finite Size and Dimensional Dependence in the Euclidean Traveling Salesman Problem
NASA Astrophysics Data System (ADS)
Percus, Allon G.; Martin, Olivier C.
1996-02-01
We consider the Euclidean traveling salesman problem for N cities randomly distributed in the unit d-dimensional hypercube, and investigate the finite-size scaling of the mean optimal tour length L_E. With toroidal boundary conditions we find, motivated by a remarkable universality in the kth-nearest-neighbor distribution, that L_E(d = 2) = (0.7120 ± 0.0002) N^{1/2} [1 + O(1/N)] and L_E(d = 3) = (0.6979 ± 0.0002) N^{2/3} [1 + O(1/N)]. We then consider a mean-field approach in the limit N → ∞, which we find to be a good approximation (the error being less than 2.1% at d = 1, 2, and 3), and which suggests that L_E(d) = N^{1-1/d} (d/2πe)^{1/2} (πd)^{1/(2d)} [1 + O(1/d)] at large d.
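The √N growth of the two-dimensional tour length reported above is easy to probe numerically. The stdlib-only sketch below uses a greedy nearest-neighbor heuristic rather than optimal tours, so the prefactor it measures sits well above the 0.7120 quoted for optimal tours; it illustrates only the N^{1/2} scaling, and the heuristic and instance sizes are illustrative choices, not the authors' method.

```python
import math
import random

def nn_tour_length(points):
    """Length of a greedy nearest-neighbor tour over a list of (x, y) points."""
    unvisited = points[1:]
    cur = points[0]
    total = 0.0
    while unvisited:
        nxt = min(unvisited, key=lambda p: math.dist(cur, p))
        total += math.dist(cur, nxt)
        unvisited.remove(nxt)
        cur = nxt
    total += math.dist(cur, points[0])  # close the tour
    return total

random.seed(0)
for n in (100, 400, 1600):
    pts = [(random.random(), random.random()) for _ in range(n)]
    # For uniform points in the unit square, tour length grows like c * sqrt(n),
    # so this ratio should hover around a heuristic-specific constant.
    print(n, round(nn_tour_length(pts) / math.sqrt(n), 3))
```

As n grows, the printed ratio stabilizes, mirroring the L_E ∝ N^{1/2} law; the constant is heuristic-dependent and larger than the optimal-tour value.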
An Integer-Coded Chaotic Particle Swarm Optimization for Traveling Salesman Problem
NASA Astrophysics Data System (ADS)
Yue, Chen; Yan-Duo, Zhang; Jing, Lu; Hui, Tian
The Traveling Salesman Problem (TSP) is one of the classic NP-hard combinatorial optimization problems; it suffers a combinatorial explosion once the problem exceeds a certain size, so the search for effective solution methods has long been a hot topic. The general mathematical model of the TSP is discussed, and a permutation-and-combination-based model is presented. Based on these, an Integer-coded Chaotic Particle Swarm Optimization (ICPSO) for solving the TSP is proposed, in which each particle is encoded as an integer sequence, a chaotic sequence guides the global search, and particles vary their positions by "flying." Taking a typical 20-city TSP as an instance, a simulation experiment comparing ICPSO with GA is carried out. Experimental results demonstrate that ICPSO is simple but effective, and outperforms GA.
A High-Performance Genetic Algorithm: Using Traveling Salesman Problem as a Case
Tsai, Chun-Wei; Tseng, Shih-Pang; Yang, Chu-Sing
2014-01-01
This paper presents a simple but efficient algorithm for reducing the computation time of genetic algorithm (GA) and its variants. The proposed algorithm is motivated by the observation that genes common to all the individuals of a GA have a high probability of surviving the evolution and ending up being part of the final solution; as such, they can be saved away to eliminate the redundant computations at the later generations of a GA. To evaluate the performance of the proposed algorithm, we use it not only to solve the traveling salesman problem but also to provide an extensive analysis on the impact it may have on the quality of the end result. Our experimental results indicate that the proposed algorithm can significantly reduce the computation time of GA and GA-based algorithms while limiting the degradation of the quality of the end result to a very small percentage compared to traditional GA. PMID:24892038
Cesari, G.
1994-12-31
The aim of this paper is to analyze experimentally the quality of the solutions obtained with dissection algorithms applied to the geometric Traveling Salesman Problem. Starting from Karp's results, we apply a divide-and-conquer strategy: first dividing the plane into subregions in which we calculate optimal subtours, and then merging these subtours to obtain the final tour. The analysis is restricted to problem instances whose points are uniformly distributed in the unit square. For relatively small sets of cities, we analyze the solution quality by calculating the length of the optimal tour and comparing it with our approximate solution. When the problem instance is too large, we perform an asymptotic analysis estimating the length of the optimal tour. We also apply the same dissection strategy to classical heuristics, calculating approximate subtours and comparing the results with the average quality of the heuristic. Our main result is an estimate of the rate of convergence of the approximate solution to the optimal solution as a function of the number of dissection steps, of the criterion used for dividing the plane, and of the quality of the subtours. We have implemented our programs on MUSIC (MUlti Signal processor system with Intelligent Communication), a Single-Program-Multiple-Data parallel computer with distributed memory developed at ETH Zurich.
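As a rough illustration of the dissection idea, the sketch below buckets uniform points into a k × k grid, routes each cell with a simple nearest-neighbor pass, and stitches the cells together in serpentine order. This is a deliberate simplification: Karp's actual scheme computes optimal subtours and merges them more carefully, and the grid size and instance here are arbitrary choices.

```python
import math
import random

def cell_tour(points, start):
    """Nearest-neighbor path through one cell's points, beginning near `start`."""
    path, cur = [], start
    remaining = list(points)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(cur, p))
        path.append(nxt)
        remaining.remove(nxt)
        cur = nxt
    return path

def dissection_tour(points, k):
    """Dissection-style tour (simplified): bucket points into a k x k grid,
    route each cell locally, then stitch cells in serpentine order."""
    cells = {}
    for p in points:
        i = min(int(p[0] * k), k - 1)
        j = min(int(p[1] * k), k - 1)
        cells.setdefault((i, j), []).append(p)
    tour, cur = [], (0.0, 0.0)
    for i in range(k):
        # alternate column direction so consecutive cells stay adjacent
        cols = range(k) if i % 2 == 0 else range(k - 1, -1, -1)
        for j in cols:
            seg = cell_tour(cells.get((i, j), []), cur)
            tour.extend(seg)
            if seg:
                cur = seg[-1]
    return tour

def tour_length(tour):
    return sum(math.dist(tour[t], tour[(t + 1) % len(tour)])
               for t in range(len(tour)))

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(500)]
print(round(tour_length(dissection_tour(pts, 4)), 2))
```

Increasing k trades subproblem quality against stitching overhead, which is exactly the convergence-rate question the paper studies.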
Finding long chains in kidney exchange using the traveling salesman problem.
Anderson, Ross; Ashlagi, Itai; Gamarnik, David; Roth, Alvin E
2015-01-20
As of May 2014 there were more than 100,000 patients on the waiting list for a kidney transplant from a deceased donor. Although the preferred treatment is a kidney transplant, every year there are fewer donors than new patients, so the wait for a transplant continues to grow. To address this shortage, kidney paired donation (KPD) programs allow patients with living but biologically incompatible donors to exchange donors through cycles or chains initiated by altruistic (nondirected) donors, thereby increasing the supply of kidneys in the system. In many KPD programs a centralized algorithm determines which exchanges will take place to maximize the total number of transplants performed. This optimization problem has proven challenging both in theory, because it is NP-hard, and in practice, because the algorithms previously used were unable to optimally search over all long chains. We give two new algorithms that use integer programming to optimally solve this problem, one of which is inspired by the techniques used to solve the traveling salesman problem. These algorithms provide the tools needed to find optimal solutions in practice. PMID:25561535
Hybrid water flow-like algorithm with Tabu search for traveling salesman problem
NASA Astrophysics Data System (ADS)
Bostamam, Jasmin M.; Othman, Zulaiha
2016-08-01
This paper presents a hybrid Water Flow-like Algorithm with Tabu Search for solving the travelling salesman problem (WFA-TS-TSP). WFA has proven its outstanding performance in solving the TSP, while TS is a conventional algorithm that has been used for decades on various combinatorial optimization problems, including the TSP. Hybridizing WFA with TS provides a better balance between exploration and exploitation, the key elements determining the performance of a metaheuristic. TS uses two different local searches, 2-opt and 3-opt, separately. The proposed WFA-TS-TSP is tested on 23 well-known benchmark symmetric TSP instances. The results show that WFA-TS-TSP produces significantly better-quality solutions than WFA, and that the 3-opt variant obtains the best solutions. From these results, we conclude that WFA can be further improved through hybridization or better local search techniques.
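The 2-opt move used as one of the local searches above is standard: delete two edges and reconnect the tour with the segment between them reversed whenever that shortens it. A minimal stdlib-only sketch (the instance and starting tour are illustrative, not from the paper):

```python
import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    tour = tour[:]
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            # for i == 0, stop at n - 2 so the two removed edges never share a node
            for j in range(i + 2, n if i > 0 else n - 1):
                a, b = pts[tour[i]], pts[tour[i + 1]]
                c, d = pts[tour[j]], pts[tour[(j + 1) % n]]
                # replace edges (a,b) and (c,d) with (a,c) and (b,d)
                if (math.dist(a, c) + math.dist(b, d)
                        < math.dist(a, b) + math.dist(c, d) - 1e-12):
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(30)]
start = list(range(30))
best = two_opt(start, pts)
print(round(tour_length(start, pts), 2), "->", round(tour_length(best, pts), 2))
```

3-opt generalizes this by removing three edges per move, which enlarges the neighborhood at higher cost per iteration.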
On one modification of traveling salesman problem oriented on application in atomic engineering
Chentsov, A. G.; Sesekin, A. N.; Shcheklein, S. E.; Tashlykov, O. L.
2010-10-25
We consider a mathematical model for minimizing the radiation dose received by personnel dismantling a unit of a nuclear power plant. The unit's elements are dismantled sequentially: having dismantled one element, a brigade of workers moves on to similar work on the next element. Restrictions are imposed on the sequence of the works: for certain pairs of jobs, the second job cannot be executed before the first. The problem resembles the classical traveling salesman problem, with the difference that the cost function depends on the list of outstanding works, and the sequence of jobs and the corresponding movements are subject to precedence constraints. A variant of the dynamic programming method is developed for this problem, and the corresponding software is created.
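The dynamic programming idea can be sketched as a Held-Karp-style bitmask recursion that only extends a partial sequence with jobs whose predecessors are already done. One simplification to note: here the cost depends only on consecutive jobs, whereas in the paper the cost function also depends on the list of outstanding works; the function names and toy instance are illustrative.

```python
from functools import lru_cache

def min_sequence(cost, prec):
    """Shortest Hamiltonian path over jobs 0..n-1 subject to precedence
    constraints: for each (a, b) in `prec`, job a must precede job b.
    `cost[i][j]` is the transition cost from job i to job j."""
    n = len(cost)
    # blocked_by[b] = bitmask of jobs that must be finished before b can start
    blocked_by = [0] * n
    for a, b in prec:
        blocked_by[b] |= 1 << a

    @lru_cache(maxsize=None)
    def best(done, last):
        if done == (1 << n) - 1:
            return 0.0
        res = float("inf")
        for j in range(n):
            # job j is eligible if not done and all its predecessors are done
            if not done & (1 << j) and blocked_by[j] & ~done == 0:
                res = min(res, cost[last][j] + best(done | (1 << j), j))
        return res

    # any job with no predecessors may start the sequence
    return min(best(1 << s, s) for s in range(n) if blocked_by[s] == 0)

# Toy instance: 4 jobs, job 2 may only be done after job 0
cost = [[0, 3, 9, 4],
        [3, 0, 2, 8],
        [9, 2, 0, 5],
        [4, 8, 5, 0]]
print(min_sequence(cost, prec=[(0, 2)]))  # 9.0, via the order 3 -> 0 -> 1 -> 2
```

The state space is O(2^n · n), which is exactly why such exact methods only reach modest job counts and motivate the heuristics surveyed elsewhere in this list.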
Automatic Combination of Operators in a Genetic Algorithm to Solve the Traveling Salesman Problem
2015-01-01
Genetic algorithms are powerful search methods inspired by Darwinian evolution. To date, they have been applied to many optimization problems because of their ease of use and their robustness in finding good solutions to difficult problems. The good performance of genetic algorithms is due in part to their two main variation operators, namely, the crossover and mutation operators. Typically, in the literature, we find the use of a single crossover and mutation operator. However, studies have shown that using multiple operators produces synergy and that the operators are mutually complementary. Using multiple operators is not a simple task because which operators to use and how to combine them must be determined, which is itself an optimization problem. In this paper, it is proposed that the task of exploring the different combinations of crossover and mutation operators can be carried out by evolutionary computing. The crossover and mutation operators used are those typically used for solving the traveling salesman problem. The process of searching for good combinations was effective, yielding appropriate and synergic combinations of the crossover and mutation operators. The numerical results show that using the combination of operators obtained by evolutionary computing is better than using a single operator or multiple operators combined in the standard way. The results were also better than those of the latest operators reported in the literature. PMID:26367182
NASA Technical Reports Server (NTRS)
Dahl, Roy W.; Keating, Karen; Salamone, Daryl J.; Levy, Laurence; Nag, Barindra; Sanborn, Joan A.
1987-01-01
This paper presents an algorithm (WHAMII) designed to solve the Artificial Intelligence Design Challenge at the 1987 AIAA Guidance, Navigation and Control Conference. The problem under consideration is a stochastic generalization of the traveling salesman problem in which travel costs can incur a penalty with a given probability. The variability in travel costs leads to a probability constraint with respect to violating the budget allocation. Given the small size of the problem (eleven cities), an approach is considered that combines partial tour enumeration with a heuristic city insertion procedure. For computational efficiency during both the enumeration and insertion procedures, precalculated binomial probabilities are used to determine an upper bound on the actual probability of violating the budget constraint for each tour. The actual probability is calculated for the final best tour, and additional insertions are attempted until the actual probability exceeds the bound.
Death of the (traveling) salesman: primates do not show clear evidence of multi-step route planning.
Janson, Charles
2014-05-01
Several comparative studies have linked larger brain size to a fruit-eating diet in primates and other animals. The general explanation for this correlation is that fruit is a complex resource base, consisting of many discrete patches of many species, each with distinct nutritional traits, the production of which changes predictably both within and between seasons. Using this information to devise optimal spatial foraging strategies is among the most difficult problems to solve in all of mathematics, a version of the famous Traveling Salesman Problem. Several authors have suggested that primates might use their large brains and complex cognition to plan foraging strategies that approximate optimal solutions to this problem. Three empirical studies have examined how captive primates move when confronted with the simplest version of the problem: a spatial array of equally valuable goals. These studies have all concluded that the subjects remember many food source locations and show very efficient travel paths; some authors also inferred that the subjects may plan their movements based on considering combinations of three or more future goals at a time. This analysis re-examines critically the claims of planned movement sequences from the evidence presented. The efficiency of observed travel paths is largely consistent with use of the simplest of foraging rules, such as visiting the nearest unused "known" resource. Detailed movement sequences by test subjects are most consistent with a rule that mentally sums spatial information from all unused resources in a given trial into a single "gravity" measure that guides movements to one destination at a time. PMID:23934927
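The "gravity" rule above is described only verbally; the sketch below is one possible formalization (inverse-square attraction toward all unused resources, then heading to the resource best aligned with the summed pull) and should be read as an assumption for illustration, not the author's model.

```python
import math

def gravity_next(pos, resources):
    """Hypothetical 'gravity' rule: sum inverse-square attraction vectors
    from all unused resources, then pick the unused resource whose
    direction is best aligned with that summed pull."""
    gx = gy = 0.0
    for (x, y) in resources:
        dx, dy = x - pos[0], y - pos[1]
        d = math.hypot(dx, dy)
        gx += dx / d ** 3  # (dx/d) * (1/d**2): unit direction, inverse-square weight
        gy += dy / d ** 3
    return max(resources,
               key=lambda r: ((r[0] - pos[0]) * gx + (r[1] - pos[1]) * gy)
               / math.hypot(r[0] - pos[0], r[1] - pos[1]))

def gravity_path(start, resources):
    """Visit every resource, choosing one destination at a time."""
    path, pos, left = [], start, list(resources)
    while left:
        nxt = gravity_next(pos, left)
        path.append(nxt)
        left.remove(nxt)
        pos = nxt
    return path

# A nearby pair pulls harder than a distant singleton, so it is visited first
print(gravity_path((0.0, 0.0), [(1.0, 0.1), (1.0, -0.1), (-3.0, 0.0)]))
```

On clustered layouts this rule and plain nearest-neighbor often produce the same paths, which is why the two are hard to tell apart from movement data alone.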
An Investigation of Starting Point Preferences in Human Performance on Traveling Salesman Problems
ERIC Educational Resources Information Center
MacGregor, James N.
2014-01-01
Previous studies have shown that people start traveling salesman problem tours significantly more often from boundary than from interior nodes. There are a number of possible reasons for such a tendency: first, it may arise as a direct result of the processes involved in tour construction; second, boundary points may be perceptually more salient than…
Zhu, Liping; Aono, Masashi; Kim, Song-Ju; Hara, Masahiko
2013-04-01
A single-celled, multi-nucleated amoeboid organism, a plasmodium of the true slime mold Physarum polycephalum, can perform sophisticated computing by exhibiting complex spatiotemporal oscillatory dynamics while deforming its amorphous body. We previously devised an "amoeba-based computer (ABC)" to quantitatively evaluate the optimization capability of the amoeboid organism in searching for a solution to the traveling salesman problem (TSP) under optical feedback control. In ABC, the organism changes its shape to find a high-quality solution (a relatively shorter TSP route) by alternately expanding and contracting its pseudopod-like branches, which exhibit local photoavoidance behavior. The quality of the solution serves as a measure of the optimality with which the organism maximizes its global body area (nutrient absorption) while minimizing the risk of being illuminated (exposure to aversive stimuli). ABC found a high-quality solution for the 8-city TSP with a high probability. However, it remained unclear whether intracellular communication among the branches of the organism is essential for computing. In this study, we conducted a series of control experiments using two individual cells (two single-celled organisms) to perform parallel searches in the absence of intercellular communication. We found that ABC drastically lost its ability to find a solution when it used two independent individuals. Interestingly, however, when two individuals were prepared by dividing one individual, they could still find a solution for a few tens of minutes; that is, the two divided individuals remained correlated even though they were spatially separated. These results suggest the presence of a long-term memory in the intrinsic dynamics of this organism and its significance in performing sophisticated computing. PMID:23438635
An Adaptive Evolutionary Algorithm for Traveling Salesman Problem with Precedence Constraints
Sung, Jinmo; Jeong, Bongju
2014-01-01
The traveling salesman problem with precedence constraints is one of the most notorious problems in terms of the efficiency of its solution approaches, even though it has a very wide range of industrial applications. We propose a new evolutionary algorithm that efficiently obtains good solutions by improving the search process. Our genetic operators guarantee the feasibility of solutions over the generations of the population, which significantly improves the computational efficiency even when combined with our flexible adaptive searching strategy. The efficiency of the algorithm is investigated by computational experiments. PMID:24701158
Multi-step routes of capuchin monkeys in a laser pointer traveling salesman task.
Howard, Allison M; Fragaszy, Dorothy M
2014-09-01
Prior studies have claimed that nonhuman primates plan their routes multiple steps in advance. However, a recent reexamination of multi-step route planning in nonhuman primates indicated that there is no evidence for planning more than one step ahead. We tested multi-step route planning in capuchin monkeys using a pointing device to "travel" to distal targets while stationary. This device enabled us to determine whether capuchins distinguish the spatial relationship between goals and themselves and spatial relationships between goals and the laser dot, allocentrically. In Experiment 1, two subjects were presented with identical food items in Near-Far (one item nearer to subject) and Equidistant (both items equidistant from subject) conditions with a laser dot visible between the items. Subjects moved the laser dot to the items using a joystick. In the Near-Far condition, one subject demonstrated a bias for items closest to self but the other subject chose efficiently. In the second experiment, subjects retrieved three food items in similar Near-Far and Equidistant arrangements. Both subjects preferred food items nearest the laser dot and showed no evidence of multi-step route planning. We conclude that these capuchins do not make choices on the basis of multi-step look ahead strategies. PMID:24700520
Zhang, Zili; Gao, Chao; Lu, Yuxiao; Liu, Yuxin; Liang, Mingxin
2016-01-01
The bi-objective Traveling Salesman Problem (bTSP) is an important problem in operations research, and its solutions can be widely applied in the real world. Many Multi-objective Ant Colony Optimization (MOACO) algorithms have been proposed to solve bTSPs; however, most of them suffer from premature convergence. This paper proposes an optimization strategy for MOACOs that initializes the pheromone matrix with prior knowledge from a Physarum-inspired Mathematical Model (PMM). PMM can find the shortest route between two nodes based on a positive-feedback mechanism. The optimized algorithms, named iPM-MOACOs, enhance the pheromone on short paths and promote the search ability of the ants. A series of experiments shows that the proposed strategy achieves a better compromise solution than the original MOACOs for solving bTSPs. PMID:26751562
Measurement, coordination, and the relativized a priori
NASA Astrophysics Data System (ADS)
Padovani, Flavia
2015-11-01
The problem of measurement is a central issue in the epistemology and methodology of the physical sciences. In recent literature on scientific representation, large emphasis has been put on the "constitutive role" played by measurement procedures as forms of representation. Despite its importance, this issue hardly finds any mention in writings on constitutive principles, viz. in Michael Friedman's account of relativized a priori principles. This issue, instead, was at the heart of Reichenbach's analysis of coordinating principles that has inspired Friedman's interpretation. This paper suggests that these procedures should have a part in an account of constitutive principles of science, and that they could be interpreted following the intuition originally present (but ultimately not fully developed) in Reichenbach's early work.
Predictive a priori pressure-dependent kinetics.
Jasper, Ahren W; Pelzer, Kenley M; Miller, James A; Kamarchik, Eugene; Harding, Lawrence B; Klippenstein, Stephen J
2014-12-01
The ability to predict the pressure dependence of chemical reaction rates would be a great boon to kinetic modeling of processes such as combustion and atmospheric chemistry. This pressure dependence is intimately related to the rate of collision-induced transitions in energy E and angular momentum J. We present a scheme for predicting this pressure dependence based on coupling trajectory-based determinations of moments of the E,J-resolved collisional transfer rates with the two-dimensional master equation. This completely a priori procedure provides a means for proceeding beyond the empiricism of prior work. The requisite microcanonical dissociation rates are obtained from ab initio transition state theory. Predictions for the CH4 = CH3 + H and C2H3 = C2H2 + H reaction systems are in excellent agreement with experiment. PMID:25477457
"A Priori" Assessment of Language Learning Tasks by Practitioners
ERIC Educational Resources Information Center
Westhoff, Gerard J.
2009-01-01
Teachers' competence to estimate the effectiveness of learning materials is important and often neglected in programmes for teacher education. In this lecture I will try to explore the possibilities of designing scaffolding instruments for a "priori" assessment of language learning tasks, based on insights from SLA and cognitive psychology, more…
The Influence of "a priori" Ideas on the Experimental Approach.
ERIC Educational Resources Information Center
Cauzinille-Marmeche, Evelyne; And Others
1985-01-01
Investigated the role of "a priori" ideas in planning experiments and data processing leading to inferences. Thirty-one students (ages 11-13) observed a "combustion/candle in a closed container" experiment and were asked to interpret sets of measurements. Findings, among others, show that children preferentially experiment on factors about which…
Bioluminescence tomography with structural and functional a priori information
NASA Astrophysics Data System (ADS)
Yan, Han; Unlu, Mehmet B.; Nalcioglu, Orhan; Gulsen, Gultekin
2010-02-01
Multispectral bioluminescence tomography (BLT) is one of the most promising approaches to recovering 3D tomographic images of the bioluminescence source distribution in vivo. In bioluminescence tomography, an internal light source such as luciferase is activated within a volume, and multiple-wavelength emission data from the internal bioluminescence sources are acquired for reconstruction. The underlying non-uniqueness problem associated with non-spectrally-resolved, intensity-based bioluminescence tomography was demonstrated by Dehghani et al., and it was also shown that, using a spectrally resolved technique, an accurate solution for the source distribution can be calculated from the measured data if both functional and anatomical a priori information are at hand. It is therefore highly desirable to develop an imaging system capable of simultaneously acquiring the optical and structural a priori information as well as the bioluminescence data. In this paper we present our first combined optical tomography and CT system, which consists of a cooled CCD camera (Perkin Elmer "cold blue"), laser launching units, and an X-ray CT (Dxray prototype). It is capable of acquiring non-contact diffuse optical tomography (DOT) data, which serve as the functional a priori information; X-ray CT images, which yield the structural information; and BLT images. Physical phantom experiments are designed to verify the system's accuracy, repeatability, and resolution. These studies show the feasibility of such an imaging system and its potential.
A priori estimates for the Hill and Dirac operators
NASA Astrophysics Data System (ADS)
Korotyaev, E.
2008-09-01
The Hill operator Ty = -y″ + q'( t) y is considered in L 2(ℝ), where q ∈ L 2(0, 1) is a periodic real potential. The spectrum of T is absolutely continuous and consists of bands separated by gaps. We obtain a priori estimates of gap lengths, effective masses, and action variables for the KDV equation. In the proof of these results, the analysis of a conformal mapping corresponding to quasimomentum of the Hill operator is used. Similar estimates for the Dirac operator are obtained.
First-arrival traveltime sound speed inversion with a priori information
Hooi, Fong Ming; Carson, Paul L.
2014-01-01
Purpose: A first-arrival travel-time sound speed algorithm presented by Tarantola [Inverse Problem Theory and Methods for Model Parameter Estimation (SIAM, Philadelphia, PA, 2005)] is adapted to the medical ultrasonics setting. Through specification of a covariance matrix for the object model, the algorithm allows for natural inclusion of physical a priori information about the object. The algorithm's ability to accurately and robustly reconstruct a complex sound speed distribution is demonstrated on simulation and experimental data using a limited aperture. Methods: The algorithm is first demonstrated generally in simulation with a numerical breast phantom imaged in different geometries. As this work is motivated by the authors' limited-aperture, dual-sided ultrasound breast imaging system, experimental data are acquired with a Verasonics system with dual 128-element linear L7-4 arrays. The transducers are automatically calibrated for use in the eikonal forward model. A priori information, such as knowledge of correlated regions within the object, is obtained via segmentation of B-mode images generated from synthetic aperture imaging. Results: As one illustration of the algorithm's facility for inclusion of a priori information, physically grounded regularization is demonstrated in simulation. The algorithm's practicality is then demonstrated through experimental realization in limited-aperture cases. Reconstructions of sound speed distributions of various complexity are improved through inclusion of a priori information. The sound speed maps are generally reconstructed with accuracy within a few m/s. Conclusions: This paper demonstrates the ability to form sound speed images using two opposed commercial linear arrays to mimic ultrasound image acquisition in the compressed mammographic geometry. The ability to create reasonably good speed-of-sound images in the compressed mammographic geometry allows images to be readily coregistered to tomosynthesis image volumes for
Fluorescence molecular-tomography reconstruction with a priori anatomical information
NASA Astrophysics Data System (ADS)
Zhou, Lu; Yazici, Birsen; Ntziachristos, Vasilis
2008-02-01
In this study, we combine a generalized Tikhonov regularization method with a priori anatomical information to reconstruct the concentration of fluorophores in mice with Chronic Obstructive Pulmonary Disease (COPD) from in vivo optical and Magnetic Resonance (MR) measurements. Generalized Tikhonov regularization incorporates a penalty term in the optimization formulation of the fluorescence molecular tomography (FMT) inverse problem. Our design involves two penalty terms that make use of a priori anatomical structural information from segmented MR images. The choice of penalty terms guides the reconstructed fluorophore concentration toward the region where it is expected to be, assures a smooth fluorophore distribution within tissue of the same type, and enhances the discontinuities between different tissue types. We compare our results with traditional Tikhonov regularization techniques in extensive simulations and demonstrate the performance of our approach on in vivo mouse data. The results show that the increased fluorophore concentration in the mouse lungs is consistent with the increased inflammatory response expected from the corresponding animal disease model.
A priori discretization quality metrics for distributed hydrologic modeling applications
NASA Astrophysics Data System (ADS)
Liu, Hongli; Tolson, Bryan; Craig, James; Shafii, Mahyar; Basu, Nandita
2016-04-01
In distributed hydrologic modelling, a watershed is treated as a set of small homogeneous units that address the spatial heterogeneity of the watershed being simulated. The ability of models to reproduce observed spatial patterns firstly depends on the spatial discretization, which is the process of defining homogeneous units in the form of grid cells, subwatersheds, or hydrologic response units etc. It is common for hydrologic modelling studies to simply adopt a nominal or default discretization strategy without formally assessing alternative discretization levels. This approach lacks formal justifications and is thus problematic. More formalized discretization strategies are either a priori or a posteriori with respect to building and running a hydrologic simulation model. A posteriori approaches tend to be ad-hoc and compare model calibration and/or validation performance under various watershed discretizations. The construction and calibration of multiple versions of a distributed model can become a seriously limiting computational burden. Current a priori approaches are more formalized and compare overall heterogeneity statistics of dominant variables between candidate discretization schemes and input data or reference zones. While a priori approaches are efficient and do not require running a hydrologic model, they do not fully investigate the internal spatial pattern changes of variables of interest. Furthermore, the existing a priori approaches focus on landscape and soil data and do not assess impacts of discretization on stream channel definition even though its significance has been noted by numerous studies. The primary goals of this study are to (1) introduce new a priori discretization quality metrics considering the spatial pattern changes of model input data; (2) introduce a two-step discretization decision-making approach to compress extreme errors and meet user-specified discretization expectations through non-uniform discretization threshold
A priori physicalism, lonely ghosts and Cartesian doubt.
Goff, Philip
2012-06-01
A zombie is a physical duplicate of a human being which lacks consciousness. A ghost is a phenomenal duplicate of a human being whose nature is exhausted by consciousness. Discussion of zombie arguments, that is, anti-physicalist arguments which appeal to the conceivability of zombies, is familiar in the philosophy of mind literature, whilst ghostly arguments, that is, anti-physicalist arguments which appeal to the conceivability of ghosts, are somewhat neglected. In this paper I argue that ghostly arguments have a number of dialectical advantages over zombie arguments. I go on to explain how the conceivability of ghosts is inconsistent with two kinds of a priori physicalism: analytic functionalism and the Australian physicalism of Armstrong and Lewis. PMID:21459620
Deconvolution in line scanners using a priori information
NASA Astrophysics Data System (ADS)
Wirnitzer, Bernhard; Spraggon-Hernandez, Tadeo
2002-12-01
In a digital camera the MTF of the optical system must comprise a low-pass filter in order to avoid aliasing. The MTF of incoherent imaging, however, is usually and in principle far from an ideal low-pass. Theoretically, a digital ARMA filter can be used to compensate for this drawback. In practice, such deconvolution filters suffer from instability because of time-variant noise and the space-variance of the MTF. In addition, in a line scanner the MTF in the scan direction differs slightly in each scanned image, so inverse filtering will not operate satisfactorily in an unknown environment. A new concept is presented which solves both problems using a priori information about an object, e.g., that parts of it are known to be binary. This information is enough to achieve a stable, space- and time-variant ARMA deconvolution filter. The best results are achieved using nonlinear filtering and pattern feedback. The new method was used to improve the bit error rate (BER) of a high-density matrix-code scanner by more than one order of magnitude. An audio scanner will be demonstrated which reads 12 seconds of music in CD quality from an audio-coded image of 18 mm × 55 mm size.
Perfusion from angiogram and a priori (PAP) with temporal regularization
NASA Astrophysics Data System (ADS)
Taguchi, Katsuyuki; Geschwind, Jean-Francois H.
2009-02-01
Perfusion imaging is often used for diagnosis and for assessment of the response to treatment. If perfusion could be measured during interventional procedures, it could lead to quantitative, more efficient and accurate treatment; however, imaging modalities that allow continuous dynamic scanning are not available in most procedure rooms. Thus, we developed a method to measure the perfusion (time attenuation curves, TACs) of regions-of-interest (ROIs) using an X-ray C-arm angiography system with no gantry rotation but with a priori information. A previous study revealed a problem of large oscillations in the estimated TACs and lacked a comparison with CT-based approaches. Thus, the purposes of this study were (1) to reduce the variance of the TACs and (2) to compare the performance of the improved PAP method with that of the CT-based perfusion method. Our computer simulation study showed that the standard deviation of the PAP method was decreased by 10.7-59.0% and that it outperformed CT methods using 20 or 200 times higher dose in terms of accuracy, variance, and temporal resolution.
Precise regional baseline estimation using a priori orbital information
NASA Technical Reports Server (NTRS)
Lindqwister, Ulf J.; Lichten, Stephen M.; Blewitt, Geoffrey
1990-01-01
A solution using GPS measurements acquired during the CASA Uno campaign has resulted in 3-4 mm horizontal daily baseline repeatability and 13 mm vertical repeatability for a 729 km baseline located in North America. The agreement with VLBI is at the level of 10-20 mm for all components. The results were obtained with the GIPSY orbit determination and baseline estimation software and are based on five single-day data arcs spanning January 20, 21, 25, 26, and 27, 1988. The estimation strategy included resolving the carrier phase integer ambiguities, utilizing an optimal set of fixed reference stations, and constraining GPS orbit parameters by applying a priori information. A multiday GPS orbit and baseline solution has yielded similar 2-4 mm horizontal daily repeatabilities for the same baseline, consistent with the constrained single-day arc solutions. The application of weak constraints to the orbital state for single-day data arcs produces solutions which approach the precise orbits obtained with unconstrained multiday arc solutions.
A priori precision estimation for neutron triples counting
Croft, S.; Swinhoe, M. T.; Henzl, V.
2011-07-01
The nondestructive assay of plutonium-bearing items for criticality, safety, security, safeguards, inventory balance, process control, waste management and compliance is often undertaken using correlated neutron counting. In particular, Multiplicity Shift Register analysis allows one to extract autocorrelation parameters from the pulse train which can, within the framework of a simple interpretational model, be related to the effective ²⁴⁰Pu spontaneous fission mass present. The effective ²⁴⁰Pu mass is a weighted sum of the ²³⁸Pu, ²⁴⁰Pu and ²⁴²Pu masses, so if the relative isotopic composition of the Pu can be established, then from the measured effective ²⁴⁰Pu mass one can estimate the total Pu mass and also the masses of the individual isotopes, for example the fissile species ²³⁹Pu and ²⁴¹Pu. In multiplicity counting three counting rates are obtained: the Singles, Doubles and Triples rates. The Singles rate is just the gross, totals or trigger rate. The Doubles and Triples rates are calculated from factorial moments of the observed signal-triggered neutron multiplicity distributions following spontaneous fission in the item and can be thought of as the rates of observed coincident pairs and coincident triplets on the pulse train. Coincident events come about because the spontaneous fission and induced fission chains taking place in the item result in bursts of neutrons. These remain time-correlated during the detection process and so retain information, through the burst size distribution, about the Pu content. In designing and assessing the performance of a detector system to meet a given goal, it is necessary to make a priori estimates of the counting precision for all three kinds of rates. This is non-trivial because the counting does not obey the familiar rules of a Poissonian counting experiment: the pulse train has time-correlated events on it and the train is sampled by event triggered gates that may
The A Priori Ideological Orientation of Schools in Kibbutzim in Israel.
ERIC Educational Resources Information Center
Gross, Zehavit
This paper examines the a priori ideological orientation of pupils in two different types of schools in the kibbutzim in Israel: the movement schools (Hatakam or Hashomer Hatzair) and the mixed schools. The paper attempts to show how different educational circumstances and environments develop a distinct a priori ideological orientation in…
ERIC Educational Resources Information Center
Sollervall, Håkan; Stadler, Erika
2015-01-01
The aim of the presented case study is to investigate how coherent analytical instruments may guide the a priori and a posteriori analyses of a didactical situation. In the a priori analysis we draw on the notion of affordances, as artefact-mediated opportunities for action, to construct hypothetical trajectories of goal-oriented actions that have…
LLNL's 3-D A Priori Model Constraints and Uncertainties for Improving Seismic Location
Flanagan, M P; Myers, S C; Schultz, C A; Pasyanos, M E; Bhattacharyya, J
2000-07-14
Accurate seismic event location is key to monitoring the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and is largely dependent on our understanding of the crust and mantle velocity structure. This is particularly challenging in aseismic regions, devoid of calibration data, which leads us to rely on a priori constraints on the velocities. We investigate our ability to improve seismic event location in the Middle East, North Africa, and the Former Soviet Union (ME/NA/FSU) by using a priori three-dimensional (3-D) velocity models in lieu of more commonly used one-dimensional (1-D) models. Event locations based on 1-D models are often biased, as they do not account for significant travel-time variations that result from heterogeneous crust and mantle; it follows that 3-D velocity models have the potential to reduce this bias. Here, we develop a composite 3-D model for the ME/NA/FSU regions. This fully 3-D model is an amalgamation of studies ranging from seismic reflection to geophysical analogy. Our a priori model specifies geographic boundaries and velocity structures based on geology, tectonics, and seismicity and information taken from published literature, namely a global sediment thickness map of 1° resolution (Laske and Masters, 1997), a regionalized crustal model based on geology and tectonics (Sweeney and Walter, 1998; Bhattacharyya et al., 2000; Walter et al., 2000), and regionalized upper mantle (RUM) models developed from teleseismic travel times (Gudmundsson and Sambridge, 1998). The components of this model were chosen for the complementary structures they provide. The 1° sediment map and regionalized crustal model provide detailed structures and boundaries not available in the more coarse 5° models used for global-scale studies. The RUM models offer improved resolution over global tomography, most notably above depths of 300 km where heterogeneity is greatest; however, we plan to test other published upper mantle models of both P- and S
Use of a priori statistics to minimize acquisition time for RFI immune spread spectrum systems
NASA Technical Reports Server (NTRS)
Holmes, J. K.; Woo, K. T.
1978-01-01
The optimum acquisition sweep strategy was determined for a PN code despreader when the a priori probability density function was not uniform. A pseudo-noise spread spectrum system was considered which could be utilized in the DSN to combat radio frequency interference. In a sample case, when the a priori probability density function was Gaussian, the acquisition time was reduced by about 41% compared to a uniform sweep approach.
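The benefit of sweeping the code-phase uncertainty in order of decreasing a priori probability can be illustrated with a small numerical sketch. The cell count and prior width below are invented, and the reduction printed by the toy differs from the abstract's 41%, which depends on the actual dwell structure and prior of the system studied.

```python
import numpy as np

# Expected number of code-phase cells searched before acquisition, for a
# uniform sweep versus a sweep ordered by a discretised Gaussian a priori
# density over the uncertainty window (all parameters are assumptions).
n_cells = 1001                       # cells in the uncertainty window
x = np.arange(n_cells)
center, sigma = n_cells // 2, 80.0   # assumed Gaussian prior about the centre
prior = np.exp(-0.5 * ((x - center) / sigma) ** 2)
prior /= prior.sum()

# Uniform sweep: cells visited in fixed order; the expected search length is
# the prior-weighted mean position of the true cell in that order.
uniform_cost = np.sum(prior * (x + 1))

# Prior-ordered sweep: visit cells in decreasing order of a priori probability.
order = np.argsort(prior)[::-1]
ordered_cost = np.sum(prior[order] * (np.arange(n_cells) + 1))

print(f"uniform sweep: {uniform_cost:.1f} cells on average")
print(f"ordered sweep: {ordered_cost:.1f} cells on average")
print(f"reduction:     {100 * (1 - ordered_cost / uniform_cost):.0f}%")
```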
Conventional Principles in Science: On the foundations and development of the relativized a priori
NASA Astrophysics Data System (ADS)
Ivanova, Milena; Farr, Matt
2015-11-01
The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the
Impact of A Priori Gradients on VLBI-Derived Terrestrial Reference Frames
NASA Astrophysics Data System (ADS)
Böhm, J.; Spicakova, H.; Urquhart, L.; Steigenberger, P.; Schuh, H.
2011-07-01
Tropospheric gradients are usually estimated in the analysis of space geodetic observations to account for the azimuthal asymmetry of troposphere delays. Whereas some analysis centres use a priori gradients for the analysis of Very Long Baseline Interferometry (VLBI) observations, no a priori information is generally applied in the analysis of Global Navigation Satellite Systems (GNSS) observations. We introduce a spherical harmonic expansion of total gradients derived from climatology data of the European Centre for Medium-Range Weather Forecasts (ECMWF), and we compare it to the gradients which have been determined for selected VLBI sites from data of the Data Assimilation Office (DAO) at Goddard Space Flight Center. The latter are usually applied in VLBI analysis. We show the effect of using both types of a priori gradients on the terrestrial and celestial reference frames which are determined from GNSS and VLBI analysis.
Bayesian classification of polarimetric SAR images using adaptive a priori probabilities
NASA Technical Reports Server (NTRS)
Van Zyl, J. J.; Burnette, C. F.
1992-01-01
The problem of classifying earth terrain by observed polarimetric scattering properties is tackled with an iterative Bayesian scheme which uses a priori probabilities adaptively. The first classification is based on fixed, and not necessarily equal, a priori probabilities, and successive iterations change the a priori probabilities adaptively. The approach is applied to an SAR image in which a single water body covers 10 percent of the image area. The classification accuracies for ocean, urban, vegetated, and total areas increase, and the percentage of reclassified pixels decreases greatly, as the iteration number increases. The iterative scheme is found to improve the a posteriori classification accuracy of maximum likelihood classifiers by iteratively exploiting the local homogeneity in polarimetric SAR images. A few iterations can improve the classification accuracy significantly without sacrificing key high-frequency detail or edges in the image.
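A toy version of the adaptive-prior iteration might look as follows. The synthetic 1-D "image", the Gaussian class likelihoods and the 11-pixel smoothing window are all assumptions for illustration, not the authors' settings.

```python
import numpy as np

# Iterative Bayesian classification with adaptive a priori probabilities:
# after each pass, the per-pixel class priors are re-estimated from label
# fractions in a local neighbourhood, mimicking local homogeneity in SAR data.
rng = np.random.default_rng(0)
true = np.repeat([0, 1, 0], [60, 40, 60])            # piecewise-homogeneous scene
x = rng.normal(loc=np.where(true == 1, 1.0, 0.0), scale=0.8)

means, std = np.array([0.0, 1.0]), 0.8
lik = np.exp(-0.5 * ((x[:, None] - means) / std) ** 2)   # class likelihoods (n, 2)

priors = np.full((x.size, 2), 0.5)                   # start from equal priors
for _ in range(5):
    post = lik * priors                              # unnormalised posterior
    labels = post.argmax(axis=1)
    onehot = np.eye(2)[labels]
    # adaptive priors: smoothed local label fractions (11-pixel window)
    kernel = np.ones(11) / 11
    frac = np.column_stack([np.convolve(onehot[:, k], kernel, mode="same")
                            for k in range(2)])
    priors = np.clip(frac, 0.05, 0.95)               # never fully exclude a class
    priors /= priors.sum(axis=1, keepdims=True)

print("accuracy after adaptive iterations:", (labels == true).mean())
```

Clipping the priors away from 0 and 1 keeps strongly contradicting likelihoods able to override the local majority, which is what preserves edges in the reclassification.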
Mediterranean Diet and Cardiovascular Disease: A Critical Evaluation of A Priori Dietary Indexes
D’Alessandro, Annunziata; De Pergola, Giovanni
2015-01-01
The aim of this paper is to analyze the a priori dietary indexes used in the studies that have evaluated the role of the Mediterranean Diet in influencing the risk of developing cardiovascular disease. All the studies show that this dietary pattern protects against cardiovascular disease, but they report quite different effects on specific conditions such as coronary heart disease or cerebrovascular disease. A priori dietary indexes used to measure dietary exposure imply quantitative and/or qualitative divergences from the traditional Mediterranean Diet of the early 1960s, and, therefore, it is very difficult to compare the results of different studies. Based on real cultural heritage and traditions, we believe that the a priori indexes used to evaluate adherence to the Mediterranean Diet should classify whole grains and refined grains, olive oil and monounsaturated fats, and wine and alcohol differently. PMID:26389950
Hardy, Tyler; Cain, Stephen; Blake, Travis
2016-05-20
This paper investigates the ability to improve Space Domain Awareness (SDA) by increasing the number of detectable Resident Space Objects (RSOs) from space surveillance sensors. With matched-filter-based techniques, the expected impulse response, or Point Spread Function (PSF), is compared against the received data. When the images are spatially undersampled, the modeled PSF may not match the received data if the RSO does not fall in the center of a pixel. This aliasing can be accounted for with a Multiple Hypothesis Test (MHT). Previously proposed MHTs have implemented the test with an equal a priori probability assumption. This paper investigates using an unequal a priori probability MHT. To determine accurate a priori probabilities, three metrics are computed: correlation, physical distance, and an empirical metric. Using the calculated a priori probabilities, a new algorithm is developed, and images from the Space Surveillance Telescope (SST) are analyzed. The numbers of objects detected with equal and unequal prior probabilities are compared while keeping the false alarm rate constant. Any additional detected objects will help improve SDA capabilities. PMID:27411129
A priori [Formula: see text] estimates for solutions of a class of reaction-diffusion systems.
Du, Zengji; Peng, Rui
2016-05-01
In this short paper, we establish a priori [Formula: see text]-norm estimates for solutions of a class of reaction-diffusion systems which can be used to model the spread of infectious disease. The developed technique may find applications in other reaction-diffusion systems. PMID:26141826
Comparison of a priori calibration models for respiratory inductance plethysmography during running.
Leutheuser, Heike; Heyde, Christian; Gollhofer, Albert; Eskofier, Bjoern M
2014-01-01
Respiratory inductive plethysmography (RIP) has been introduced as an alternative for measuring ventilation by means of body surface displacement (diameter changes in the rib cage and abdomen). Using a posteriori calibration, it has been shown that RIP may provide accurate measurements of ventilatory tidal volume under exercise conditions. Methods for a priori calibration would facilitate the application of RIP. Currently, to the best knowledge of the authors, none of the existing ambulant procedures for RIP calibration can be used a priori for valid subsequent measurements of ventilatory volume under exercise conditions. The purpose of this study is to develop and validate a priori calibration algorithms for ambulant application of RIP data recorded in running exercise. We calculated Volume Motion Coefficients (VMCs) using seven different models on resting data and compared the root mean squared error (RMSE) of each model applied to running data. Least squares approximation (LSQ) without offset of a two-degree-of-freedom model achieved the lowest RMSE value. In this work, we showed that a priori calibration of RIP exercise data is possible using VMCs calculated from a 5-min resting phase in which RIP and flowmeter measurements were performed simultaneously. The results demonstrate that RIP has the potential for use in ambulant applications. PMID:25571459
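The winning calibration model, least squares without offset for a two-degree-of-freedom model, can be sketched on synthetic data. The signals, coefficients and noise levels below are invented; only the fitting scheme follows the abstract's description.

```python
import numpy as np

# Fit volume-motion coefficients (VMCs) of a two-degree-of-freedom model
#   V(t) ≈ k_rc * RC(t) + k_ab * AB(t)
# by least squares (no offset) on "resting" data, then evaluate the RMSE of
# the calibrated model on independent, noisier "running" data.
rng = np.random.default_rng(1)

def make_data(n, noise):
    rc = np.sin(np.linspace(0, 20 * np.pi, n))               # rib-cage excursion
    ab = 0.6 * np.sin(np.linspace(0, 20 * np.pi, n) + 0.3)   # abdomen excursion
    v = 1.8 * rc + 1.1 * ab + rng.normal(0, noise, n)        # flowmeter volume
    return np.column_stack([rc, ab]), v

A_rest, v_rest = make_data(3000, 0.05)     # simultaneous RIP + flowmeter phase
A_run,  v_run  = make_data(3000, 0.20)     # running: same VMCs, more noise

vmc, *_ = np.linalg.lstsq(A_rest, v_rest, rcond=None)        # LSQ, no offset
rmse = np.sqrt(np.mean((A_run @ vmc - v_run) ** 2))
print("fitted VMCs:", vmc, " running RMSE:", rmse)
```

Because the calibration uses only the resting phase, the running RMSE here reflects how well a priori coefficients transfer, which is the question the study poses.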
FORTRAN IV Program for Analysis of Covariance with A Priori or A Posteriori Mean Comparisons
ERIC Educational Resources Information Center
Fordyce, Michael W.
1977-01-01
A flexible Fortran program for computing a complete analysis of covariance is described. Requiring minimal core space, the program provides all group and overall summary statistics for the analysis, a test of homogeneity of regression, and all posttest mean comparisons for a priori or a posteriori testing. (Author/JKS)
A priori estimates for the free boundary problem of incompressible neo-Hookean elastodynamics
NASA Astrophysics Data System (ADS)
Hao, Chengchun; Wang, Dehua
2016-07-01
A free boundary problem for the incompressible neo-Hookean elastodynamics is studied in two and three spatial dimensions. The a priori estimates in Sobolev norms of solutions with the physical vacuum condition are established through a geometrical point of view of Christodoulou and Lindblad (2000) [3]. Some estimates on the second fundamental form and velocity of the free surface are also obtained.
APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method
NASA Astrophysics Data System (ADS)
Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno
2014-05-01
The APhoRISM (Advanced PRocedure for volcanIc and Seismic Monitoring) project is an FP7-funded project which aims at developing and testing two new methods to combine Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps to address the detection and estimation of damage caused by an earthquake. The method is named APE: A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not in itself new; a wide body of literature and several projects have addressed the issue, but the proposed approaches are usually based only on change detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: - InSAR time series to measure surface movements - shakemaps obtained from seismological data - vulnerability information. This a priori information is then integrated with a change detection map from Earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and to limit false alarms.
Code of Federal Regulations, 2011 CFR
2011-04-01
... performed by retail commission salesman. 31.3402(j)-1 Section 31.3402(j)-1 Internal Revenue INTERNAL REVENUE... Remuneration other than in cash for service performed by retail commission salesman. (a) In general. (1) An... the employee as a retail commission salesman and (ii) the employer ordinarily pays the...
Examples of use of a-priori information to the inversion of AEM data
NASA Astrophysics Data System (ADS)
Viezzoli, A.; Munday, T. J.; Sapia, V.
2012-12-01
There is a growing focus in the international near-surface geophysical community on merging information (loosely termed "data") from different sources in the modelling of the subsurface. The use of a-priori data as extra input to the inversion of Airborne Electromagnetic (AEM) data is one illustrative example of this trend. It provides more robust results, for a number of reasons. The first is the capability to cross-check the geophysically derived model against ancillary information in a more quantitative and objective way than can be done a-posteriori. The second is that it mitigates the inherent non-uniqueness of the results of inversion of geophysical data, which is due to the fact that the problem is usually ill-posed. The third is the ever-higher level of accuracy of the derived output sought by end users who, rightly, demand results (either direct or derived) they can use directly for management. Last, but not least, is the drive to incorporate different physical parameters originating from different sources into one inversion problem, in order to derive directly, e.g., geological or hydrogeological models that fit all data sets at once. In this paper we present examples obtained by adding information from geophysics (i.e., seismic, surface and borehole geoelectric) and from geology (e.g., lithology) to the inversion of Airborne EM data from different systems (e.g., VTEM, AeroTEM, SkyTEM, Resolve). Case studies are from several areas of the world, with varied geological settings. In our formulation, the a-priori information is treated as nothing but an extra data set, carrying location, values, uncertainty, and expected lateral variability. The information it contains is spread to the locations of the neighbouring AEM soundings using the Spatially Constrained Inversion approach. Constraints and uncertainties usually differ depending on data type and geology.
Case studies show the effect on the inversion results of the a-priori
SAC-SMA a priori parameter differences and their impact on distributed hydrologic model simulations
NASA Astrophysics Data System (ADS)
Zhang, Ziya; Koren, Victor; Reed, Seann; Smith, Michael; Zhang, Yu; Moreda, Fekadu; Cosgrove, Brian
2012-02-01
Deriving a priori gridded parameters is an important step in the development and deployment of an operational distributed hydrologic model. Accurate a priori parameters can reduce the manual calibration effort and/or speed up the automatic calibration process, reduce calibration uncertainty, and provide valuable information at ungauged locations. Underpinned by reasonable parameter data sets, distributed hydrologic modeling can help improve water resource, flood, and flash flood forecasting capabilities. Initial efforts at the National Weather Service Office of Hydrologic Development (NWS OHD) to derive a priori gridded Sacramento Soil Moisture Accounting (SAC-SMA) model parameters for the conterminous United States (CONUS) were based on a relatively coarse-resolution soils property database, the State Soil Geographic Database (STATSGO) (Soil Survey Staff, 2011), and on the assumption of uniform land use and land cover. In an effort to improve the parameters, subsequent work was performed to fully incorporate spatially variable land cover information into the parameter derivation process. Following that, finer-scale soils data (the county-level Soil Survey Geographic Database, SSURGO; Soil Survey Staff, 2011a,b), together with the variable land cover data, were used to derive a third set of CONUS a priori gridded parameters. It is anticipated that the second and third parameter sets, which incorporate more physical data, will be more realistic and consistent. Here, we evaluate whether this is actually the case by intercomparing these three sets of a priori parameters along with their associated hydrologic simulations, which were generated by applying the National Weather Service Hydrology Laboratory's Research Distributed Hydrologic Model (HL-RDHM) (Koren et al., 2004) in a continuous fashion with an hourly time step. This model adopts a well-tested conceptual water balance model, SAC-SMA, applied on a regular spatial grid, and links to physically
Lagrangian formulation and a priori estimates for relativistic fluid flows with vacuum
NASA Astrophysics Data System (ADS)
Jang, Juhi; LeFloch, Philippe G.; Masmoudi, Nader
2016-03-01
We study the evolution of a compressible fluid surrounded by vacuum and introduce a new symmetrization in Lagrangian coordinates that allows us to encompass both relativistic and non-relativistic fluid flows. The problem under consideration is a free boundary problem of central interest in compressible fluid dynamics and, from the mathematical standpoint, the main challenge to be overcome lies in the loss of regularity in the fluid variables near the free boundary. Based on our Lagrangian formulation, we establish the necessary a priori estimates in weighted Sobolev spaces which are adapted to this loss of regularity.
NASA Astrophysics Data System (ADS)
Montaru, Alexandre; Sirakov, Boyan; Souplet, Philippe
2014-07-01
We study qualitative properties of positive solutions of noncooperative, possibly nonvariational, elliptic systems. We obtain new classification and Liouville type theorems in the whole Euclidean space, as well as in half-spaces, and deduce a priori estimates and the existence of positive solutions for related Dirichlet problems. We significantly improve the known results for a large class of systems involving a balance between repulsive and attractive terms. This class contains systems arising in biological models of Lotka-Volterra type, in physical models of Bose-Einstein condensates and in models of chemical reactions.
NASA Astrophysics Data System (ADS)
Armero, F.; Simo, J. C.
This article describes new a priori stability estimates for the full nonlinear system of coupled thermoplasticity at finite strains and presents a fractional step method leading to a new class of unconditionally stable staggered algorithms. These results are shown to hold for general models of multiplicative plasticity that include, as a particular case, the single-crystal model. The proposed product formula algorithm is designed via an entropy based operator split that yields one of the first known staggered algorithms that retains the property of nonlinear unconditional stability. The scheme employs an isentropic step, in which the total entropy is held constant, followed by a heat conduction step (with nonlinear source) at fixed configuration. The nonlinear stability analysis shows that the proposed staggered scheme inherits the a priori energy estimate for the continuum problem, regardless of the size of the time-step. In sharp contrast with these results, it is shown that widely used staggered methods employing an isothermal step followed by a heat conduction problem can be at most only conditionally stable. The excellent performance of the methodology is illustrated in representative numerical simulations.
NASA Astrophysics Data System (ADS)
Zhou, Shuai; Huang, Danian
2015-11-01
We have developed a new method for the interpretation of gravity tensor data based on the generalized Tilt-depth method. Cooper (2011, 2012) extended the magnetic Tilt-depth method to gravity data. We take the gradient-ratio method of Cooper (2011, 2012) and modify it so that the source type does not need to be specified a priori. We develop the new method by generalizing the Tilt-depth method for depth estimation to different types of source bodies. The new technique uses only the three vertical tensor components of the full gravity tensor data, observed or calculated at different height planes, to estimate the depth of the buried bodies without a priori specification of their structural index. For severely noise-corrupted data, our method utilizes data from different upward-continuation heights, which can effectively reduce the influence of noise. Theoretical simulations of the gravity source model with and without noise illustrate the ability of the method to provide source depth information. Additionally, the simulations demonstrate that the new method is simple, computationally fast and accurate. Finally, we apply the method to the gravity data acquired over the Humble Salt Dome in the USA as an example. The results show a good correspondence to previous drilling and seismic interpretation results.
Using a priori information for regularization in breast microwave image reconstruction.
Ashtari, Ali; Noghanian, Sima; Sabouni, Abas; Aronsson, Jonatan; Thomas, Gabriel; Pistorius, Stephen
2010-09-01
Regularization methods are used in microwave image reconstruction problems, which are ill-posed. Traditional regularization methods are usually problem-independent and do not take advantage of a priori information specific to any particular imaging application. In this paper, a novel problem-dependent regularization approach is introduced for the application of breast imaging. A real genetic algorithm (RGA) minimizes a cost function that is the error between the recorded and the simulated data. At each iteration of the RGA, a priori information about the shape of breast profiles is used by a neural network classifier to reject solutions that cannot be a map of the dielectric properties of a breast profile. The algorithm was tested against four realistic numerical breast phantoms including a mostly fatty, a scattered fibroglandular, a heterogeneously dense, and a very dense sample. The tests were also repeated with a 4 mm × 4 mm tumor inserted in the fibroglandular tissue of each of the four breast types. The results show the effectiveness of the proposed approach, which to the best of our knowledge has the highest resolution amongst the evolutionary algorithms used for the inversion of realistic numerical breast phantoms. PMID:20562033
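The overall loop, a real-coded GA whose candidates are screened by an a priori plausibility test before entering the population, can be sketched as follows. The cost function, parameter bounds and the simple stand-in screen (replacing the paper's neural-network classifier of breast-like permittivity maps) are all toy assumptions, not the authors' implementation.

```python
import numpy as np

# Toy real-coded genetic algorithm with an a priori screening step: every
# child must pass a plausibility test before it is admitted to the population.
rng = np.random.default_rng(2)
target = np.array([0.3, 0.7, 0.5])            # "true" model parameters (toy)

def cost(m):                                   # misfit between data and model
    return np.sum((m - target) ** 2)

def plausible(m):                              # stand-in for the NN classifier
    return np.all((m >= 0) & (m <= 1))         # reject non-physical maps

def sample_candidate(parents):
    p1, p2 = parents[rng.integers(len(parents), size=2)]
    return 0.5 * (p1 + p2) + rng.normal(0, 0.05, p1.size)  # crossover + mutation

pop = rng.uniform(0, 1, (40, 3))
for gen in range(60):
    pop = pop[np.argsort([cost(m) for m in pop])][:20]      # selection
    children = []
    while len(children) < 20:
        c = sample_candidate(pop)
        if plausible(c):                       # a priori screening step
            children.append(c)
    pop = np.vstack([pop, children])

best = min(pop, key=cost)
print("best model:", best, " cost:", cost(best))
```

The screening step never evaluates the expensive forward simulation for implausible candidates, which is where the a priori information pays off in the full problem.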
Lu, Yujie; Zhang, Xiaoqun; Douraghy, Ali; Stout, David; Tian, Jie; Chan, Tony F.; Chatziioannou, Arion F.
2009-01-01
Through restoration of the light source information in small animals in vivo, optical molecular imaging, such as fluorescence molecular tomography (FMT) and bioluminescence tomography (BLT), can depict biological and physiological changes observed using molecular probes. A priori information plays an indispensable role in tomographic reconstruction. As a type of a priori information, the sparsity characteristic of the light source has not been sufficiently considered to date. In this paper, we introduce a compressed sensing method to develop a new tomographic algorithm for spectrally-resolved bioluminescence tomography. This method uses the nature of the source sparsity to improve the reconstruction quality with a regularization implementation. Based on verification of the inverse crime, the proposed algorithm is validated with Monte Carlo-based synthetic data and the popular Tikhonov regularization method. Testing with different noise levels and single/multiple source settings at different depths demonstrates the improved performance of this algorithm. Experimental reconstruction with a mouse-shaped phantom further shows the potential of the proposed algorithm. PMID:19434138
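One standard way to exploit source sparsity as a priori information is l1-regularised reconstruction. Below is a minimal sketch using ISTA (iterative soft thresholding) on a random matrix standing in for the photon-propagation model; the matrix sizes, sparsity level and regularisation weight are assumptions, and the abstract's actual algorithm is a spectrally resolved variant of this idea.

```python
import numpy as np

# Sparse source recovery via l1-regularised least squares, solved with ISTA.
rng = np.random.default_rng(3)
n, m, k = 200, 60, 4                 # unknowns, measurements, nonzeros
A = rng.normal(size=(m, n)) / np.sqrt(m)      # stand-in forward model
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1, 2, k)
y = A @ x_true + rng.normal(0, 0.01, m)       # underdetermined, noisy data

lam = 0.02                                    # sparsity weight (assumed)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L for the smooth term
x = np.zeros(n)
for _ in range(500):
    g = x - step * A.T @ (A @ x - y)          # gradient step on the misfit
    x = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold

print("recovery error:", np.linalg.norm(x - x_true))
```

With only 60 measurements for 200 unknowns, plain least squares is hopeless; the sparsity prior is what makes the inversion well-behaved, mirroring the role of the a priori information in the tomographic problem.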
He, Kaifei; Xu, Tianhe; Förste, Christoph; Petrovic, Svetozar; Barthelmes, Franz; Jiang, Nan; Flechtner, Frank
2016-01-01
When applying the Global Navigation Satellite System (GNSS) for precise kinematic positioning in airborne and shipborne gravimetry, multiple GNSS receivers are often fix-mounted on the kinematic platform carrying the gravimetry instrumentation. Thus, the distances among the GNSS antennas are known and invariant. This information can be used to improve the accuracy and reliability of the state estimates. For this purpose, the known distances between the antennas are applied as a priori constraints within the state parameter adjustment. These constraints are introduced in such a way that their accuracy is taken into account. To test this approach, GNSS data from a Baltic Sea shipborne gravimetric campaign have been used. The results of our study show that applying distance constraints improves the accuracy of the GNSS kinematic positioning, for example by about 4 mm for the radial component. PMID:27043580
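Introducing a known antenna separation as a weighted pseudo-observation in a least-squares adjustment can be sketched in 2-D. All numbers below are invented; the real processing involves full GNSS observation equations rather than direct position observations.

```python
import numpy as np

# Two antenna positions are observed with noise; the known, invariant antenna
# separation d_known is added as an extra (tightly weighted) observation
# equation and the combined system is solved by Gauss-Newton iteration.
rng = np.random.default_rng(4)
p1_true, p2_true = np.array([0.0, 0.0]), np.array([5.0, 0.0])
d_known = np.linalg.norm(p2_true - p1_true)       # fixed mounting geometry

obs1 = p1_true + rng.normal(0, 0.05, 2)           # noisy position estimates
obs2 = p2_true + rng.normal(0, 0.05, 2)
sigma_pos, sigma_d = 0.05, 0.001                  # constraint weighted tightly

x = np.concatenate([obs1, obs2])                  # state: [p1, p2]
for _ in range(10):
    p1, p2 = x[:2], x[2:]
    dist = np.linalg.norm(p2 - p1)
    u = (p2 - p1) / dist                          # unit vector along baseline
    # weighted residuals: position observations + distance pseudo-observation
    r = np.concatenate([(obs1 - p1) / sigma_pos,
                        (obs2 - p2) / sigma_pos,
                        [(d_known - dist) / sigma_d]])
    J = np.zeros((5, 4))                          # Jacobian of r w.r.t. x
    J[:2, :2] = -np.eye(2) / sigma_pos
    J[2:4, 2:] = -np.eye(2) / sigma_pos
    J[4, :2], J[4, 2:] = u / sigma_d, -u / sigma_d
    dx, *_ = np.linalg.lstsq(J, -r, rcond=None)   # Gauss-Newton update
    x += dx

print("adjusted distance:", np.linalg.norm(x[2:] - x[:2]))  # close to d_known
```

Because the weights encode the accuracy of each observation, the constraint pulls the estimated baseline length to the mounted geometry without forcing it exactly, which is the "accuracy taken into account" formulation the abstract describes.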
A Priori Estimates for Free Boundary Problem of Incompressible Inviscid Magnetohydrodynamic Flows
NASA Astrophysics Data System (ADS)
Hao, Chengchun; Luo, Tao
2014-06-01
In the present paper, we prove the a priori estimates of Sobolev norms for a free boundary problem of the incompressible inviscid magnetohydrodynamics equations in all physical spatial dimensions n = 2 and 3 by adopting a geometrical point of view used in Christodoulou and Lindblad (Commun Pure Appl Math 53:1536-1602, 2000), and estimating quantities such as the second fundamental form and the velocity of the free surface. We identify the well-posedness condition that the outer normal derivative of the total pressure including the fluid and magnetic pressures is negative on the free boundary, which is similar to the physical condition (Taylor sign condition) for the incompressible Euler equations of fluids.
Microwave Radar Imaging of Heterogeneous Breast Tissue Integrating A Priori Information
Kelly, Thomas N.; Sarafianou, Mantalena; Craddock, Ian J.
2014-01-01
Conventional radar-based image reconstruction techniques fail when they are applied to heterogeneous breast tissue, since the underlying in-breast relative permittivity is unknown or assumed to be constant. This results in a systematic error during the process of image formation. A recent trend in microwave biomedical imaging is to extract the relative permittivity from the object under test to improve the image reconstruction quality and thereby to enhance the diagnostic assessment. In this paper, we present a novel radar-based methodology for microwave breast cancer detection in heterogeneous breast tissue integrating a 3D map of relative permittivity as a priori information. This leads to a novel image reconstruction formulation where the delay-and-sum focusing takes place in time rather than range domain. Results are shown for a heterogeneous dense (class-4) and a scattered fibroglandular (class-2) numerical breast phantom using Bristol's 31-element array configuration. PMID:25435861
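The shift from range-domain to time-domain delay-and-sum focusing described above can be illustrated with a toy beamformer whose per-channel delays are integrated along an assumed permittivity profile instead of a constant background. Everything below is synthetic: the antenna geometry, permittivity values, pulse shape, and sample rate are invented for illustration and are not the paper's array or phantom data.

```python
import numpy as np

# Delay-and-sum focusing in the time domain: per-antenna delays are computed from
# an assumed relative-permittivity profile rather than a constant background value.
c0 = 3e8                                   # speed of light in vacuum (m/s)
fs = 50e9                                  # sample rate (Hz)
t = np.arange(2048) / fs

def travel_time(path_len_m, eps_r_along_path):
    # time = sum over path segments of (segment length * sqrt(eps_r)) / c0
    seg = path_len_m / len(eps_r_along_path)
    return np.sum(seg * np.sqrt(eps_r_along_path)) / c0

# Hypothetical 4-antenna geometry: distance from each antenna to the focal point,
# and a per-path permittivity profile drawn from a (here, synthetic) 3D map.
dists = np.array([0.040, 0.045, 0.052, 0.048])
eps_profiles = [np.full(50, e) for e in (9.0, 12.0, 20.0, 15.0)]
delays = np.array([travel_time(d, e) for d, e in zip(dists, eps_profiles)])

# Simulate received pulses: a Gaussian echo from the focal point on each channel.
def pulse(tc):
    return np.exp(-((t - tc) / 50e-12) ** 2)
signals = np.array([pulse(2e-9 + d) for d in delays])

# DAS image value at the focal point: advance each channel by its delay and sum.
shifts = np.round(delays * fs).astype(int)
focused = sum(np.roll(s, -k) for s, k in zip(signals, shifts))
print(focused.max())
```

When the delays match the true propagation times, the echoes add coherently and the focused peak approaches the number of channels; a wrong (e.g. constant) permittivity assumption would misalign the echoes and lower the peak, which is the systematic error the abstract describes.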
A priori mesh quality metrics for three-dimensional hybrid grids
Kallinderis, Y.; Fotia, S.
2015-01-01
The use of general hybrid grids for complex-geometry field simulations poses a challenge for estimating their quality. Apart from the typical problems of non-uniformity and non-orthogonality, the change in element topology is an extra issue to address. The present work derives and evaluates an a priori mesh quality indicator for structured, unstructured, and hybrid grids consisting of hexahedra, prisms, tetrahedra, and pyramids. Emphasis is placed on deriving a direct relation between the quality measure and mesh distortion. The work is based on the use of the finite volume discretization for the evaluation of first-order spatial derivatives. The analytic form of the truncation error is derived and applied to elementary types of mesh distortion, including typical hybrid grid interfaces. The corresponding analytic expressions provide relations between the truncation error and the degree of stretching, skewness, shearing, torsion, and expansion, as well as the type of grid interface.
Maclean, Carla L; Brimacombe, C A Elizabeth; Lindsay, D Stephen
2013-12-01
The current study addressed tunnel vision in industrial incident investigation by experimentally testing how a priori information and a human bias (generated via the fundamental attribution error or correspondence bias) affected participants' investigative behavior as well as the effectiveness of a debiasing intervention. Undergraduates and professional investigators engaged in a simulated industrial investigation exercise. We found that participants' judgments were biased by knowledge about the safety history of either a worker or piece of equipment and that a human bias was evident in participants' decision making. However, bias was successfully reduced with "tunnel vision education." Professional investigators demonstrated a greater sophistication in their investigative decision making compared to undergraduates. The similarities and differences between these two populations are discussed. PMID:24295062
A priori analysis: an application to the estimate of the uncertainty in course grades
NASA Astrophysics Data System (ADS)
Lippi, G. L.
2014-07-01
A priori analysis (APA) is discussed as a tool to assess the reliability of grades in standard curricular courses. This unusual, but striking, application is presented when teaching the section on the data treatment of a laboratory course to illustrate the characteristics of the APA and its potential for widespread use, beyond the traditional physics curriculum. The conditions necessary for this kind of analysis are discussed, the general framework is set out and a specific example is given to illustrate its various aspects. Students are often struck by this unusual application and are more apt to remember the APA. Instructors may also benefit from some of the gathered information, as discussed in the paper.
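The flavor of an a priori uncertainty estimate for a course grade can be shown with a simple error-propagation sketch. This is an illustration of standard propagation of independent uncertainties through a weighted mean, not the specific analysis in the paper; the scores, weights, and per-assessment uncertainties below are invented.

```python
import numpy as np

# Illustrative only: propagate assumed per-assessment uncertainties to the final
# course grade, treating the grade as a weighted mean of independent scores.
scores = np.array([78.0, 85.0, 70.0])     # exam, lab, homework (out of 100)
sigmas = np.array([4.0, 2.0, 5.0])        # assumed grading uncertainties
weights = np.array([0.5, 0.3, 0.2])       # course weighting, sums to 1

grade = np.sum(weights * scores)
# Variance of a weighted sum of independent variables: sum of w_i^2 * sigma_i^2.
grade_sigma = np.sqrt(np.sum(weights**2 * sigmas**2))
print(round(grade, 1), round(grade_sigma, 1))  # 78.5 2.3
```

The point of such an estimate, in the spirit of the abstract, is that the final grade carries a quantifiable uncertainty of a few points, which can be computed before any grading dispute arises.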
A Priori Bound on the Velocity in Axially Symmetric Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Lei, Zhen; Navas, Esteban A.; Zhang, Qi S.
2016-01-01
Let v be the velocity of Leray-Hopf solutions to the axially symmetric three-dimensional Navier-Stokes equations. Under suitable conditions on the initial values, we prove the following a priori bound: |v(x, t)| ≤ C |ln r|^{1/2}/r^2 for 0 < r ≤ 1/2, where r is the distance from x to the z axis, and C is a constant depending only on the initial value. This provides a pointwise upper bound (worst-case scenario) for possible singularities, while the recent papers (Chiun-Chuan et al., Commun PDE 34(1-3):203-232, 2009; Koch et al., Acta Math 203(1):83-105, 2009) gave a lower bound. The gap is of polynomial order 1, modulo a half log term.
SOLVING THE INTERIOR PROBLEM OF COMPUTED TOMOGRAPHY USING A PRIORI KNOWLEDGE
Courdurier, M.; Noo, F.; Defrise, M.; Kudo, H.
2008-01-01
The case of incomplete tomographic data for a compactly supported attenuation function is studied. When the attenuation function is a priori known in a subregion, we show that a reduced set of measurements is enough to uniquely determine the attenuation function over all the space. Furthermore, we found stability estimates showing that reconstruction can be stable near the region where the attenuation is known. These estimates also suggest that reconstruction stability collapses quickly when approaching the set of points that are viewed under less than 180 degrees. This paper may be seen as a continuation of the work “Truncated Hilbert transform and Image reconstruction from limited tomographic data” that was published in Inverse Problems in 2006. This continuation tackles new cases of incomplete data that could be of interest in applications of computed tomography. PMID:20613970
Rapid multi-wavelength optical assessment of circulating blood volume without a priori data
NASA Astrophysics Data System (ADS)
Loginova, Ekaterina V.; Zhidkova, Tatyana V.; Proskurnin, Mikhail A.; Zharov, Vladimir P.
2016-03-01
The measurement of circulating blood volume (CBV) is crucial in various medical conditions including surgery, iatrogenic problems, rapid fluid administration, transfusion of red blood cells, or trauma with extensive blood loss including battlefield injuries and other emergencies. Currently available commercial techniques are invasive and time-consuming for trauma situations. Recently, we have proposed high-speed multi-wavelength photoacoustic/photothermal (PA/PT) flow cytometry for in vivo CBV assessment with multiple dyes as PA contrast agents (labels). As a first step, we characterized the capability of this technique to monitor the clearance of three dyes (indocyanine green, methylene blue, and trypan blue) in an animal model. However, there is strong demand for improvements in PA/PT flow cytometry. As additional verification of the proof of concept of this technique, we performed optical photometric CBV measurements in vitro. Three label dyes (methylene blue, crystal violet and, partially, brilliant green) were selected for the simultaneous photometric determination of the components of their two-dye mixtures in circulating blood in vitro without any extra data (such as hemoglobin absorption) known a priori. Tests of single dyes and their mixtures in a flow system simulating a blood transfusion system showed a negligible difference between the sensitivities of the determination of these dyes under batch and flow conditions. For individual dyes, limits of detection of about 3×10⁻⁶ M in blood were achieved, which provided their continuous determination at a level of 10⁻⁵ M for CBV assessment without a priori data on the matrix. CBV assessments with errors no higher than 4% were obtained, and the possibility of applying the developed procedure to optical photometric flow cytometry with laser sources was shown.
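Determining the components of a two-dye mixture from two-wavelength absorbance readings reduces, in the simplest Beer-Lambert picture, to a 2x2 linear solve. The sketch below illustrates only that principle; the molar absorptivities and concentrations are made-up numbers, not measured dye data, and real whole-blood photometry would of course need the matrix corrections the abstract discusses.

```python
import numpy as np

# Two dyes measured at two wavelengths: absorbance A = E @ c (Beer-Lambert law,
# unit path length), so concentrations follow from a linear solve.
# The molar absorptivities below are invented illustrative numbers.
E = np.array([[6.0e4, 0.5e4],    # wavelength 1: dye 1, dye 2 (L/(mol*cm))
              [0.8e4, 4.0e4]])   # wavelength 2
c_true = np.array([2.0e-5, 1.0e-5])   # concentrations (mol/L)
A = E @ c_true                        # simulated absorbance readings

c_est = np.linalg.solve(E, A)         # recover both concentrations at once
print(np.allclose(c_est, c_true))  # True
```

Adding more wavelengths than dyes turns the square solve into a least-squares fit, which is how multi-wavelength photometry gains robustness against noise and unmodeled background absorption.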
A priori-defined Diet Quality Indexes and Risk of Type 2 diabetes: The Multiethnic Cohort
Jacobs, Simone; Harmon, Brook E.; Boushey, Carol J.; Morimoto, Yukiko; Wilkens, Lynne R.; Le Marchand, Loic; Kröger, Janine; Schulze, Matthias B.; Kolonel, Laurence N.; Maskarinec, Gertraud
2014-01-01
Aim: Dietary patterns have been associated with type 2 diabetes incidence, but little is known about the impact of ethnicity on this relation. This study evaluated the association of four a priori dietary quality indexes with type 2 diabetes risk among whites, Japanese Americans, and Native Hawaiians in the Hawaii component of the Multiethnic Cohort (MEC). Methods: After excluding participants with prevalent diabetes and missing values, the analysis included 89,185 participants (11,217 cases). Dietary intake was assessed at baseline with a quantitative food frequency questionnaire designed for use in the relevant ethnic populations. Sex- and ethnicity-specific hazard ratios were calculated for the Healthy Eating Index-2010 (HEI-2010), the alternative HEI-2010 (AHEI-2010), the alternate Mediterranean diet score (aMED), and the Dietary Approaches to Stop Hypertension (DASH) score. Results: We observed significant inverse associations between higher DASH index scores and type 2 diabetes risk in white men and women, as well as in Japanese American women and Native Hawaiian men, with respective risk reductions of 37, 31, 19, and 21% (highest compared with lowest index category). Higher adherence to the AHEI-2010 and aMED diets was related to a 13-28% lower type 2 diabetes risk in white participants but not in the other ethnic groups. No significant associations with type 2 diabetes risk were observed for the HEI-2010 index. Conclusions: The small ethnic differences in type 2 diabetes risk associated with scores of a priori-defined dietary patterns may be due to different consumption patterns of food components and the fact that the original indexes were not based on Asians and Pacific Islanders. PMID:25319012
A-Priori Rupture Models for Northern California Type-A Faults
Wills, Chris J.; Weldon, Ray J., II; Field, Edward H.
2008-01-01
This appendix describes how a-priori rupture models were developed for the northern California Type-A faults. As described in the main body of this report, and in Appendix G, "a-priori" models represent an initial estimate of the rate of single- and multi-segment surface ruptures on each fault. Whether or not a given model is moment balanced (i.e., satisfies section slip-rate data) depends on assumptions made regarding the average slip on each segment in each rupture (which in turn depends on the chosen magnitude-area relationship). Therefore, for a given set of assumptions, or branch on the logic tree, the methodology of the present Working Group (WGCEP-2007) is to find a final model that is as close as possible to the a-priori model, in the least-squares sense, but that also satisfies slip-rate and perhaps other data. This is analogous to the WGCEP-2002 approach of effectively voting on the relative rate of each possible rupture and then finding the closest moment-balanced model (under a more limiting set of assumptions than adopted by the present WGCEP, as described in detail in Appendix G). The 2002 Working Group Report (WGCEP, 2003, referred to here as WGCEP-2002) created segmented earthquake rupture forecast models for all faults in the region, including some that had been designated as Type B faults in the NSHMP, 1996, and one that had not previously been considered. The 2002 National Seismic Hazard Maps used the values from WGCEP-2002 for all the faults in the region, essentially treating all the listed faults as Type A faults. As discussed in Appendix A, the current WGCEP found that there are a number of faults with little or no data on slip per event or on the dates of previous earthquakes. As a result, the WGCEP recommends that faults with minimal available earthquake recurrence data (the Greenville, Mount Diablo, San Gregorio, Monte Vista-Shannon, and Concord-Green Valley) be modeled as Type B faults, to be consistent with similarly poorly known faults statewide.
Buijsse, Brian; Jacobs, David R.; Steffen, Lyn M.; Kromhout, Daan; Gross, Myron D.
2015-01-01
Vitamin C may reduce risk of hypertension, either in itself or by marking a healthy diet pattern. We assessed whether plasma ascorbic acid and the a priori diet quality score relate to incident hypertension and whether they explain each other’s predictive abilities. Data were from 2884 black and white adults (43% black, mean age 35 years) initially hypertension-free in the Coronary Artery Risk Development in Young Adults Study (study year 10, 1995–1996). Plasma ascorbic acid was assessed at year 10 and the diet quality score at year 7. Eight-hundred-and-forty cases of hypertension were documented between years 10 and 25. After multiple adjustments, each 12-point (1 SD) higher diet quality score at year 7 related to mean 3.7 μmol/L (95% CI 2.9 to 4.6) higher plasma ascorbic acid at year 10. In separate multiple-adjusted Cox regression models, the hazard ratio of hypertension per 19.6-μmol/L (1 SD) higher ascorbic acid was 0.85 (95% CI 0.79–0.92) and per 12-points higher diet score 0.86 (95% CI 0.79–0.94). These hazard ratios changed little with mutual adjustment of ascorbic acid and diet quality score for each other, or when adjusted for anthropometric variables, diabetes, and systolic blood pressure at year 10. Intake of dietary vitamin C and several food groups high in vitamin C content were inversely related to hypertension, whereas supplemental vitamin C was not. In conclusion, plasma ascorbic acid and the a priori diet quality score independently predict hypertension. This suggests that hypertension risk is reduced by improving overall diet quality and/or vitamin C status. The inverse association seen for dietary but not for supplemental vitamin C suggests that vitamin C status is preferably improved by eating foods rich in vitamin C, in addition to not smoking and other dietary habits that prevent ascorbic acid from depletion. PMID:26683190
On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori
Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer
2006-01-30
Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in biased predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for the Nevada and Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate among these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties.
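The weighting step described above (information-criterion values plus elicited priors giving posterior model probabilities, then a model-averaged prediction) can be sketched in a few lines. All numbers below are invented for illustration; they are not the study's KIC values, priors, or head predictions.

```python
import numpy as np

# Posterior model probabilities from an information criterion (labelled KIC here),
# combined with elicited prior probabilities; all values are illustrative only.
kic = np.array([210.4, 205.1, 215.8, 207.3, 206.0])   # one value per recharge model
prior = np.array([0.3, 0.2, 0.1, 0.2, 0.2])           # expert-elicited priors

delta = kic - kic.min()                    # differences to the best model
w = np.exp(-0.5 * delta) * prior           # likelihood-type weight times prior
posterior = w / w.sum()                    # normalized posterior probabilities

# Model-averaged prediction: weight each model's head prediction by its posterior.
heads = np.array([101.2, 100.7, 102.0, 100.9, 101.0])  # hypothetical heads (m)
mean = np.sum(posterior * heads)
var = np.sum(posterior * (heads - mean) ** 2)   # between-model variance term
print(posterior.round(3), round(mean, 2))
```

A full MLBMA variance would also add each model's within-model predictive variance to the between-model term shown here; the sketch keeps only the averaging mechanics.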
A Priori Attitudes Predict Amniocentesis Uptake in Women of Advanced Maternal Age: A Pilot Study.
Grinshpun-Cohen, Julia; Miron-Shatz, Talya; Rhee-Morris, Laila; Briscoe, Barbara; Pras, Elon; Towner, Dena
2015-01-01
Amniocentesis is an invasive procedure performed during pregnancy to determine, among other things, whether the fetus has Down syndrome. It is often preceded by screening, which gives a probabilistic risk assessment. Thus, ample information is conveyed to women with the goal to inform their decisions. This study examined the factors that predict amniocentesis uptake among pregnant women of advanced maternal age (older than 35 years old at the time of childbirth). Participants filled out a questionnaire regarding risk estimates, demographics, and attitudes on screening and pregnancy termination before their first genetic counseling appointment and were followed up to 24 weeks of gestation. Findings show that women's decisions are not always informed by screening results or having a medical indication. Psychological factors measured at the beginning of pregnancy: amniocentesis risk tolerance, pregnancy termination tolerance, and age risk perception affected amniocentesis uptake. Although most women thought that screening for Down syndrome risk would inform their decision, they later stated other reasons for screening, such as preparing for the possibility of a child with special needs. Findings suggest that women's decisions regarding amniocentesis are driven not only by medical factors, but also by a priori attitudes. The authors believe that these should be addressed in the dialogue on women's informed use of prenatal tests. PMID:26065331
NASA Astrophysics Data System (ADS)
Dekdouk, B.; Ktistis, C.; Yin, W.; Armitage, D. W.; Peyton, A. J.
2010-04-01
Magnetic induction tomography (MIT) is a non-invasive, contactless modality that could be capable of imaging the conductivity distribution of biological tissues. In this paper we consider the possibility of using absolute MIT voltage measurements for monitoring the progress of a peripheral hemorrhagic stroke in a human brain. The pathology is modelled as a local blood accumulation in the white matter. The solution of the MIT inverse problem is nonlinear and ill-posed and hence requires the use of a regularisation method. In this paper, we describe the construction and present the performance of a regularisation matrix based on a priori structural information of the head tissues obtained from a very recent MRI scan. The method takes the MRI scan as an initial state of the stroke and constructs a learning set containing the possible conductivity distributions of the current state of the stroke. These data are used to calculate an approximation of the covariance matrix, and then a subspace is constructed using principal component analysis (PCA). Simulations show that the method is capable of producing a more representative reconstruction of a stroke than smoothing Tikhonov regularisation in a simplified model of the head.
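The learning-set-to-subspace step described above can be illustrated with a small PCA sketch. This is not the paper's head model: the "conductivity maps" below are synthetic vectors built from a handful of random variation modes, standing in for the plausible stroke states generated from an MRI scan.

```python
import numpy as np

rng = np.random.default_rng(2)

# Build a PCA subspace from a learning set of plausible conductivity maps
# (here random perturbations of a baseline; purely illustrative).
n_pix, n_samples = 400, 60
baseline = np.ones(n_pix)
modes = rng.standard_normal((5, n_pix))            # 5 underlying variation modes
coeffs = rng.standard_normal((n_samples, 5))
learning_set = baseline + coeffs @ modes * 0.05    # one map per row

mean = learning_set.mean(axis=0)
X = learning_set - mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD of centered data
k = 5
basis = Vt[:k]                                     # principal subspace (k x n_pix)

# Any conductivity state drawn from the same distribution is captured by the subspace.
test_map = baseline + rng.standard_normal(5) @ modes * 0.05
proj = mean + basis.T @ (basis @ (test_map - mean))
print(np.linalg.norm(proj - test_map) / np.linalg.norm(test_map))
```

Restricting the reconstruction to such a subspace is what injects the MRI-derived structural prior: solutions are forced to look like members of the learning set rather than arbitrarily smooth images.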
DNS/LES of turbulent flow in a square duct: A priori evaluation of subgrid models
NASA Astrophysics Data System (ADS)
O'Sullivan, Peter L.; Biringen, Sedat; Huser, Asmund
We have performed a priori tests of two dynamic subgrid-scale (SGS) turbulence models using a highly resolved direct numerical simulation (DNS) database for the case of incompressible flow in a straight duct of square cross-section. The model testing is applied only to the homogeneous flow direction, where grid filtering can be applied without the introduction of commutation errors. The first model is the dynamic (Smagorinsky/eddy viscosity) SGS model (DSM) developed by Germano et al. [5], while the second is the dynamic two-parameter (mixed) model (DTM) developed by Salvetti and Banerjee [2]. As found in prior studies of this sort, there is a very poor correlation between the modelled and exact subgrid-scale dissipation in the case of the DSM. The DSM over-predicts subgrid-scale dissipation on average. Instantaneously, the model provides an inaccurate representation of subgrid-scale dissipation, in general underestimating the magnitude by approximately one order of magnitude. On the other hand, the DTM shows excellent agreement with the exact SGS dissipation over most of the duct cross-section, with a correlation coefficient of approximately 0.9.
A Priori Analyses of Three Subgrid-Scale Models for One-Parameter Families of Filters
NASA Technical Reports Server (NTRS)
Pruett, C. David; Adams, Nikolaus A.
1998-01-01
The decay of isotropic turbulence in a compressible flow is examined by direct numerical simulation (DNS). A priori analyses of the DNS data are then performed to evaluate three subgrid-scale (SGS) models for large-eddy simulation (LES): a generalized Smagorinsky model (M1), a stress-similarity model (M2), and a gradient model (M3). The models exploit one-parameter second- or fourth-order filters of Padé type, which permit the cutoff wavenumber k_c to be tuned independently of the grid increment Δx. The modeled (M) and exact (E) SGS stresses are compared component-wise by correlation coefficients of the form C(E, M) computed over the entire three-dimensional fields. In general, M1 correlates poorly against the exact stresses (C < 0.2), M3 correlates moderately well (C ≈ 0.6), and M2 correlates remarkably well (0.8 < C < 1.0). Specifically, correlations C(E, M2) are high provided the grid and test filters are of the same order. Moreover, the highest correlations (C ≈ 1.0) result whenever the grid and test filters are identical (in both order and cutoff). Finally, the present results reveal that the exact SGS stresses obtained with grid filters of differing orders are only moderately well correlated. Thus, in LES the model should not be specified independently of the filter.
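The correlation coefficients C(E, M) used in both SGS abstracts above are ordinary field-wide Pearson correlations between the exact and modeled stress components. The sketch below shows that computation on synthetic fields; the "exact" and "model" arrays are random stand-ins, not DNS data, with mixing ratios chosen only to mimic a well-correlated and a poorly correlated model.

```python
import numpy as np

rng = np.random.default_rng(3)

# A priori testing reduces to correlating the exact (filtered-DNS) SGS stress
# with a model's prediction, component by component, over the whole field.
def correlation(exact, model):
    e = exact.ravel() - exact.mean()
    m = model.ravel() - model.mean()
    return (e @ m) / np.sqrt((e @ e) * (m @ m))

exact = rng.standard_normal((32, 32, 32))          # stand-in stress component
good_model = 0.9 * exact + 0.4 * rng.standard_normal(exact.shape)
poor_model = 0.1 * exact + 1.0 * rng.standard_normal(exact.shape)

print(round(correlation(exact, good_model), 2),
      round(correlation(exact, poor_model), 2))
```

A value near 1 corresponds to the similarity-type behavior reported for M2/DTM, while a value near 0 corresponds to the eddy-viscosity-type behavior reported for M1/DSM.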
Optical diffraction tomography in fluid velocimetry: the use of a priori information
NASA Astrophysics Data System (ADS)
Lobera, J.; Coupland, J. M.
2008-07-01
Holographic particle image velocimetry (HPIV) has been used successfully to make three-dimensional, three-component flow measurements from holographic recordings of seeded fluid. It is clear that measurements can only be made in regions that contain particles, but simply adding more seeding results in poor-quality images due to the effects of multiple scattering. In this paper, we describe optical diffraction tomography (ODT) techniques and consider their use as a means to overcome the problems of multiple scattering in HPIV. We consider several approaches to tomographic reconstruction that are essentially based on linear and nonlinear combinations of holographic reconstructions of the scattered fields observed under varied illuminating conditions. We show that linear reconstruction provides images of highest fidelity, but none of these methods properly accounts for the effects of multiple scattering. We go on to consider nonlinear optimization methods in ODT that attempt to minimize the error between the scattered field computed from an estimate of the particle distribution and that measured in practice. We describe an optimization procedure based on the conjugate gradient method (CGM) that makes use of a priori information (the size and refractive index of the seeding particles) to effectively reduce the problem to that of finding the set of particle locations. Several 2D numerical experiments are presented, and some promising results are shown.
NASA Astrophysics Data System (ADS)
Guardia, M.; Kaloshin, V.; Zhang, J.
2016-07-01
In this paper we study the so-called separatrix map introduced by Zaslavskii-Filonenko (Sov Phys JETP 27:851-857, 1968) and studied by Treschev (Physica D 116(1-2):21-43, 1998; J Nonlinear Sci 12(1):27-58, 2002), Piftankin (Nonlinearity 19:2617-2644, 2006), and Piftankin and Treshchëv (Uspekhi Mat Nauk 62(2(374)):3-108, 2007). We derive a second-order expansion of this map for trigonometric perturbations. In Castejon et al. (Random iteration of maps of a cylinder and diffusive behavior. Preprint available at arXiv:1501.03319, 2015), Guardia and Kaloshin (Stochastic diffusive behavior through big gaps in a priori unstable systems (in preparation), 2015), and Kaloshin et al. (Normally hyperbolic invariant laminations and diffusive behavior for the generalized Arnold example away from resonances. Preprint available at http://www.terpconnect.umd.edu/vkaloshi/, 2015), applying the results of the present paper, we describe a class of nearly integrable deterministic systems with stochastic diffusive behavior.
A priori data-driven multi-clustered reservoir generation algorithm for echo state network.
Li, Xiumin; Zhong, Ling; Xue, Fangzheng; Zhang, Anguo
2015-01-01
Echo state networks (ESNs) with a multi-clustered reservoir topology perform better in reservoir computing and robustness than those with a random reservoir topology. However, these ESNs have a complex reservoir topology, which leads to difficulties in reservoir generation. This study focuses on the reservoir generation problem when an ESN is used in environments with sufficient a priori data available. Accordingly, an a priori data-driven multi-clustered reservoir generation algorithm is proposed. The a priori data in the proposed algorithm are used to evaluate reservoirs by calculating the precision and standard deviation of the ESNs. The reservoirs are produced using the clustering method; a new reservoir replaces the previous one only if it achieves a better evaluation performance. The final reservoir is obtained when its evaluation score reaches the preset requirement. Prediction experiments on the Mackey-Glass chaotic time series show that the proposed reservoir generation algorithm provides ESNs with extra prediction precision and increases the structural complexity of the network. Further experiments also reveal the appropriate values of the number of clusters and the time window size for optimal performance. The information entropy of the reservoir reaches its maximum when the ESN attains its greatest precision. PMID:25875296
A Priori Analysis of Flamelet-Based Modeling for a Dual-Mode Scramjet Combustor
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; McDaniel, James C.; Drozda, Tomasz G.; Lacaze, Guilhem; Oefelein, Joseph
2014-01-01
An a priori investigation of the applicability of flamelet-based combustion models to dual-mode scramjet combustion was performed utilizing Reynolds-averaged simulations (RAS). For this purpose, the HIFiRE Direct Connect Rig (HDCR) flowpath, fueled with a JP-7 fuel surrogate and operating in dual- and scram-mode, was considered. The chemistry of the JP-7 fuel surrogate was modeled using a 22-species, 18-step chemical reaction mechanism. Simulation results were compared to experimentally obtained, time-averaged wall pressure measurements to validate the RAS solutions. The analysis of the dual-mode operation of this flowpath showed regions of predominantly non-premixed, high-Damköhler-number combustion. Regions of premixed combustion were also present but were associated with only a small fraction of the total heat release in the flow. This is in contrast to scram-mode operation, where comparable amounts of heat are released from the non-premixed and premixed combustion modes. Representative flamelet boundary conditions were estimated by analyzing probability density functions of temperature and pressure for pure fuel and oxidizer conditions. The results of the present study reveal the potential for a flamelet model to accurately model the combustion processes in the HDCR and likely other high-speed flowpaths of engineering interest.
A quantum question order model supported by empirical tests of an a priori and precise prediction.
Wang, Zheng; Busemeyer, Jerome R
2013-10-01
Question order effects are commonly observed in self-report measures of judgment and attitude. This article develops a quantum question order model (the QQ model) to account for four types of question order effects observed in literature. First, the postulates of the QQ model are presented. Second, an a priori, parameter-free, and precise prediction, called the QQ equality, is derived from these mathematical principles, and six empirical data sets are used to test the prediction. Third, a new index is derived from the model to measure similarity between questions. Fourth, we show that in contrast to the QQ model, Bayesian and Markov models do not generally satisfy the QQ equality and thus cannot account for the reported empirical data that support this equality. Finally, we describe the conditions under which order effects are predicted to occur, and we review a broader range of findings that are encompassed by these very same quantum principles. We conclude that quantum probability theory, initially invented to explain order effects on measurements in physics, appears to be a powerful natural explanation for order effects of self-report measures in social and behavioral sciences, too. PMID:24027203
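The QQ equality tested above has a simple empirical form: across the two question orders, the total probability of the two "conflicting" answer pairs must be equal. The sketch below checks that equality on toy response counts; the counts are made up (and chosen to nearly satisfy the equality), not data from the six empirical sets in the paper.

```python
# QQ equality: p_AB(yes,no) + p_AB(no,yes) = p_BA(yes,no) + p_BA(no,yes),
# where each pair is (answer to question A, answer to question B) and the
# subscript gives the order in which the questions were asked.
ab = {("y", "y"): 410, ("y", "n"): 120, ("n", "y"): 180, ("n", "n"): 290}
ba = {("y", "y"): 350, ("y", "n"): 150, ("n", "y"): 155, ("n", "n"): 345}

def conflict_prob(counts):
    n = sum(counts.values())
    return (counts[("y", "n")] + counts[("n", "y")]) / n

q_ab, q_ba = conflict_prob(ab), conflict_prob(ba)
print(round(q_ab, 3), round(q_ba, 3), abs(q_ab - q_ba) < 0.01)
```

Because the equality is parameter-free, any data set can be checked this way directly; the paper's point is that Bayesian and Markov models generally violate it, while the quantum model predicts it exactly.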
Double-difference traveltime tomography with edge-preserving regularization and a priori interfaces
NASA Astrophysics Data System (ADS)
Lin, Youzuo; Syracuse, Ellen M.; Maceira, Monica; Zhang, Haijiang; Larmat, Carene
2015-05-01
Conventional traveltime seismic tomography methods with Tikhonov regularization (L2 norm) typically produce smooth models, but these models may be inappropriate when the subsurface structure contains discontinuous features, such as faults or fractures, indicating that tomographic models should contain sharp boundaries. For this reason, we develop a double-difference (DD) traveltime tomography method that uses a modified total-variation regularization scheme, incorporating a priori information on interfaces, to preserve sharp property contrasts and obtain accurate inversion results. To solve the inversion problem, we employ an alternating minimization method to decouple the original DD tomography problem into two separate subproblems: a conventional DD tomography with Tikhonov regularization and an L2 total-variation inversion. We use the LSQR linear solver for the Tikhonov inversion and the split-Bregman iterative method for the total-variation inversion. Through numerical examples, we show that our new DD tomography method yields more accurate results than the conventional DD tomography method at almost the same computational cost.
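The split-Bregman iteration named above can be illustrated on the simplest total-variation subproblem: 1D TV denoising of a piecewise-constant profile. This is a generic textbook sketch with an invented signal and hand-picked parameters, not the tomography code; it only shows why TV regularization preserves sharp jumps where Tikhonov smoothing would blur them.

```python
import numpy as np

rng = np.random.default_rng(4)

# 1D split-Bregman TV denoising: minimize 0.5*||u - f||^2 + lam*||D u||_1.
n = 200
clean = np.concatenate([np.zeros(80), np.ones(70), 0.4 * np.ones(50)])
f = clean + 0.1 * rng.standard_normal(n)           # noisy observation

D = np.diff(np.eye(n), axis=0)                     # forward-difference operator
lam, mu = 0.08, 1.0
u, d, b = f.copy(), np.zeros(n - 1), np.zeros(n - 1)
M = np.eye(n) + mu * D.T @ D                       # u-subproblem normal matrix
for _ in range(100):
    # Quadratic u-update: (I + mu*D'D) u = f + mu*D'(d - b)
    u = np.linalg.solve(M, f + mu * D.T @ (d - b))
    Du = D @ u
    # Shrinkage (soft-threshold) update for the auxiliary gradient variable d.
    d = np.sign(Du + b) * np.maximum(np.abs(Du + b) - lam / mu, 0.0)
    b = b + Du - d                                 # Bregman variable update

print(np.linalg.norm(u - clean) < np.linalg.norm(f - clean))  # True
```

In the tomography setting, the same shrinkage/Bregman structure handles the L1 term, while the quadratic u-update is replaced by the LSQR-solved Tikhonov subproblem.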
A priori least expected time paths in fuzzy, time-variant transportation networks
NASA Astrophysics Data System (ADS)
Wang, Li; Gao, Ziyou; Yang, Lixing
2016-02-01
Dynamics and fuzziness are two significant characteristics of real-world transportation networks. To capture these two features theoretically, this article proposes the concept of a fuzzy, time-variant network characterized by a series of time-dependent fuzzy link travel times. To find an effective route guidance for travelers, the expected travel time is specifically adopted as an evaluation criterion to assess the route generation process. Then the shortest path problem is formulated as a multi-objective 0-1 optimization model for finding the least expected time path over the considered time horizon. Different from the shortest path problem in dynamic and random networks, an efficient method is proposed in this article to calculate the fuzzy expected travel time for each given path. A tabu search algorithm is designed for the problem to generate the best solution under the framework of linear weighted methods. Finally, two numerical experiments are performed to verify the effectiveness and efficiency of the model and algorithm.
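As a sketch of how a fuzzy expected travel time can be evaluated along a given path, the following uses triangular fuzzy link times and the common (a + 2m + b)/4 expected value; the paper's exact criterion and its tabu search are not reproduced here, and the toy network is invented:

```python
def expected_tfn(tfn):
    """Expected value of a triangular fuzzy number (a, m, b).
    (a + 2m + b)/4 is one common defuzzification choice."""
    a, m, b = tfn
    return (a + 2 * m + b) / 4.0

def path_expected_time(path, link_times, t0=0):
    """Expected travel time of a path in a time-variant fuzzy network.
    link_times[(u, v)] is a list of TFNs indexed by integer departure time."""
    t = float(t0)
    for u, v in zip(path, path[1:]):
        profile = link_times[(u, v)]
        slot = min(int(t), len(profile) - 1)   # clamp to the last time slot
        t += expected_tfn(profile[slot])
    return t

# Toy network 0 -> 1 -> 2 with time-dependent fuzzy link travel times.
link_times = {
    (0, 1): [(1, 2, 3), (2, 3, 4)],            # depart at t=0 -> TFN (1,2,3)
    (1, 2): [(1, 1, 1), (1, 2, 3), (4, 5, 6)],
}
total = path_expected_time([0, 1, 2], link_times, t0=0)
```

Because the second link is evaluated at the arrival time of the first (t = 2 here), the same path can have very different expected times for different departure times, which is what makes the time-variant problem harder than a static shortest path.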
NASA Astrophysics Data System (ADS)
Thayer, David; Liu, Ning; Unlu, Burcin; Chen, Jeon-Hor; Su, Min-Ying; Nalcioglu, Orhan; Gulsen, Gultekin
2010-02-01
Breast cancer is a significant cause of mortality and morbidity among women, with early diagnosis being vital to successful treatment. Diffuse Optical Tomography (DOT) is an emerging medical imaging modality that provides information complementary to current screening modalities such as MRI and mammography, and may improve specificity in determining cancer malignancy. Using high-resolution anatomic images as a priori information improves the accuracy of DOT. Measurements are presented characterizing the performance of our system. Preliminary data are also shown illustrating the use of a priori MRI data in phantom studies.
A priori noise and regularization in least squares collocation of gravity anomalies
NASA Astrophysics Data System (ADS)
Jarmołowski, Wojciech
2013-12-01
The paper describes the estimation of covariance parameters in least squares collocation (LSC) by the cross-validation (CV) technique called leave-one-out (LOO). Two parameters of the Gauss-Markov third order model (GM3) are estimated together with the a priori noise standard deviation, which contributes significantly to the covariance matrix composed of the signal and noise. Numerical tests are performed using a large set of Bouguer gravity anomalies located in the central part of the U.S. Around 103,000 gravity stations are available in the selected area. This dataset, together with regular grids generated from the EGM2008 geopotential model, gives an opportunity to work with various spatial resolutions of the data and heterogeneous variances of the signal and noise. This plays a crucial role in the numerical investigations, because the spatial resolution of the gravity data determines the number of gravity details that we may observe and model. This establishes a relation between the spatial resolution of the data and the resolution of the gravity field model. This relation is inspected in the article and compared to the regularization problem occurring frequently in data modeling.
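The LOO idea can be sketched compactly: for each candidate pair of covariance parameters, the leave-one-out residuals of collocation follow from a standard matrix identity, and the parameters minimizing the LOO error are selected. This illustration substitutes a Gaussian covariance for the paper's GM3 model, and the data are synthetic:

```python
import numpy as np

def loo_rms(x, y, corr_len, noise_sd):
    """Leave-one-out RMS of least-squares collocation / simple kriging
    with a Gaussian covariance (a stand-in for the GM3 model)."""
    C = np.exp(-(x[:, None] - x[None, :])**2 / (2 * corr_len**2))
    K = C + noise_sd**2 * np.eye(len(x))
    Kinv = np.linalg.inv(K)
    # Classic identity: LOO residual_i = (Kinv y)_i / (Kinv)_ii
    resid = (Kinv @ y) / np.diag(Kinv)
    return np.sqrt(np.mean(resid**2))

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 60))
y = np.sin(x) + 0.1 * rng.standard_normal(60)

# Grid-search the correlation length and a priori noise by LOO.
grid = [(cl, ns) for cl in (0.1, 0.5, 1.0, 2.0) for ns in (0.01, 0.1, 0.5)]
best = min(grid, key=lambda p: loo_rms(x, y, *p))
```

The identity avoids refitting the model once per left-out point, which matters for the ~103,000-station datasets the abstract mentions.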
An a priori DNS study of the shadow-position mixing model
Zhao, Xin -Yu; Bhagatwala, Ankit; Chen, Jacqueline H.; Haworth, Daniel C.; Pope, Stephen B.
2016-01-15
The modeling of mixing by molecular diffusion is a central aspect of transported probability density function (tPDF) methods. In this paper, the newly proposed shadow position mixing model (SPMM) is examined, using a DNS database for a temporally evolving di-methyl ether slot jet flame. Two methods that invoke different levels of approximation are proposed to extract the shadow displacement (equivalent to shadow position) from the DNS database. An approach for a priori analysis of the mixing-model performance is developed. The shadow displacement is highly correlated with both mixture fraction and velocity, and the peak correlation coefficient of the shadow displacement and mixture fraction is higher than that of the shadow displacement and velocity. This suggests that the composition-space localness is reasonably well enforced by the model, with appropriate choices of model constants. The conditional diffusion of mixture fraction and major species from DNS and from SPMM are then compared, using mixing rates that are derived by matching the mixture fraction scalar dissipation rates. Good qualitative agreement is found for the locations of zero and maximum/minimum conditional diffusion for mixture fraction and individual species. Similar comparisons are performed for DNS and the IECM (interaction by exchange with the conditional mean) model. The agreement between SPMM and DNS is better than that between IECM and DNS, in terms of conditional diffusion iso-contour similarities and global normalized residual levels. It is found that a suitable value for the model constant c that controls the mixing frequency can be derived using the local normalized scalar variance, and that the model constant a controls the localness of the model. A higher-Reynolds-number test case is anticipated to be more appropriate to evaluate the mixing models, and stand-alone transported PDF simulations are required to more fully enforce
Quantifying the sensitivity of aerosol optical depths retrieved from MSG SEVIRI to a priori data
NASA Astrophysics Data System (ADS)
Bulgin, C. E.; Palmer, P. I.; Merchant, C. J.; Siddans, R.; Poulsen, C.; Grainger, R. G.; Thomas, G.; Carboni, E.; McConnell, C.; Highwood, E.
2009-12-01
Radiative forcing contributions from aerosol direct and indirect effects remain one of the most uncertain components of the climate system. Satellite observations of aerosol optical properties offer important constraints on atmospheric aerosols, but their sensitivity to prior assumptions must be better characterized before they are used effectively to reduce uncertainty in aerosol radiative forcing. We assess the sensitivity of the Oxford-RAL Aerosol and Cloud (ORAC) optimal estimation retrieval of aerosol optical depth (AOD) from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) to a priori aerosol data. SEVIRI is a geostationary satellite instrument centred over Africa and the neighbouring Atlantic Ocean, routinely sampling desert dust and biomass burning outflow from Africa. We quantify the uncertainty in SEVIRI AOD retrievals in the presence of desert dust by comparing retrievals that use prior information from the Optical Properties of Aerosol and Cloud (OPAC) database with those that use aerosol properties measured during the Dust Outflow and Deposition to the Ocean (DODO) aircraft campaign (August 2006). We also assess the sensitivity of retrieved AODs to changes in solar zenith angle, and to the vertical profiles of aerosol effective radius and extinction coefficient input into the retrieval forward model. Currently the ORAC retrieval scheme retrieves AODs for five aerosol types (desert dust, biomass burning, maritime, urban and continental) and chooses the most appropriate AOD based on the cost functions. We generate an improved prior aerosol speciation database for SEVIRI based on a statistical analysis of a Saharan Dust Index (SDI), determined using variances of different brightness temperatures, and organic and black carbon tracers from the GEOS-Chem chemistry transport model. This database is described as a function of season and time of day. We quantify the difference in AODs between those chosen based on prior information from the SDI and GEOS-Chem.
Assessing the benefit of 3D a priori models for earthquake location
NASA Astrophysics Data System (ADS)
Tilmann, F. J.; Manzanares, A.; Peters, K.; Kahle, R. L.; Lange, D.; Saul, J.; Nooshiri, N.
2014-12-01
Earthquake location in 1D Earth models is a routine procedure. Particularly in environments such as subduction zones, where the network geometry is biased and lateral velocity variations are large, the use of a 1D model can lead to strongly biased solutions. This is well known, and it is therefore usually preferred to use three-dimensional models, e.g. from local earthquake tomography. Efficient codes for earthquake location in 3D models, for example NonLinLoc, are available for routine use. However, tomographic studies are time-consuming to carry out, and a sufficient number of data might not always be available. In many cases, though, information about the three-dimensional velocity structure is available in the form of refraction surveys or other constraints such as gravity- or receiver-function-based models. Failing that, global or regional scale crustal models could be employed. Yet it is not obvious that models derived from different types of data lead to better location results than an optimised 1D velocity model. On the other hand, correct interpretation of seismicity patterns often requires comparison and exact positioning within pre-existing velocity models. In this presentation we draw on examples from the Chilean and Sumatran margins as well as mid-ocean ridge environments, using both data and synthetic examples, to investigate under what conditions the use of a priori 3D models can be expected to improve location results and modify interpretations. Furthermore, we introduce MATLAB tools that facilitate the creation of three-dimensional models suitable for earthquake location from refraction profiles, CRUST1.0, SLAB1.0 and other model types.
Inversion of EM38 data measured in different heights using a-priori information for stabilization
NASA Astrophysics Data System (ADS)
Wunderlich, Tina; Rabbel, Wolfgang
2010-05-01
Within the framework of the iSOIL project, apparent conductivity measurements using the EM38DD (Geonics) have been conducted on different soil types. The EM38DD is mounted at different heights on a metal-free sled and pulled behind a tractor with an inline sampling distance of 20 cm and a profile offset of 1 m. The apparent conductivities have been inverted into real conductivities over the whole measured area. To better condition the equation system and avoid singular matrices, four measurements (vertical and horizontal mode at two different heights) at one location are used to determine the conductivities of two layers and the depth of the interface between the layers. The inversion is stabilized by weighted a-priori information for both conductivities and the depth, and by the inclusion of neighboring points. Depth information is gained from GPR measurements over the same area, acquired in one survey together with the EM38DD measurements. The inversion results are compared to results of 1D and 2D electrical resistivity imaging using optimized and Schlumberger configurations. Principal Component Analysis is used to compare modeled and measured data, and correlation coefficients between them are calculated to evaluate the reliability of the inversion. Acknowledgement: iSOIL (Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping) is a Collaborative Project (Grant Agreement number 211386) co-funded by the Research DG of the European Commission within the RTD activities of the FP7 Thematic Priority Environment.
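The weighted a-priori stabilization can be sketched as a linear(ized) least-squares problem with a diagonal prior-weight matrix; the kernel, parameter values, and prior below are invented stand-ins for the EM38DD two-layer geometry:

```python
import numpy as np

def stabilized_inversion(G, d, m_prior, weights):
    """Weighted a-priori (Tikhonov-style) stabilization of a linearized
    inversion: minimize ||G m - d||^2 + sum_i w_i (m_i - m_prior_i)^2.
    Normal equations: (G^T G + W) m = G^T d + W m_prior."""
    W = np.diag(weights)
    return np.linalg.solve(G.T @ G + W, G.T @ d + W @ m_prior)

# Toy 2-layer model m = [sigma1, sigma2, depth-proxy]; four "measurements"
# (vertical/horizontal mode at two heights) with a near-collinear kernel.
G = np.array([[1.0, 0.5, 0.1],
              [0.9, 0.6, 0.1],
              [0.5, 1.0, 0.2],
              [0.4, 1.1, 0.2]])
m_true = np.array([20.0, 35.0, 1.0])     # e.g. mS/m, mS/m, m
d = G @ m_true
m_prior = np.array([18.0, 30.0, 1.2])    # e.g. from GPR depth information
m_est = stabilized_inversion(G, d, m_prior, weights=[0.1, 0.1, 10.0])
```

Heavily weighting the depth parameter toward its GPR-derived prior (the last weight) mimics the stabilization described above, while the lightly weighted conductivities remain driven by the data.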
Lost in space: Onboard star identification using CCD star tracker data without an a priori attitude
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor A.; Tolson, Robert H.
1993-01-01
There are many algorithms in use today which determine spacecraft attitude by identifying stars in the field of view of a star tracker. Some methods, which date from the early 1960's, compare the angular separation between observed stars with a small catalog. In the last 10 years, several methods have been developed which speed up the process and reduce the amount of memory needed, a key element for onboard attitude determination. However, each of these methods requires some a priori knowledge of the spacecraft attitude. Although the Sun and magnetic field generally provide the necessary coarse attitude information, there are occasions when a spacecraft could become lost and it is not prudent to wait for sunlight. Also, the possibility of efficient attitude determination using only the highly accurate CCD star tracker could lead to fully autonomous spacecraft attitude determination. The need for redundant coarse sensors could thus be eliminated at substantial cost reduction. Some groups have extended their algorithms to implement a computation-intensive full-sky scan. Some require large databases. Both storage and speed are concerns for autonomous onboard systems. Neural network technology is even being explored by some as a possible solution, but because of the limited number of patterns that can be stored and the large overhead, nothing concrete has resulted from these efforts. This paper presents an algorithm which, by discretizing the sky and filtering by the visual magnitude of the brightest observed star, speeds up the lost-in-space star identification process while reducing the amount of onboard computer storage required compared to existing techniques.
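The core lost-in-space primitive, matching observed angular separations against a star catalog, can be sketched as follows; the magnitude filtering and sky discretization the paper uses to prune this search are omitted, and the four-star catalog is a toy:

```python
import numpy as np

def ang_sep(u, v):
    """Angular separation (radians) between two unit vectors."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

def identify_pair(obs_a, obs_b, catalog, tol=1e-3):
    """Match one observed star pair to catalog pairs by angular separation
    alone; angular separation is rotation-invariant, so no a priori
    attitude is needed. Returns all (i, j) candidate catalog pairs."""
    sep = ang_sep(obs_a, obs_b)
    hits = []
    for i in range(len(catalog)):
        for j in range(i + 1, len(catalog)):
            if abs(ang_sep(catalog[i], catalog[j]) - sep) < tol:
                hits.append((i, j))
    return hits

def unit(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

catalog = [unit(v) for v in [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0)]]
# "Observed" stars are catalog stars 0 and 3 (an arbitrary attitude would
# rotate both vectors equally and leave their separation unchanged).
hits = identify_pair(catalog[0], catalog[3], catalog, tol=1e-6)
```

Note the ambiguity (two candidate pairs share the same 45 degree separation); real algorithms resolve this with additional stars, magnitude information, or discretized sky cells, which is exactly the speed-up the abstract describes.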
FORTRAN IV Program for One-Way Analysis of Variance with A Priori or A Posteriori Mean Comparisons
ERIC Educational Resources Information Center
Fordyce, Michael W.
1977-01-01
A flexible Fortran program for computing one way analysis of variance is described. Requiring minimal core space, the program provides a variety of useful group statistics, all summary statistics for the analysis, and all mean comparisons for a priori or a posteriori testing. (Author/JKS)
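The computation performed by such a program is compact; a minimal sketch of the one-way ANOVA F statistic (group statistics and the a priori/a posteriori mean comparisons are omitted) is:

```python
def one_way_anova(groups):
    """One-way ANOVA F statistic from a list of groups (lists of floats),
    mirroring the core computation of the Fortran program."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    F = (ss_between / df_b) / (ss_within / df_w)
    return F, df_b, df_w

F, df_b, df_w = one_way_anova([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [5.0, 6.0, 7.0]])
```

For the toy data the group means are 2, 3, and 6, giving F = 13 on (2, 6) degrees of freedom; a mean-comparison step (e.g. a priori contrasts or a posteriori Tukey tests) would then localize which groups differ.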
ERIC Educational Resources Information Center
Schmidt, Sebastian; Troge, Thomas A.; Lorrain, Denis
2013-01-01
A theory of listening to music is proposed. It suggests that, for listeners, the process of prediction is the starting point to experiencing music. This implies that perception of music starts through both a predisposed and an experience-based extrapolation into the future (this is labeled "a priori" listening). Indications for this…
The origin of anomalous transport in porous media - is it possible to make a priori predictions?
NASA Astrophysics Data System (ADS)
Bijeljic, Branko; Blunt, Martin
2013-04-01
at approximately the average flow speed; in the carbonate with the widest velocity distribution the stagnant concentration peak is persistent, while the emergence of a smaller secondary mobile peak is observed, leading to highly anomalous behavior. This defines the generically different nature of non-Fickian transport in the three media and quantifies the effect of pore structure on transport. Moreover, the propagators obtained by the model are in very good agreement with the propagators measured on beadpack, Bentheimer sandstone and Portland carbonate cores in nuclear magnetic resonance experiments. These findings demonstrate that it is possible to make a priori predictions of anomalous transport in porous media. The importance of these findings for transport in complex carbonate rock micro-CT images is discussed, classifying them in terms of the degree of anomalous transport that can have an impact at the field scale. Extensions to reactive transport will also be discussed.
NASA Astrophysics Data System (ADS)
Bellingeri, Michele; Agliari, Elena; Cassi, Davide
2015-10-01
The best strategy to immunize a complex network is usually evaluated in terms of the percolation threshold, i.e. the number of vaccine doses which make the largest connected cluster (LCC) vanish. The strategy inducing the minimum percolation threshold represents the optimal way to immunize the network. Here we show that the efficacy of the immunization strategies can change during the immunization process. This means that, if the number of doses is limited, the best strategy is not necessarily the one leading to the smallest percolation threshold. This outcome should warn about the adoption of global measures in order to evaluate the best immunization strategy.
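The quantity being tracked, the size of the largest connected cluster (LCC) after a given number of vaccine doses, can be sketched with a breadth-first search; the toy network below is invented, and degree-targeted removal stands in for one candidate strategy:

```python
from collections import deque

def lcc_size(adj, removed):
    """Size of the largest connected cluster after removing a node set."""
    seen, best = set(removed), 0
    for s in adj:
        if s in seen:
            continue
        q, comp = deque([s]), 0
        seen.add(s)
        while q:                      # BFS over one component
            u = q.popleft()
            comp += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, comp)
    return best

def immunize(adj, order, doses):
    """LCC size after vaccinating the first `doses` nodes of a strategy."""
    return lcc_size(adj, set(order[:doses]))

# Toy network: a star (hub 0) bridged to a triangle via node 4.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0, 5], 5: [4, 6, 7],
       6: [5, 7], 7: [5, 6]}
degree_order = sorted(adj, key=lambda u: -len(adj[u]))  # hubs first
```

Evaluating `immunize` dose by dose gives the LCC-versus-doses curve whose behavior the abstract discusses: the ranking of strategies at few doses need not match their ranking at the percolation threshold.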
Towards a 4/3 Approximation for the Asymmetric Traveling Salesman Problem
Carr, Robert; Vempala, Santosh
1999-06-10
A long-standing conjecture in combinatorial optimization says that the integrality gap of the famous Held-Karp relaxation of the symmetric TSP is precisely 4/3. In this paper, we show that a slight strengthening of this conjecture implies a tight 4/3 integrality gap for a linear programming relaxation of the asymmetric TSP. This is surprising since no constant-factor approximation is known for the latter problem. Our main tools are a new characterization of the integrality gap for linear objective functions over polyhedra, and the isolation of ''hard-to-round'' solutions of the relaxations.
Study of alternate strategies for a branch and cut code for the symmetric traveling salesman problem
Clochard, J.M.; Naddef, D.
1994-12-31
In this talk, we will report on experimental results obtained in comparing various strategies for a Branch and Cut code for the STSP. In particular we will address the following topics: choice of initial set of variables: k-neighboring graph (k = 3, 5, 7) vs 2-closure of the Delaunay triangulation, choice of inequalities: comb vs path inequalities, branching rules: branching on the variables vs branching on sub-tour elimination constraints, effort spent on choosing the branching variable or inequality.
NASA Astrophysics Data System (ADS)
Niazi, M. Khalid Khan; Hemminger, Jessica; Kurt, Habibe; Lozanski, Gerard; Gurcan, Metin
2014-03-01
Vascularity represents an important element of the tissue/tumor microenvironment and is implicated in tumor growth, metastatic potential and resistance to therapy. Small blood vessels can be visualized using immunohistochemical stains specific to vascular cells. However, currently used manual methods to assess vascular density are poorly reproducible and at best semi-quantitative. Computer-based quantitative and objective methods to measure microvessel density are urgently needed to better understand and clinically utilize microvascular density information. We propose a new method to quantify vascularity from images of bone marrow biopsies stained for the CD34 vascular lining cell protein as a model. The method starts by automatically segmenting the blood vessels by methods of maxlink thresholding and minimum graph cuts. The segmentation is followed by morphological post-processing to remove blasts and small spurious objects from the bone marrow images. To classify the images into one of four grades, we extracted 20 features from the segmented blood vessel images. These features include the first four moments of the distribution of the area of blood vessels, and the first four moments of the distributions of 1) the edge weights in the minimum spanning tree of the blood vessels, 2) the shortest distance between blood vessels, 3) the homogeneity of the shortest distance (absolute difference in distance between consecutive blood vessels along the shortest path) between blood vessels and 4) blood vessel orientation. The method was tested on 26 bone marrow biopsy images stained with the CD34 IHC stain, which were evaluated by three pathologists. The pathologists took part in this study by quantifying blood vessel density using gestalt assessment in hematopoietic bone marrow portions of bone marrow core biopsy images. To determine the intra-reader variability, each image was graded twice by each pathologist with a two-week interval between their readings.
For each image, the ground truth (grade) was acquired through consensus among the three pathologists at the end of the study. A ranking of the features reveals that the fourth moment of the distribution of the area of blood vessels along with the first moment of the distribution of the shortest distance between blood vessels can correctly grade 68.2% of the bone marrow biopsies, while the intra- and inter-reader variability among the pathologists are 66.9% and 40.0%, respectively.
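Several of the 20 features are moments of distributions derived from the segmented vessels, e.g. the minimum-spanning-tree edge weights. A hedged sketch of that feature computation on vessel centroids (Prim's algorithm plus sample moments; the point set is invented, and the moment definitions here are population-form stand-ins):

```python
import math

def mst_edge_weights(points):
    """Prim's algorithm over 2D points (e.g. blood-vessel centroids);
    returns the list of MST edge lengths."""
    n = len(points)
    dist = [math.inf] * n
    dist[0] = 0.0
    in_tree = [False] * n
    weights = []
    for step in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        if step > 0:
            weights.append(dist[u])   # edge that attached u to the tree
        for v in range(n):
            if not in_tree[v]:
                dist[v] = min(dist[v], math.dist(points[u], points[v]))
    return weights

def first_four_moments(xs):
    """Mean, variance, skewness, kurtosis of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = math.sqrt(var) if var > 0 else 1.0
    skew = sum(((x - mean) / sd) ** 3 for x in xs) / n
    kurt = sum(((x - mean) / sd) ** 4 for x in xs) / n
    return mean, var, skew, kurt

pts = [(0, 0), (1, 0), (1, 1), (3, 1)]
w = mst_edge_weights(pts)
feature_vector = first_four_moments(w)
```

The resulting four numbers per distribution are the kind of scalar features the classifier in the paper consumes.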
Schroeder, Alex
2015-11-01
The Connected Traveler project is a multi-disciplinary undertaking that seeks to validate the potential for transformative transportation-system energy savings by incentivizing efficient traveler behavior. This poster outlines various aspects of the Connected Traveler project, including market opportunity, understanding traveler behavior and decision-making, automation and connectivity, and a projected timeline for Connected Traveler's key milestones.
NASA Technical Reports Server (NTRS)
Bey, Kim S.; Oden, J. Tinsley
1993-01-01
A priori error estimates are derived for hp-versions of the finite element method for discontinuous Galerkin approximations of a model class of linear, scalar, first-order hyperbolic conservation laws. These estimates are derived in a mesh dependent norm in which the coefficients depend upon both the local mesh size h(sub K) and a number p(sub k) which can be identified with the spectral order of the local approximations over each element.
Wiuf, Carsten; Schaumburg-Müller Pallesen, Jonatan; Foldager, Leslie; Grove, Jakob
2016-08-01
In many areas of science it is customary to perform many tests, potentially millions, simultaneously. To gain statistical power it is common to group tests based on a priori criteria such as predefined regions or by sliding windows. However, it is not straightforward to choose grouping criteria, and the results might depend on the chosen criteria. Methods that summarize, or aggregate, test statistics or p-values without relying on a priori criteria are therefore desirable. We present a simple method to aggregate a sequence of stochastic variables, such as test statistics or p-values, into fewer variables without assuming a priori defined groups. We provide different ways to evaluate the significance of the aggregated variables based on theoretical considerations and resampling techniques, and show that under certain assumptions the FWER is controlled in the strong sense. Validity of the method was demonstrated using simulations and real data analyses. Our method may be a useful supplement to standard procedures relying on evaluation of test statistics individually. Moreover, by being agnostic and not relying on predefined selected regions, it might be a practical alternative to conventionally used methods of aggregation of p-values over regions. The method is implemented in Python and freely available online (through GitHub, see the Supplementary information). PMID:27269897
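For contrast with the authors' agnostic approach, the conventional region-based aggregation they mention can be sketched with Fisher's method over a predefined group of p-values (this is the baseline being contrasted, not the paper's method):

```python
import math

def fisher_combine(pvals):
    """Fisher's method: -2 * sum(log p) ~ chi-square with 2k dof
    under the joint null of k independent tests."""
    return -2.0 * sum(math.log(p) for p in pvals)

def chi2_sf_even(x, dof):
    """Upper-tail probability of a chi-square with EVEN dof, via the
    closed form exp(-x/2) * sum_{i<k} (x/2)^i / i!, k = dof/2."""
    k = dof // 2
    term, s = 1.0, 0.0
    for i in range(k):
        s += term
        term *= (x / 2.0) / (i + 1)
    return math.exp(-x / 2.0) * s

# Combine the p-values of one predefined region into a single p-value.
region = [0.04, 0.20, 0.01]
combined_p = chi2_sf_even(fisher_combine(region), 2 * len(region))
```

The weakness motivating the paper is visible here: the answer depends entirely on how `region` was chosen in advance.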
NASA Astrophysics Data System (ADS)
Luo, Li; Mountrakis, Giorgos
2011-09-01
A classification model was demonstrated that explored spectral and spatial contextual information from previously classified neighbors to improve classification of remaining unclassified pixels. The classification was composed of two major steps, the a priori and the a posteriori classifications. The a priori algorithm classified the less difficult image portion. The a posteriori classifier operated on the more challenging image parts and strived to enhance accuracy by converting classified information from the a priori process into specific knowledge. The novelty of this work relies on the substitution of image-wide information with local spectral representations and spatial correlations, in essence classifying each pixel using exclusively neighboring behavior. Furthermore, the a posteriori classifier is a simple and intuitive algorithm, adjusted to perform in a localized setting for the task requirements. A 2001 and a 2006 Landsat scene from Central New York were used to assess the performance on an impervious classification task. The proposed method was compared with a back-propagation neural network. Kappa statistic values in the corresponding applicable datasets increased from 18.67 to 24.05 for the 2006 scene, and from 22.92 to 35.76 for the 2001 scene classification, mostly correcting misclassifications between impervious and soil pixels. This finding suggests that simple classifiers have the ability to surpass complex classifiers through incorporation of partial results and an elegant multi-process framework.
ERIC Educational Resources Information Center
Zorn, Theodore E.
1991-01-01
Discusses how the movie "Death of a Salesman" (a 1986 movie starring Dustin Hoffman in the role of Willy Loman) is useful for teaching communication concepts. Examines how the movie provides a rich case study for illustrating the negotiation of identities. (KEH)
Jones, Sarah E; Dutschmann, Mathias
2016-05-01
Degeneracy of respiratory network function would imply that anatomically discrete aspects of the brain stem are capable of producing respiratory rhythm. To test this theory we a priori transected brain stem preparations before reperfusion and reoxygenation at 4 rostrocaudal levels: 1.5 mm caudal to obex (n = 5), at obex (n = 5), and 1.5 (n = 7) and 3 mm (n = 6) rostral to obex. The respiratory activity of these preparations was assessed via recordings of phrenic and vagal nerves and lumbar spinal expiratory motor output. Preparations with a priori transection at level of the caudal brain stem did not produce stable rhythmic respiratory bursting, even when the arterial chemoreceptors were stimulated with sodium cyanide (NaCN). Reperfusion of brain stems that preserved the pre-Bötzinger complex (pre-BötC) showed spontaneous and sustained rhythmic respiratory bursting at low phrenic nerve activity (PNA) amplitude that occurred simultaneously in all respiratory motor outputs. We refer to this rhythm as the pre-BötC burstlet-type rhythm. Conserving circuitry up to the pontomedullary junction consistently produced robust high-amplitude PNA at lower burst rates, whereas sequential motor patterning across the respiratory motor outputs remained absent. Some of the rostrally transected preparations expressed both burstlet-type and regular PNA amplitude rhythms. Further analysis showed that the burstlet-type rhythm and high-amplitude PNA had 1:2 quantal relation, with burstlets appearing to trigger high-amplitude bursts. We conclude that no degenerate rhythmogenic circuits are located in the caudal medulla oblongata and confirm the pre-BötC as the primary rhythmogenic kernel. The absence of sequential motor patterning in a priori transected preparations suggests that pontine circuits govern respiratory pattern formation. PMID:26888109
Aw, Brian; Boraston, Suni; Botten, David; Cherniwchan, Darin; Fazal, Hyder; Kelton, Timothy; Libman, Michael; Saldanha, Colin; Scappatura, Philip; Stowe, Brian
2014-01-01
Objective: To define the practice of travel medicine, provide the basics of a comprehensive pretravel consultation for international travelers, and assist in identifying patients who might require referral to travel medicine professionals. Sources of information: Guidelines and recommendations on travel medicine and travel-related illnesses by national and international travel health authorities were reviewed. MEDLINE and EMBASE searches for related literature were also performed. Main message: Travel medicine is a highly dynamic specialty that focuses on pretravel preventive care. A comprehensive risk assessment for each individual traveler is essential in order to accurately evaluate traveler-, itinerary-, and destination-specific risks, and to advise on the most appropriate risk management interventions to promote health and prevent adverse health outcomes during travel. Vaccinations might also be required and should be personalized according to the individual traveler's immunization history, travel itinerary, and the amount of time available before departure. Conclusion: A traveler's health and safety depend on a practitioner's level of expertise in providing pretravel counseling and vaccinations, if required. Those who advise travelers are encouraged to be aware of the extent of this responsibility and to refer all high-risk travelers to travel medicine professionals whenever possible. PMID:25500599
Chen, Charles; DeClerck, Genevieve; Tian, Feng; Spooner, William; McCouch, Susan; Buckler, Edward
2012-01-01
PICARA is an analytical pipeline designed to systematically summarize observed SNP/trait associations identified by genome wide association studies (GWAS) and to identify candidate genes involved in the regulation of complex trait variation. The pipeline provides probabilistic inference about a priori candidate genes using integrated information derived from genome-wide association signals, gene homology, and curated gene sets embedded in pathway descriptions. In this paper, we demonstrate the performance of PICARA using data for flowering time variation in maize – a key trait for geographical and seasonal adaption of plants. Among 406 curated flowering time-related genes from Arabidopsis, we identify 61 orthologs in maize that are significantly enriched for GWAS SNP signals, including key regulators such as FT (Flowering Locus T) and GI (GIGANTEA), and genes centered in the Arabidopsis circadian pathway, including TOC1 (Timing of CAB Expression 1) and LHY (Late Elongated Hypocotyl). In addition, we discover a regulatory feature that is characteristic of these a priori flowering time candidates in maize. This new probabilistic analytical pipeline helps researchers infer the functional significance of candidate genes associated with complex traits and helps guide future experiments by providing statistical support for gene candidates based on the integration of heterogeneous biological information. PMID:23144785
NASA Technical Reports Server (NTRS)
Quinlan, Jesse R.; Drozda, Tomasz G.; McDaniel, James C.; Lacaze, Guilhem; Oefelein, Joseph
2015-01-01
In an effort to make large eddy simulation of hydrocarbon-fueled scramjet combustors more computationally accessible using realistic chemical reaction mechanisms, a compressible flamelet/progress variable (FPV) model was proposed that extends current FPV model formulations to high-speed, compressible flows. Development of this model relied on observations garnered from an a priori analysis of the Reynolds-Averaged Navier-Stokes (RANS) data obtained for the Hypersonic International Flight Research and Experimentation (HIFiRE) dual-mode scramjet combustor. The RANS data were obtained using a reduced chemical mechanism for the combustion of a JP-7 surrogate and were validated using available experimental data. These RANS data were then post-processed to obtain, in an a priori fashion, the scalar fields corresponding to an FPV-based modeling approach. In the current work, in addition to the proposed compressible flamelet model, a standard incompressible FPV model was also considered. Several candidate progress variables were investigated for their ability to recover static temperature and major and minor product species. The effects of pressure and temperature on the tabulated progress variable source term were characterized, and model coupling terms embedded in the Reynolds-averaged Navier-Stokes equations were studied. Finally, results for the novel compressible flamelet/progress variable model were presented to demonstrate the improvement attained by modeling the effects of pressure and flamelet boundary conditions on the combustion.
Use of a priori spectral information in the measurement of x-ray flux with filtered diode arrays
NASA Astrophysics Data System (ADS)
Marrs, R. E.; Widmann, K.; Brown, G. V.; Heeter, R. F.; MacLaren, S. A.; May, M. J.; Moore, A. S.; Schneider, M. B.
2015-10-01
Filtered x-ray diode (XRD) arrays are often used to measure x-ray spectra vs. time from spectrally continuous x-ray sources such as hohlraums. A priori models of the incident x-ray spectrum enable a more accurate unfolding of the x-ray flux as compared to the standard technique of modifying a thermal Planckian with spectral peaks or dips at the response energy of each filtered XRD channel. A model x-ray spectrum consisting of a thermal Planckian, a Gaussian at higher energy, and (in some cases) a high energy background provides an excellent fit to XRD-array measurements of x-ray emission from laser heated hohlraums. If high-resolution measurements of part of the x-ray emission spectrum are available, that information can be included in the a priori model. In cases where the x-ray emission spectrum is not Planckian, candidate x-ray spectra can be allowed or excluded by fitting them to measured XRD voltages. Examples are presented from the filtered XRD arrays, named Dante, at the National Ignition Facility and the Laboratory for Laser Energetics.
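The unfolding idea described above can be sketched as a model-fitting problem: assume a parametric incident spectrum (Planckian plus a Gaussian bump), fold it through the channel response functions, and fit the parameters to the measured channel signals. Everything below is synthetic: the response functions, the channel set, and the "measured" data are illustrative stand-ins, and only the radiation temperature is fitted (by crude grid search) with the other parameters held fixed.

```python
import numpy as np

E = np.linspace(0.05, 5.0, 500)                      # photon energy (keV)

def model_spectrum(E, T, a_g, mu, sig):
    planck = E**3 / np.expm1(E / T)                  # thermal Planckian shape
    gauss = a_g * np.exp(-0.5 * ((E - mu) / sig)**2) # high-energy Gaussian
    return planck + gauss

# Hypothetical filtered-channel responses: broad passbands around centers.
centers = np.array([0.2, 0.5, 1.0, 2.0, 3.5])
R = np.exp(-0.5 * ((E[None, :] - centers[:, None])
                   / (0.3 * centers[:, None]))**2)

true = (0.25, 0.4, 3.0, 0.3)                         # T, a_g, mu, sig
signals = R @ model_spectrum(E, *true)               # synthetic channel data

# Grid search over the radiation temperature T, other parameters fixed:
Ts = np.linspace(0.1, 0.5, 81)
resid = [np.sum((R @ model_spectrum(E, T, 0.4, 3.0, 0.3) - signals)**2)
         for T in Ts]
T_fit = Ts[int(np.argmin(resid))]
print(f"recovered T = {T_fit:.3f} keV")
```

A real unfold would fit all spectral parameters simultaneously against absolutely calibrated diode voltages, but the structure, a priori spectral form folded through known responses, is the same.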
A priori comparison of RANS scalar flux models using DNS data of a Mach 5 boundary layer
NASA Astrophysics Data System (ADS)
Braman, Kalen; Raman, Venkatramanan
2009-11-01
In order to investigate the applicability of Reynolds-averaged scalar flux models (SFM) to scalar dispersion in high speed turbulent flows, a priori comparisons have been performed utilizing the results of direct numerical simulations (DNS) of a Mach 5 boundary layer. At a small patch on the solid surface boundary, a scalar was introduced into the flow at a rate depending upon the local surface temperature. This configuration mimics surface ablation in hypersonic flows. In different simulations, the scalar injection rate was varied, and the scalar was treated as both passive, not affecting the flow field, and active, affecting the flow field due to having different molecular properties than the bulk flow and having an injection velocity. Statistics of the simulated scalar fields have been calculated and compared a priori with terms from SFMs. Comparisons from the passive scalar case show that the scalar flux terms in the standard gradient diffusion model fail to predict even the trend of the DNS values. The generalized gradient diffusion models, while an improvement for the streamwise component of scalar flux, nevertheless fail to predict the wall normal and spanwise fluxes. Additionally, production and dissipation models for the scalar variance equation are evaluated.
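An a priori test of this kind reduces to comparing a modeled term against the same term extracted from DNS, typically via a correlation coefficient. A one-dimensional sketch with synthetic stand-in data: the standard gradient-diffusion closure models the turbulent scalar flux as proportional to the mean scalar gradient, and we score it against an "exact" flux. The profile, eddy diffusivity, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 256)
C = np.tanh(5 * (x - 0.5))                       # mean scalar profile

# "Exact" flux: gradient-like plus fluctuations, standing in for DNS data.
exact_flux = -0.1 * np.gradient(C, x) + 0.02 * rng.standard_normal(x.size)

D_t = 0.1                                        # hypothetical eddy diffusivity
model_flux = -D_t * np.gradient(C, x)            # gradient-diffusion model

r = np.corrcoef(exact_flux, model_flux)[0, 1]
print(f"a priori correlation: {r:.2f}")
```

In the study above this comparison is done component-by-component on 3-D fields, which is how the streamwise versus wall-normal/spanwise differences in model performance are revealed.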
Wickham, J.D.; Stehman, S.V.; Smith, J.H.; Wade, T.G.; Yang, L.
2004-01-01
Two-stage cluster sampling reduces the cost of collecting accuracy assessment reference data by constraining sample elements to fall within a limited number of geographic domains (clusters). However, because classification error is typically positively spatially correlated, within-cluster correlation may reduce the precision of the accuracy estimates. The detailed population information to quantify a priori the effect of within-cluster correlation on precision is typically unavailable. Consequently, a convenient, practical approach to evaluate the likely performance of a two-stage cluster sample is needed. We describe such an a priori evaluation protocol focusing on the spatial distribution of the sample by land-cover class across different cluster sizes and costs of different sampling options, including options not imposing clustering. This protocol also assesses the two-stage design's adequacy for estimating the precision of accuracy estimates for rare land-cover classes. We illustrate the approach using two large-area, regional accuracy assessments from the National Land-Cover Data (NLCD), and describe how the a priori evaluation was used as a decision-making tool when implementing the NLCD design.
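The protocol's central question can be simulated directly: for a given cluster size and within-cluster sample allocation, how many reference pixels does each land-cover class, especially a rare one, actually receive? A toy sketch on a synthetic labeled map; the map, class proportions, cluster shape (square blocks), and sample sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
land = rng.choice(["forest", "urban", "wetland"], size=(300, 300),
                  p=[0.80, 0.18, 0.02])            # wetland is the rare class

def two_stage_counts(land, cluster=30, n_clusters=10, per_cluster=50):
    """Stage 1: draw square cluster blocks; stage 2: draw pixels
    within each block. Returns per-class sample counts."""
    H, W = land.shape
    counts = {}
    for _ in range(n_clusters):
        r = rng.integers(0, H - cluster)
        c = rng.integers(0, W - cluster)
        block = land[r:r + cluster, c:c + cluster].ravel()
        for lab in rng.choice(block, size=per_cluster):
            counts[lab] = counts.get(lab, 0) + 1
    return counts

counts = two_stage_counts(land)
print(counts)   # a rare class may receive few or no samples
```

Repeating such draws across candidate cluster sizes (and an unclustered option) and comparing per-class counts against cost is essentially the a priori evaluation the paper describes.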
ERIC Educational Resources Information Center
Brodie, Carolyn S.
1994-01-01
Includes ideas and activities for school library media specialists relating to vacationing and traveling, including the use of maps, travel brochures, travel diaries, postcards, videos, slides, and guest speakers. An annotated bibliography of 75 pertinent sources of information, including picture books, intermediate level, nonfiction,…
Safi, Maria; Lilien, Ryan H
2012-06-25
Active site mutations that disrupt drug binding are an important mechanism of drug resistance. Computational methods capable of predicting resistance a priori are poised to become extremely useful tools in the fields of drug discovery and treatment design. In this paper, we describe an approach to predicting drug resistance on the basis of Dead-End Elimination and MM-PBSA that requires no prior knowledge of resistance. Our method utilizes a two-pass search to identify mutations that impair drug binding while maintaining affinity for the native substrate. We use our method to probe resistance in four drug-target systems: isoniazid-enoyl-ACP reductase (tuberculosis), ritonavir-HIV protease (HIV), methotrexate-dihydrofolate reductase (breast cancer and leukemia), and gleevec-ABL kinase (leukemia). We validate our model using clinically known resistance mutations for all four test systems. In all cases, the model correctly predicts the majority of known resistance mutations. PMID:22651699
Travelers' Health: Travel and Breastfeeding
Providers should explain clearly to breastfeeding mothers the value of continuing breastfeeding during travel. For the first 6 months of life, exclusive breastfeeding (feeding only breast milk) is recommended, and this is especially important during travel.
Johnston, Raymond V; Hudson, Martin F
2014-02-01
The suggestion that venous thromboembolism (VTE) is associated with air travel has for several decades been the subject of both "media hype" and extensive debate in the medical literature. As emotion and anecdote are often features of this debate, it is necessary to separate evidence from anecdote. "Travelers' thrombosis" is a more appropriate term because the evidence suggests that any form of travel involving immobility lasting more than 4 h can predispose to thrombosis. There is no unique factor in the air travel cabin environment that has been shown to have any effect on the coagulation cascade. Prevention of thrombosis in any form of travel, including air travel, requires being aware of the issue and making an adequate risk assessment together with appropriate prophylactic measures. PMID:24597166
Giddings, Stanley L; Stevens, A Michal; Leung, Daniel T
2016-03-01
Traveler's diarrhea (TD) is the most common travel-related illness, and it can have a significant impact on the traveler. Pretravel consultation provides an excellent opportunity for the clinician to counsel the traveler and discuss strategies such as food and water hygiene, vaccinations, and medications for prophylaxis or self-treatment that may decrease the incidence and impact of TD. Postinfectious sequelae, such as postinfectious irritable bowel syndrome, reactive arthritis, and Guillain-Barre syndrome, may develop weeks or months after return. PMID:26900116
NASA Astrophysics Data System (ADS)
Carper, Matthew A.; Porté-Agel, Fernando
2008-04-01
The ability of subfilter-scale (SFS) models to reproduce the statistical properties of SFS stresses and energy transfers over heterogeneous surface roughness is key to improving the accuracy of large-eddy simulations of the atmospheric boundary layer. In this study, several SFS models are evaluated a priori using experimental data acquired downwind of a rough-to-smooth transition in a wind tunnel. The SFS models studied include the eddy-viscosity, similarity, non-linear and a mixed model consisting of a combination of the eddy-viscosity and non-linear models. The dynamic eddy-viscosity model is also evaluated. The experimental data consist of vertical and horizontal planes of high-spatial-resolution velocity fields measured using particle image velocimetry. These velocity fields are spatially filtered and used to calculate SFS stresses and SFS transfer rates of resolved kinetic energy. Coefficients for each SFS model are calculated by matching the measured and modelled SFS energy transfer rates. For the eddy-viscosity model, the Smagorinsky coefficient is also evaluated using a dynamic procedure. The model coefficients are found to be scale dependent when the filter scales are larger than the vertical measurement height and fall into the production subrange of the turbulence where the flow scales are anisotropic. Near the surface, the Smagorinsky coefficient is also found to decrease with distance downwind from the transition, in response to the increase in mean shear as the flow adjusts to the smooth surface. In a priori tests, the ability of each model to reproduce statistical properties of the SFS stress is assessed. While the eddy-viscosity model has low spatial correlation with the measured stress, it predicts mean stresses with the same accuracy as the other models. However, the deficiency of the eddy-viscosity model is apparent in the underestimation of the standard deviation of the SFS stresses and the inability to predict transfers of kinetic energy from
NASA Technical Reports Server (NTRS)
Winckelmans, G. S.; Lund, T. S.; Carati, D.; Wray, A. A.
1996-01-01
Subgrid-scale models for Large Eddy Simulation (LES) in both the velocity-pressure and the vorticity-velocity formulations were evaluated and compared in a priori tests using spectral Direct Numerical Simulation (DNS) databases of isotropic turbulence: 128^3 DNS of forced turbulence (Re_λ = 95.8) filtered, using the sharp cutoff filter, to both 32^3 and 16^3 synthetic LES fields; 512^3 DNS of decaying turbulence (Re_λ = 63.5) filtered to both 64^3 and 32^3 LES fields. Gaussian and top-hat filters were also used with the 128^3 database. Different LES models were evaluated for each formulation: eddy-viscosity models, hyper eddy-viscosity models, mixed models, and scale-similarity models. Correlations between exact versus modeled subgrid-scale quantities were measured at three levels: tensor (traceless), vector (solenoidal 'force'), and scalar (dissipation) levels, and for both cases of uniform and variable coefficient(s). Different choices for the 1/T scaling appearing in the eddy-viscosity were also evaluated. It was found that the models for the vorticity-velocity formulation produce higher correlations with the filtered DNS data than their counterpart in the velocity-pressure formulation. It was also found that the hyper eddy-viscosity model performs better than the eddy-viscosity model, in both formulations.
Bielecki, J.; Scholz, M.; Drozdowicz, K.; Giacomelli, L.; Kiptily, V.; Kempenaars, M.; Conroy, S.; Craciunescu, T.; Collaboration: EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB
2015-09-15
A method of tomographic reconstruction of the neutron emissivity in the poloidal cross section of the Joint European Torus (JET, Culham, UK) tokamak was developed. Due to the very limited data set (two projection angles, 19 lines of sight only) provided by the neutron emission profile monitor (KN3 neutron camera), the reconstruction is an ill-posed inverse problem. This work aims to contribute to the development of reliable plasma tomography reconstruction methods that could be routinely used at the JET tokamak. The proposed method is based on Phillips-Tikhonov regularization and incorporates a priori knowledge of the shape of the normalized neutron emissivity profile. For the purpose of the optimal selection of the regularization parameters, the shape of the normalized neutron emissivity profile is approximated by the shape of the normalized electron density profile measured by the LIDAR or high resolution Thomson scattering JET diagnostics. In contrast with some previously developed methods for the ill-posed plasma tomography reconstruction problem, the developed algorithms do not include any post-processing of the obtained solution, and the physical constraints on the solution are imposed during the regularization process. The accuracy of the method is first evaluated by several tests with synthetic data based on various plasma neutron emissivity models (phantoms). Then, the method is applied to the neutron emissivity reconstruction for JET D plasma discharge #85100. It is demonstrated that this method shows good performance and reliability and that it can be routinely used for plasma neutron emissivity reconstruction on JET.
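Phillips-Tikhonov regularization reduces to one line of linear algebra: minimize ||Ax - b||² + λ||Lx||², whose minimizer solves the normal equations (AᵀA + λLᵀL)x = Aᵀb, with L a smoothing (second-difference) operator. A toy 1-D sketch; the geometry matrix A, the peaked "emissivity" profile, the noise level, and λ are synthetic stand-ins for the 19-line-of-sight KN3 setup.

```python
import numpy as np

n = 40
x_true = np.exp(-0.5 * ((np.arange(n) - 20) / 5.0)**2)  # peaked profile
A = np.tril(np.ones((n, n))) / n                         # toy projection matrix
b = A @ x_true + 1e-3 * np.random.default_rng(2).standard_normal(n)

L = np.diff(np.eye(n), 2, axis=0)     # 2nd-difference (smoothness) operator
lam = 1e-4                            # regularization parameter
x_rec = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)

print(f"max reconstruction error: {np.abs(x_rec - x_true).max():.3f}")
```

The paper's contribution is not this generic machinery but how λ is chosen: by matching the reconstructed shape to the a priori electron-density profile rather than by post-processing the solution.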
Liu, Jie; Ying, Dongwen; Zhou, Ping
2014-12-01
Voluntary surface electromyogram (EMG) signals from neurological injury patients are often corrupted by involuntary background interference or spikes, imposing difficulties for myoelectric control. We present a novel framework to suppress involuntary background spikes during voluntary surface EMG recordings. The framework applies a Wiener filter to restore voluntary surface EMG signals based on tracking a priori signal to noise ratio (SNR) by using the decision-directed method. Semi-synthetic surface EMG signals contaminated by different levels of involuntary background spikes were constructed from a database of surface EMG recordings in a group of spinal cord injury subjects. After the processing, the onset detection of voluntary muscle activity was significantly improved against involuntary background spikes. The magnitude of voluntary surface EMG signals can also be reliably estimated for myoelectric control purpose. Compared with the previous sample entropy analysis for suppressing involuntary background spikes, the proposed framework is characterized by quick and simple implementation, making it more suitable for application in a myoelectric control system toward neurological injury rehabilitation. PMID:25443536
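The decision-directed rule at the heart of this framework (due to Ephraim and Malah) tracks the a priori SNR ξ frame by frame as a weighted blend of the previous clean estimate and the current instantaneous SNR, and then applies the Wiener gain G = ξ/(1+ξ). A single-channel toy sketch on synthetic magnitude samples; the smoothing factor, noise variance, and signal shape are illustrative, not the paper's parameters.

```python
import numpy as np

def wiener_dd(Y, noise_var, a=0.98):
    """Y: noisy magnitude samples; returns restored magnitudes using
    the decision-directed a priori SNR estimate."""
    S = np.empty_like(Y)
    prev = 0.0                               # previous clean-signal estimate
    for t, y in enumerate(Y):
        gamma = y**2 / noise_var             # a posteriori SNR
        xi = a * prev**2 / noise_var + (1 - a) * max(gamma - 1.0, 0.0)
        G = xi / (1.0 + xi)                  # Wiener gain
        S[t] = G * y
        prev = S[t]
    return S

rng = np.random.default_rng(3)
clean = np.concatenate([np.zeros(50), np.full(100, 2.0), np.zeros(50)])
noisy = np.abs(clean + 0.3 * rng.standard_normal(clean.size))
restored = wiener_dd(noisy, noise_var=0.09)
print(f"peak residual in silence: {restored[:50].max():.3f}")
```

Because ξ leans on the previous clean estimate, isolated background spikes in silent stretches get small gains and are suppressed, while sustained voluntary activity quickly drives the gain toward one, which is the behavior the onset-detection results above rely on.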
Liu, Aiping; Li, Junning; Wang, Z. Jane; McKeown, Martin J.
2012-01-01
Graphical models appear well suited for inferring brain connectivity from fMRI data, as they can distinguish between direct and indirect brain connectivity. Nevertheless, biological interpretation requires not only that the multivariate time series are adequately modeled, but also that there is accurate error-control of the inferred edges. The PCfdr algorithm, developed by Li and Wang, provides a computationally efficient means of asymptotically controlling the false discovery rate (FDR) of computed edges. The original PCfdr algorithm was unable to accommodate a priori information about connectivity and was designed to infer connectivity from a single subject rather than a group of subjects. Here we extend the original PCfdr algorithm and propose a multisubject, error-rate-controlled brain connectivity modeling approach that allows incorporation of prior knowledge of connectivity. In simulations, we show that the two proposed extensions can still control the FDR around or below a specified threshold. When the proposed approach is applied to fMRI data in a Parkinson's disease study, we find robust group evidence of the disease-related changes, the compensatory changes, and the normalizing effect of L-dopa medication. The proposed method provides a robust, accurate, and practical method for the assessment of brain connectivity patterns from functional neuroimaging data. PMID:23251232
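The error control underlying FDR-based edge selection can be illustrated with the classic Benjamini-Hochberg step-up rule (a standard FDR procedure, shown here for intuition; PCfdr itself embeds FDR control inside the PC structure-learning algorithm rather than applying BH to a fixed p-value list). The p-values below are illustrative.

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of rejected hypotheses, controlling FDR <= q:
    find the largest rank k with p_(k) <= q*k/m, reject the k smallest."""
    order = sorted(range(len(pvals)), key=lambda i: pvals[i])
    m = len(pvals)
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= q * rank / m:
            k = rank
    return sorted(order[:k])

# Hypothetical edge p-values from conditional-independence tests:
pvals = [0.001, 0.008, 0.039, 0.041, 0.27, 0.60]
print(benjamini_hochberg(pvals, q=0.05))  # [0, 1]
```

The appeal over per-edge thresholding is that the expected fraction of false edges among those reported stays below q, which is what makes the inferred connectivity graph biologically interpretable.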
Zhang Jin; Yi Byongyong; Lasio, Giovanni; Suntharalingam, Mohan; Yu, Cedric
2009-10-15
Kilovoltage x-ray projection images (kV images for brevity) are increasingly available in image guided radiotherapy (IGRT) for patient positioning. These images are two-dimensional (2D) projections of a three-dimensional (3D) object along the x-ray beam direction. Projecting a 3D object onto a plane may lead to ambiguities in the identification of anatomical structures and to poor contrast in kV images. Therefore, the use of kV images in IGRT is mainly limited to bony landmark alignments. This work proposes a novel subtraction technique that isolates a slice of interest (SOI) from a kV image with the assistance of a priori information from a previous CT scan. The method separates structural information within a preselected SOI by suppressing contributions to the unprocessed projection from out-of-SOI-plane structures. Up to a five-fold increase in the contrast-to-noise ratios (CNRs) was observed in selected regions of the isolated SOI, when compared to the original unprocessed kV image. The tomographic image via background subtraction (TIBS) technique aims to provide a quick snapshot of the slice of interest with greatly enhanced image contrast over conventional kV x-ray projections for fast and accurate image guidance of radiation therapy. With further refinements, TIBS could, in principle, provide real-time tumor localization using gantry-mounted x-ray imaging systems without the need for implanted markers.
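The figure of merit quoted above is the contrast-to-noise ratio, CNR = |mean(ROI) − mean(background)| / std(background). A synthetic before/after illustration of why subtracting an a priori background raises CNR: removing out-of-SOI-plane "clutter" shrinks the background variance while the in-plane contrast survives. The clutter and noise levels are made-up numbers chosen to mimic a several-fold improvement.

```python
import numpy as np

def cnr(roi, background):
    """Contrast-to-noise ratio of a region of interest vs. background."""
    return abs(roi.mean() - background.mean()) / background.std()

rng = np.random.default_rng(4)
clutter = rng.normal(0.0, 5.0, 2000)   # out-of-SOI-plane structures
noise = rng.normal(0.0, 1.0, 2000)     # detector noise
bg = 100 + clutter + noise             # background region, raw kV image
roi = bg + 10                          # ROI: +10 contrast from the SOI

print(f"raw CNR:  {cnr(roi, bg):.1f}")
print(f"TIBS CNR: {cnr(roi - clutter, bg - clutter):.1f}")
```

In TIBS the subtracted clutter is not known exactly but is predicted from the prior CT scan projected along the same geometry, so the achievable gain depends on how well that a priori model matches the day's anatomy.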
Ericsson, Charles D
2003-02-01
Risk of travellers' diarrhoea is about 7% in developed countries and 20-50% in the developing world. Options for prevention include education and chemoprophylaxis. Vaccination is a promising but incomplete option. Achieving behaviour modification of food and water choices among tourists is difficult. Bismuth subsalicylate (BSS)-containing compounds are about 62% effective in the prevention of travellers' diarrhoea. Antibiotics are about 84% effective in preventing travellers' diarrhoea. Routine prophylaxis of travellers' diarrhoea, especially with antibiotics, should be discouraged. Oral rehydration is generally important in the treatment of diarrhoea, but travellers' diarrhoea is only infrequently dehydrating in adults. The addition of oral rehydration solutions confers no additional benefit to loperamide in the treatment of travellers' diarrhoea in adults. Presently, the most active of the antibiotics routinely available for treatment are members of the fluoroquinolone group. Antibiotics that are not absorbed, such as aztreonam and a rifampicin-like agent, rifaximin, are both effective. The latter might become a therapy of choice once it is routinely available, due to predictably fewer adverse reactions with a non-absorbed antibiotic. Preliminary results with azithromycin look very promising. Less severe disease can be treated with a variety of non-antibiotic agents (e.g. BSS-containing compounds, loperamide and a calmodulin inhibitor, zaldaride). The combination of an antibiotic and loperamide has been superior to treatment with either agent alone in several studies and is arguably the treatment of choice for distressing travellers' diarrhoea. PMID:12615374
Beck, Bernhard R
2013-06-01
Extreme travelling experiences appear to be a quite popular kick offered by tourist operators and sought by some travellers. But some travellers also expose themselves to increased risk during normal holidays, either voluntarily, by booking hikes or tours that lead them to adventurous locations, or by unexpectedly encountering dangerous situations. In planned adventures, precise information in advance, good physical condition, careful planning, and profound medical preparation may contribute to a less hazardous adventure. Advising medical personnel may need an expert consultation on specific topics in order to optimise the preparation. Based on three specific environmental situations (jungle, desert, and cave), the specific conditions, dangers, and some medical aspects are outlined. PMID:23732454
NASA Technical Reports Server (NTRS)
Mauldin, L. E.
1994-01-01
Business travel planning within an organization is often a time-consuming task. Travel Forecaster is a menu-driven, easy-to-use program which plans, forecasts cost, and tracks actual vs. planned cost for business-related travel of a division or branch of an organization and compiles this information into a database to aid the travel planner. The program's ability to handle multiple trip entries makes it a valuable time-saving device. Travel Forecaster takes full advantage of relational data base properties so that information that remains constant, such as per diem rates and airline fares (which are unique for each city), needs entering only once. A typical entry would include selection with the mouse of the traveler's name and destination city from pop-up lists, and typed entries for number of travel days and purpose of the trip. Multiple persons can be selected from the pop-up lists and multiple trips are accommodated by entering the number of days by each appropriate month on the entry form. An estimated travel cost is not required of the user as it is calculated by a Fourth Dimension formula. With this information, the program can produce output of trips by month with subtotal and total cost for either organization or sub-entity of an organization; or produce outputs of trips by month with subtotal and total cost for international-only travel. It will also provide monthly and cumulative formats of planned vs. actual outputs in data or graph form. Travel Forecaster users can do custom queries to search and sort information in the database, and it can create custom reports with the user-friendly report generator. Travel Forecaster 1.1 is a database program for use with Fourth Dimension Runtime 2.1.1. It requires a Macintosh Plus running System 6.0.3 or later, 2Mb of RAM and a hard disk. The standard distribution medium for this package is one 3.5 inch 800K Macintosh format diskette. Travel Forecaster was developed in 1991. Macintosh is a registered trademark of
Tornow, Matthew A; Skelton, Randall R
2012-01-01
When molecules and morphology produce incongruent hypotheses of primate interrelationships, the data are typically viewed as incompatible, and molecular hypotheses are often considered to be better indicators of phylogenetic history. However, it has been demonstrated that the choice of which taxa to include in cladistic analysis as well as assumptions about character weighting, character state transformation order, and outgroup choice all influence hypotheses of relationships and may positively influence tree topology, so that relationships between extant taxa are consistent with those found using molecular data. Thus, the source of incongruence between morphological and molecular trees may lie not in the morphological data themselves but in assumptions surrounding the ways characters evolve and their impact on cladistic analysis. In this study, we investigate the role that assumptions about character polarity and transformation order play in creating incongruence between primate phylogenies based on morphological data and those supported by multiple lines of molecular data. By releasing constraints imposed on published morphological analyses of primates from disparate clades and subjecting those data to parsimony analysis, we test the hypothesis that incongruence between morphology and molecules results from inherent flaws in morphological data. To quantify the difference between incongruent trees, we introduce a new method called branch slide distance (BSD). BSD mitigates many of the limitations attributed to other tree comparison methods, thus allowing for a more accurate measure of topological similarity. We find that releasing a priori constraints on character behavior often produces trees that are consistent with molecular trees. Case studies are presented that illustrate how congruence between molecules and unconstrained morphological data may provide insight into issues of polarity, transformation order, homology, and homoplasy. PMID:22065165
NASA Astrophysics Data System (ADS)
Grete, Philipp; Vlaykov, Dimitar G.; Schmidt, Wolfram; Schleicher, Dominik R. G.
2016-06-01
Even though compressible plasma turbulence is encountered in many astrophysical phenomena, its effect is often not well understood. Furthermore, direct numerical simulations are typically not able to reach the extreme parameters of these processes. For this reason, large-eddy simulations (LES), which only simulate large and intermediate scales directly, are employed. The smallest, unresolved scales and the interactions between small and large scales are introduced by means of a subgrid-scale (SGS) model. We propose and verify a new set of nonlinear SGS closures for future application as an SGS model in LES of compressible magnetohydrodynamics. We use 15 simulations (without explicit SGS model) of forced, isotropic, homogeneous turbulence with varying sonic Mach number Ms = 0.2–20 as reference data for the most extensive a priori tests performed so far in the literature. In these tests, we explicitly filter the reference data and compare the performance of the new closures against the most widely tested closures. These include eddy-viscosity and scale-similarity type closures with different normalizations. Performance indicators are correlations with the turbulent energy and cross-helicity flux, the average SGS dissipation, the topological structure and the ability to reproduce the correct magnitude and the direction of the SGS vectors. We find that only the new nonlinear closures exhibit consistently high correlations (median value > 0.8) with the data over the entire parameter space and outperform the other closures in all tests. Moreover, we show that these results are independent of resolution and chosen filter scale. Additionally, the new closures are effectively coefficient-free with a deviation of less than 20%.
ERIC Educational Resources Information Center
Raiche, Gilles; Blais, Jean-Guy
In a computerized adaptive test (CAT), it would be desirable to obtain an acceptable precision of the proficiency level estimate using an optimal number of items. Decreasing the number of items is accompanied, however, by a certain degree of bias when the true proficiency level differs significantly from the a priori estimate. G. Raiche (2000) has…
Travelers' Health: Cruise Ship Travel
Passengers should practice good respiratory hygiene and cough etiquette, and should report respiratory illness to the ship's medical staff.
NASA Astrophysics Data System (ADS)
Font, Yvonne; Segovia, Monica; Vaca, Sandro; Theunissen, Thomas
2013-04-01
To improve earthquake location, we create a 3-D a priori P-wave velocity model (3-DVM) that approximates the large velocity variations of the Ecuadorian subduction system. The 3-DVM is constructed from the integration of geophysical and geological data that depend on the structural geometry and velocity properties of the crust and the upper mantle. In addition, specific station selection is carried out to compensate for the high station density on the Andean Chain. 3-D synthetic experiments are then designed to evaluate the network capacity to recover the event position using only P arrivals and the MAXI technique. Three synthetic earthquake location experiments are proposed: (1) noise-free and (2) noisy arrivals used in the 3-DVM, and (3) noise-free arrivals used in a 1-DVM. Synthetic results indicate that, under the best conditions (exact arrival data set and 3-DVM), the spatiotemporal configuration of the Ecuadorian network can accurately locate 70 per cent of events in the frontal part of the subduction zone (average azimuthal gap is 289° ± 44°). With noisy P arrivals (up to ± 0.3 s), 50 per cent of earthquakes can be accurately located. Processing earthquake location within a 1-DVM almost never yields accurate hypocentre positions for offshore earthquakes (15 per cent), which highlights the role of using a 3-DVM in subduction zones. For the application to real data, the seismicity distribution from the 3-D-MAXI catalogue is also compared to the determinations obtained in a 1-D-layered VM. In addition to good-quality location uncertainties, the clustering and the depth distribution confirm the 3-D-MAXI catalogue reliability. The pattern of the seismicity distribution (a 13 yr record during the inter-seismic period of the seismic cycle) is compared to the pattern of rupture zone and asperity of the Mw = 7.9 1942 and the Mw = 7.7 1958 events (the Mw = 8.8 1906 asperity patch is not defined). We observe that the nucleation of 1942, 1958 and 1906 events coincides with
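A stripped-down version of the location problem helps fix ideas: find the source position whose predicted P arrivals best fit the observed picks. Here a 2-D grid search in a uniform-velocity medium stands in for the 3-DVM/MAXI machinery; the station layout, velocity, and "true" event are all synthetic, and origin time is eliminated by comparing demeaned arrival times.

```python
import numpy as np

v = 6.0                                             # km/s, uniform P speed
stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 120]],
                    dtype=float)
true_src = np.array([37.0, 62.0])
t_obs = np.linalg.norm(stations - true_src, axis=1) / v   # noise-free picks

best, best_err = None, np.inf
for x in np.arange(0, 101.0):                       # 1 km grid search
    for y in np.arange(0, 131.0):
        t = np.linalg.norm(stations - np.array([x, y]), axis=1) / v
        # Origin time is unknown: compare demeaned arrival times.
        err = np.sum(((t - t.mean()) - (t_obs - t_obs.mean()))**2)
        if err < best_err:
            best, best_err = (x, y), err
print(f"located at {best}")
```

The paper's point is that the "predicted arrival" step must use a realistic 3-D velocity model: with strong lateral velocity contrasts, a 1-D model's predicted times are systematically wrong offshore, so even a perfect search lands in the wrong place.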
ERIC Educational Resources Information Center
Rowland Unified School District, Rowland Heights, CA.
Teacher-developed materials for a basic computer literacy and utilization program for elementary students in grades 3-6 are included in this 4-part packet, which was originally prepared for use with or without the Apple IIe "traveling" microcomputers shared by 15 Rowland Unified School District elementary schools. Implementation procedures are…
Bertrand, E; Lévy, R; Boyeldieu, D
2013-12-01
Following an ABO accident after transfusion of red blood cells, an a priori risk analysis study is being performed in a hospital. The scope of this analysis extends from the reception of the blood product in the medical unit to its administration. The risk analysis makes it possible to identify potentially dangerous situations and to evaluate the risks in order to propose corrective measures (precautionary or protective) and bring the system back to an acceptable risk level. The innovative concept of an a priori risk analysis in the medical field allows this transfusion risk analysis to be extended to other hospitals, and the approach to be applied to other medical fields. PMID:24176607
NASA Astrophysics Data System (ADS)
Maasakkers, J. D.; Jacob, D. J.; Payer Sulprizio, M.; Turner, A. J.; Weitz, M.; Wirth, T. C.; Hight, C.; DeFigueiredo, M.; Desai, M.; Schmeltz, R.; Hockstad, L.; Bloom, A. A.; Bowman, K. W.
2015-12-01
The US EPA produces annual estimates of national anthropogenic methane emissions in the Inventory of US Greenhouse Gas Emissions and Sinks (EPA inventory). These are reported to the UN and inform national climate policy. The EPA inventory uses the best available information on emitting processes (IPCC Tier 2/3 approaches). However, inversions of atmospheric observations suggest that the inventory could be too low. These inversions rely on crude bottom-up estimates as a priori because the EPA inventory is only available as national totals for most sources. Reliance on an incorrect a priori greatly limits the value of inversions for testing and improving the EPA inventory, as the allocation of methane emissions by source types and regions can vary greatly between different bottom-up inventories. Here we present a 0.1° × 0.1° monthly version of the EPA inventory to serve as a priori for inversions of atmospheric data and to interpret inversion results. We use a wide range of process-specific information to allocate emissions, incorporating facility-level data reported through the EPA Greenhouse Gas Reporting Program where possible. As an illustration of the gridding strategies used, gridded livestock emissions are based on EPA emission data per state, USDA livestock inventories per county, and USDA weighted land cover maps for sub-county localization. Allocation of emissions from natural gas systems incorporates monthly well-level production data, EIA compressor station and processing plant databases, and information on pipelines. Our gridded EPA inventory shows large differences in spatial emission patterns compared to the EDGAR v4.2 global inventory used as a priori in previous inverse studies. Our work greatly enhances the potential of future inversions to test and improve the EPA inventory and, more broadly, to improve understanding of the factors controlling methane concentrations and their trends. Preliminary inversion results using GOSAT satellite data will be presented.
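The proportional-allocation step behind the gridding strategy above can be sketched in a few lines: a state emission total is split across counties in proportion to an activity proxy such as livestock head counts. The function name and all numbers below are invented for illustration; the actual inventory uses far richer data (land cover weighting, facility-level reports).

```python
# Hypothetical sketch of proportional downscaling: split a state total
# across counties by each county's share of an activity proxy.
def allocate_to_counties(state_total, county_activity):
    """Return {county: emission} with emissions proportional to activity."""
    total_activity = sum(county_activity.values())
    return {county: state_total * activity / total_activity
            for county, activity in county_activity.items()}

# Invented example: 100 Gg CH4 for a state with three counties.
counties = {"A": 50_000, "B": 30_000, "C": 20_000}  # head of cattle (invented)
alloc = allocate_to_counties(100.0, counties)
# County shares follow the activity shares: 50, 30, and 20 Gg.
```

The same pattern repeats hierarchically (national to state, state to county, county to grid cell), with a different proxy at each level.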
Terry, Anne C; Haulman, N Jean
2016-03-01
The traveler's medical kit is an essential tool for both the novice and expert traveler. It is designed to treat travel-related illness and injury and to ensure preexisting medical conditions are managed appropriately. Travelers are at increased risk for common gastrointestinal issues during travel. Respiratory illnesses make up approximately 8% of the ailments present in returned international travelers. Approximately 12% of travelers experience a travel-related skin condition. First aid treatment for minor injuries is essential to all travel medical kits. The complexity ranges from a small, simple case for the urban traveler to a larger, extensive case for wilderness travel. PMID:26900112
NASA Technical Reports Server (NTRS)
2006-01-01
10 April 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a dust devil traveling across a plain west-southwest of Schiaparelli Crater, in far eastern Sinus Meridiani. The dust devil is casting a shadow toward the northeast, just south (below) of an egg-shaped crater.
Location near: 6.4°S, 349.3°W Image width: 3 km (1.9 mi) Illumination from: lower left Season: Southern Summer
Travel Schooling: Helping Children Learn through Travel.
ERIC Educational Resources Information Center
Byrnes, Deborah A.
2001-01-01
Provides information for teachers to help parents create rewarding and educational travel experiences for children. Examines the benefits of travel schooling, fundamental elements of a meaningful travel schooling experience, fostering cross cultural sensitivity through travel, and returning to the traditional classroom. (SD)
Travel counseling for the elderly traveler.
Schindler, Kasey J
2005-01-01
As the baby boomer generation retires, many will have the time and money to travel abroad to see the world's exotic wonders or visit family and friends. When travelers are elderly, they are particularly vulnerable to the effects of travel. Healthcare professionals are responsible for counseling elders on travel health based on their medical history, destination, method of transportation, and exposure risks. Important areas of travel counseling include preparing for travel, air travel, safety, sun and heat, insect precautions, food and water precautions, and vaccinations. PMID:16271122
Ihme, Matthias; Pitsch, Heinz
2008-10-15
Previously conducted studies of the flamelet/progress variable model for the prediction of nonpremixed turbulent combustion processes identified two areas for model improvements: the modeling of the presumed probability density function (PDF) for the reaction progress parameter and the consideration of unsteady effects [Ihme et al., Proc. Combust. Inst. 30 (2005) 793]. These effects are of particular importance during local flame extinction and subsequent reignition. Here, the models for the presumed PDFs for conserved and reactive scalars are re-examined and a statistically most likely distribution (SMLD) is employed and tested in a priori studies using direct numerical simulation (DNS) data and experimental results from the Sandia flame series. In the first part of the paper, the SMLD model is employed for a reactive scalar distribution. Modeling aspects of the a priori PDF, accounting for the bias in composition space, are discussed. The convergence of the SMLD with increasing number of enforced moments is demonstrated. It is concluded that information about more than two moments is beneficial to accurately represent the reactive scalar distribution in turbulent flames with strong extinction and reignition. In addition to the reactive scalar analysis, the potential of the SMLD for the representation of conserved scalar distributions is also analyzed. In the a priori study using DNS data it is found that the conventionally employed beta distribution provides a better representation for the scalar distribution. This is attributed to the fact that the beta-PDF implicitly enforces higher moment information that is in excellent agreement with the DNS data. However, the SMLD outperforms the beta distribution in free shear flow applications, which are typically characterized by strongly skewed scalar distributions, in the case where higher moment information can be enforced. (author)
Shah, Dhaval K; King, Lindsay E; Han, Xiaogang; Wentland, Jo-Ann; Zhang, Yanhua; Lucas, Judy; Haddish-Berhane, Nahor; Betts, Alison; Leal, Mauricio
2014-05-01
The objectives of this investigation were as follows: (a) to validate a mechanism-based pharmacokinetic (PK) model of ADC for its ability to a priori predict tumor concentrations of ADC and released payload, using the anti-5T4 ADC A1mcMMAF, and (b) to analyze the PK model to identify the main pathways and parameters to which the model outputs are most sensitive. Experimental data containing biomeasures, and plasma and tumor concentrations of ADC and payload, following A1mcMMAF administration in two different xenografts, were used to build and validate the model. The model performed reasonably well in terms of a priori predicting tumor exposure of total antibody, ADC, and released payload, and the exposure of released payload in plasma. Model predictions were within twofold of the observed exposures. Pathway analysis and local sensitivity analysis were conducted to investigate the main pathways and the set of parameters to which the model outputs are most sensitive. It was discovered that payload dissociation from ADC and tumor size were important determinants of plasma and tumor payload exposure. It was also found that the sensitivity of the model output to certain parameters is dose-dependent, suggesting caution before generalizing the results from the sensitivity analysis. Model analysis also revealed the importance of understanding and quantifying the processes responsible for ADC and payload disposition within tumor cells, as tumor concentrations were sensitive to these parameters. The proposed ADC PK model provides a useful tool for a priori predicting tumor payload concentrations of novel ADCs preclinically, and possibly for translating them to the clinic. PMID:24578215
NASA Astrophysics Data System (ADS)
Kozlovskaya, Elena
2000-06-01
This paper presents an inversion algorithm that can be used to solve a wide range of geophysical nonlinear inverse problems. The algorithm is based on the principle of a direct search for the optimal solution in the parameter space. The main difference between this algorithm and existing techniques such as genetic algorithms and simulated annealing is that the optimum search is performed under the control of a priori information formulated as a fuzzy set in the parameter space. In this formulation the inverse problem becomes a multiobjective optimization problem with two objective functions: one is the membership function of the fuzzy set of feasible solutions, and the other is the conditional probability density function of the observed data. The solution to such a problem is a set of Pareto optimal solutions that is constructed in the parameter space by a three-stage search procedure. The advantage of the proposed technique is that it makes it possible to incorporate a wide range of non-probabilistic a priori information into the inversion procedure, and it can be applied to the solution of strongly nonlinear problems. It also decreases the number of forward-problem calculations through selective sampling of trial points from the parameter space. The properties of the algorithm are illustrated with an application to a local earthquake hypocentre location problem with synthetic and real data.
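The two-objective setup described above can be made concrete with a minimal sketch: each trial model carries a fuzzy membership value (a priori feasibility) and a data likelihood, both to be maximized, and a model is Pareto optimal if no other model is at least as good in both objectives and strictly better in one. This is a generic Pareto filter, not the paper's three-stage search procedure, and the trial values are invented.

```python
# Minimal Pareto filter over trial models (name, membership mu, likelihood L);
# both objectives are maximized. Not the paper's algorithm, just the concept.
def pareto_set(models):
    """Return the names of the Pareto-optimal models."""
    optimal = []
    for name, mu, lik in models:
        dominated = any(
            (mu2 >= mu and l2 >= lik) and (mu2 > mu or l2 > lik)
            for _, mu2, l2 in models
        )
        if not dominated:
            optimal.append(name)
    return optimal

# Invented trial models: (name, membership, likelihood)
trials = [("m1", 0.9, 0.2), ("m2", 0.5, 0.8), ("m3", 0.4, 0.7), ("m4", 0.7, 0.6)]
print(pareto_set(trials))  # → ['m1', 'm2', 'm4']; m3 is dominated by m2
```

The returned set is the trade-off front between honoring the a priori fuzzy constraint and fitting the data, which is exactly the solution object the abstract describes.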
Leggat, P A; Carne, J; Kedjarune, U
1999-12-01
Travel insurance normally underwrites travel, medical, and dental expenses incurred by travelers abroad and arranges aeromedical evacuation of travelers under conditions specified by the travel insurance policy. Because of the costs of medical and dental treatment abroad and the high cost associated with aeromedical evacuation, all travelers should be advised of the need for comprehensive travel insurance and be advised to read their policies carefully to see what is covered and to check for any exclusions. In particular, those travelers who have known preexisting conditions, who are working overseas, or who are going to undertake any form of hazardous recreational pursuit may need to obtain a special travel insurance policy, which may attract a higher premium. Conservatively, it is estimated that between 30% and 50% of travelers become ill or injured while traveling. Relative estimated monthly incidence rates of various health problems have been compiled elsewhere. The risk of severe injury is thought to be greater when people travel abroad. These risks should be covered by travel insurance to protect the traveler; however, it is not known what proportion of travel agents or airlines routinely give advice on travel insurance. Travel insurance is the most important safety net for travelers in the event of misadventure, and its importance should be reinforced by travel health advisers. Although only 4% of general practitioners (GPs) in a late-1980s study in the United Kingdom would advise a traveler going to Turkey about travel insurance, more recent studies have shown that about 60% of GPs in New Zealand and 39% of travel clinics worldwide usually advise travelers concerning travel insurance. In addition, 54% of GPs in New Zealand also usually advise travelers about finding medical assistance abroad, but only 19% of GPs recommend travel insurance companies as a source of medical assistance while traveling. PMID:10575173
Travelers' Health: Water Disinfection for Travelers
... be superior to tap water. Moreover, the plastic bottles create an ecological problem, since most developing countries do not recycle plastic bottles. All international travelers, especially long-term travelers or ...
NASA Astrophysics Data System (ADS)
Navari, M.; Margulis, S. A.; Bateni, S. M.; Tedesco, M.; Alexander, P.; Fettweis, X.
2016-01-01
The Greenland ice sheet (GrIS) has been the focus of climate studies due to its considerable impact on sea level rise. Accurate estimates of surface mass fluxes would contribute to understanding the cause of its recent changes and would help to better estimate the past, current and future contribution of the GrIS to sea level rise. Though the estimates of the GrIS surface mass fluxes have improved significantly over the last decade, there is still considerable disparity between the results from different methodologies (e.g., Rae et al., 2012; Vernon et al., 2013). The data assimilation approach can merge information from different methodologies in a consistent way to improve the GrIS surface mass fluxes. In this study, an ensemble batch smoother data assimilation approach was developed to assess the feasibility of generating a reanalysis estimate of the GrIS surface mass fluxes via integrating remotely sensed ice surface temperature measurements with a regional climate model (a priori) estimate. The performance of the proposed methodology for generating an improved posterior estimate was investigated within an observing system simulation experiment (OSSE) framework using synthetically generated ice surface temperature measurements. The results showed that assimilation of the ice surface temperature time series was able to overcome uncertainties in the near-surface meteorological forcing variables that drive the GrIS surface processes. Our findings show that the proposed methodology is able to generate posterior reanalysis estimates of the surface mass fluxes that are in good agreement with the synthetic true estimates. The results also showed that the proposed data assimilation framework improves the root-mean-square error of the posterior estimates of runoff, sublimation/evaporation, surface condensation, and surface mass loss fluxes by 61%, 64%, 76%, and 62%, respectively, over the nominal a priori climate model estimates.
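The update at the heart of an ensemble smoother can be sketched with a single synthetic observation: a prior ensemble is shifted by a Kalman gain built from ensemble covariances. This is a generic perturbed-observation ensemble Kalman update on an invented two-variable state, not the authors' GrIS system, and all numbers are fabricated.

```python
import numpy as np

# Toy perturbed-observation ensemble Kalman update (invented numbers):
# prior ensemble X (n_state x n_ens) is corrected toward an observation y
# of the first state component using ensemble-estimated covariances.
rng = np.random.default_rng(0)
n_ens = 100
truth = np.array([2.0, -1.0])                          # invented "true" state
X = truth[:, None] + rng.normal(0.0, 1.0, (2, n_ens))  # prior ensemble
H = np.array([[1.0, 0.0]])                             # observe first component
R = np.array([[0.1]])                                  # obs-error variance
y = H @ truth + rng.normal(0.0, 0.1 ** 0.5, (1,))      # synthetic observation

A = X - X.mean(axis=1, keepdims=True)                  # ensemble anomalies
HA = H @ A
P_xy = A @ HA.T / (n_ens - 1)                          # state-obs covariance
P_yy = HA @ HA.T / (n_ens - 1) + R                     # innovation covariance
K = P_xy @ np.linalg.inv(P_yy)                         # Kalman gain

Y = y[:, None] + rng.normal(0.0, 0.1 ** 0.5, (1, n_ens))  # perturbed obs
Xa = X + K @ (Y - H @ X)                               # posterior ensemble
# the posterior spread of the observed component shrinks relative to the prior
```

A batch smoother applies the same algebra with the state vector stacked over a whole time window, so one set of observations updates fluxes throughout the window.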
Childhood and Travel Literature.
ERIC Educational Resources Information Center
Espey, David
If children are not present in most travel literature--precisely because the genre has most typically been the domain of solitary male travelers who are escaping domestic obligation, routine, the familiar, and the family--they nevertheless are an integral part of the genre. The traveler is in many ways a child, an innocent abroad. Travel writers…
Ziegler, Carol C
2013-06-01
Travel abroad for business and pleasure should be safe and meaningful for the traveler. To assure that safe experience, certain processes should be considered before travel. A thorough pretravel health assessment will offer patients and health care providers valuable information for anticipatory guidance before travel. The destination-based risk assessment will help determine the risks involved in travel to specific locations and guide in the development of contingency plans for all travelers, especially those with chronic conditions. Diseases are more prevalent overseas, and immunizations and vaccinations are all important considerations for persons traveling abroad. PMID:23692948
Hayes, G.P.; Wald, D.J.; Keranen, K.
2009-01-01
Ongoing developments in earthquake source inversions incorporate nonplanar fault geometries as inputs to the inversion process, improving previous approaches that relied solely on planar fault surfaces. This evolution motivates advancing the existing framework for constraining fault geometry, particularly in subduction zones where plate boundary surfaces that host highly hazardous earthquakes are clearly nonplanar. Here, we improve upon the existing framework for the constraint of the seismic rupture plane of subduction interfaces by incorporating active seismic and seafloor sediment thickness data with existing independent data sets and inverting for the most probable nonplanar subduction geometry. Constraining the rupture interface a priori with independent geological and seismological information reduces the uncertainty in the derived earthquake source inversion parameters over models that rely on simpler assumptions, such as the moment tensor inferred fault plane. Examples are shown for a number of well-constrained global locations. We expand the coverage of previous analyses to a more uniform global data set and show that even in areas of sparse data this approach is able to accurately constrain the approximate subduction geometry, particularly when aided with the addition of data from local active seismic surveys. In addition, we show an example of the integration of many two-dimensional profiles into a three-dimensional surface for the Sunda subduction zone and introduce the development of a new global three-dimensional subduction interface model: Slab1.0. © 2009 by the American Geophysical Union.
Soldati, Nicola; Calhoun, Vince D.; Bruzzone, Lorenzo; Jovicich, Jorge
2013-01-01
Independent component analysis (ICA) techniques offer a data-driven possibility to analyze brain functional MRI data in real-time. Typical ICA methods used in functional magnetic resonance imaging (fMRI), however, have until now mostly been developed and optimized for the off-line case in which all data are available. Real-time experiments are ill-posed for ICA in that several constraints are added: limited data, limited analysis time, and dynamic changes in the data and computational speed. Previous studies have shown that particular choices of ICA parameters can be used to monitor real-time fMRI (rt-fMRI) brain activation, but it is unknown how other choices would perform. In this rt-fMRI simulation study we investigate and compare the performance of 14 different publicly available ICA algorithms, systematically sampling different growing window lengths (WLs), model orders (MOs), and a priori conditions (none, spatial, or temporal). Performance is evaluated by computing the spatial and temporal correlation to a target component as well as computation time. Four algorithms are identified as best performing (constrained ICA, fastICA, amuse, and evd), with their corresponding parameter choices. Both spatial and temporal priors are found to provide equal or improved performance in similarity to the target compared with their off-line counterparts, with greatly reduced computation costs. This study suggests parameter choices that can be further investigated in a sliding-window approach for a rt-fMRI experiment. PMID:23378835
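The growing-window evaluation described above can be reconstructed in miniature: run a source-separation step on progressively longer windows and score each run by its correlation to a known target component. The tiny deflationary FastICA below (tanh contrast, numpy only) stands in for the 14 algorithms compared in the study; the two-source mixture is invented.

```python
import numpy as np

def whiten(X):
    """Zero-mean, unit-covariance projection of X (signals x time)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    return E @ np.diag(d ** -0.5) @ E.T @ X

def fastica(X, n_comp, n_iter=300, seed=0):
    """Deflationary FastICA with a tanh contrast (illustrative, not optimized)."""
    Xw = whiten(X)
    rng = np.random.default_rng(seed)
    W = []
    for _ in range(n_comp):
        w = rng.normal(size=X.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            wx = w @ Xw
            w = (Xw * np.tanh(wx)).mean(axis=1) - (1 - np.tanh(wx) ** 2).mean() * w
            for v in W:                      # deflate against found components
                w -= (w @ v) * v
            w /= np.linalg.norm(w)
        W.append(w)
    return np.vstack(W) @ Xw

# Invented two-source mixture; score recovery at growing window lengths by
# correlation with the known target component, as in the evaluation above.
T = 2000
t = np.arange(T)
target = np.sign(np.sin(2 * np.pi * t / 50.0))         # "activation" source
noise_src = np.random.default_rng(1).laplace(size=T)   # nuisance source
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ np.vstack([target, noise_src])

scores = {}
for wl in (500, 1000, 2000):
    S = fastica(X[:, :wl], n_comp=2)
    scores[wl] = max(abs(np.corrcoef(S[i], target[:wl])[0, 1]) for i in range(2))
# the best-matching component correlates strongly with the target
```

The dictionary of window-length scores is exactly the kind of performance curve the study sweeps, here for one algorithm instead of fourteen.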
NASA Astrophysics Data System (ADS)
Alam, Aftab; Johnson, D. D.
2009-09-01
Site-centered, electronic-structure methods use an expansion inside nonoverlapping “muffin-tin” (MT) spheres plus an interstitial basis set. As the boundary separating the more spherical from nonspherical density between atoms, the “saddle-point” radii (SPR) in the density provide an optimal spherical region for expanding in spherical harmonics, as used in augmented plane wave, muffin-tin orbital, and multiple-scattering [Korringa, Kohn, and Rostoker (KKR)] methods. These MT-SPR guarantee unique, convex Voronoi polyhedra at each site, in distinction to Bader topological cells. We present a numerically fast, two-center expansion to find SPR a priori from overlapping atomic charge densities, valid also for disordered alloys. We adopt this MT-SPR basis for KKR in the atomic sphere approximation and study (dis)ordered alloys with large differences in atomic size (fcc CoPt and bcc CrW). For this simple and unique improvement, we find formation energies and structural parameters in strikingly better agreement with more exact methods or experiment, and resolve issues with former results.
Medlyn, Belinda E; De Kauwe, Martin G; Zaehle, Sönke; Walker, Anthony P; Duursma, Remko A; Luus, Kristina; Mishurov, Mikhail; Pak, Bernard; Smith, Benjamin; Wang, Ying-Ping; Yang, Xiaojuan; Crous, Kristine Y; Drake, John E; Gimeno, Teresa E; Macdonald, Catriona A; Norby, Richard J; Power, Sally A; Tjoelker, Mark G; Ellsworth, David S
2016-08-01
The response of terrestrial ecosystems to rising atmospheric CO2 concentration (Ca), particularly under nutrient-limited conditions, is a major uncertainty in Earth System models. The Eucalyptus Free-Air CO2 Enrichment (EucFACE) experiment, recently established in a nutrient- and water-limited woodland, presents a unique opportunity to address this uncertainty, but can best do so if key model uncertainties have been identified in advance. We applied seven vegetation models, which have previously been comprehensively assessed against earlier forest FACE experiments, to simulate a priori possible outcomes from EucFACE. Our goals were to provide quantitative projections against which to evaluate data as they are collected, and to identify key measurements that should be made in the experiment to allow discrimination among alternative model assumptions in a postexperiment model intercomparison. Simulated responses of annual net primary productivity (NPP) to elevated Ca ranged from 0.5 to 25% across models. The simulated reduction of NPP during a low-rainfall year also varied widely, from 24 to 70%. Key processes where assumptions caused disagreement among models included nutrient limitations to growth; feedbacks to nutrient uptake; autotrophic respiration; and the impact of low soil moisture availability on plant processes. Knowledge of the causes of variation among models is now guiding data collection in the experiment, with the expectation that the experimental data can optimally inform future model improvements. PMID:26946185
NASA Astrophysics Data System (ADS)
Hayes, Gavin P.; Wald, David J.; Keranen, Katie
2009-09-01
Ongoing developments in earthquake source inversions incorporate nonplanar fault geometries as inputs to the inversion process, improving previous approaches that relied solely on planar fault surfaces. This evolution motivates advancing the existing framework for constraining fault geometry, particularly in subduction zones where plate boundary surfaces that host highly hazardous earthquakes are clearly nonplanar. Here, we improve upon the existing framework for the constraint of the seismic rupture plane of subduction interfaces by incorporating active seismic and seafloor sediment thickness data with existing independent data sets and inverting for the most probable nonplanar subduction geometry. Constraining the rupture interface a priori with independent geological and seismological information reduces the uncertainty in the derived earthquake source inversion parameters over models that rely on simpler assumptions, such as the moment tensor inferred fault plane. Examples are shown for a number of well-constrained global locations. We expand the coverage of previous analyses to a more uniform global data set and show that even in areas of sparse data this approach is able to accurately constrain the approximate subduction geometry, particularly when aided with the addition of data from local active seismic surveys. In addition, we show an example of the integration of many two-dimensional profiles into a three-dimensional surface for the Sunda subduction zone and introduce the development of a new global three-dimensional subduction interface model: Slab1.0.
Travelers' Health: Leishmaniasis, Visceral
… Chapter 3, Infectious Diseases Related to Travel … Infected travelers should be advised to consult an infectious disease or tropical medicine specialist. Therapy for VL should …
… Citizens and Residents Living in Areas with Ongoing Zika Virus Transmission; Guidelines for Travelers Visiting Friends and Family … with Zika. For the most current information about Zika virus, please visit CDC's Zika website. …
Travelers' Health: HIV Infection
... 448-4911 ( www.nccc.ucsf.edu ). HIV TESTING REQUIREMENTS FOR US TRAVELERS ENTERING FOREIGN COUNTRIES International travelers ... extended stay should review that country’s policies and requirements. This information is usually available from the consular ...
Traveling Safely with Medicines
… Planes, trains, cars – even boats … your trip, ask your pharmacist about how to travel safely with your medicines. Make sure that you …
ERIC Educational Resources Information Center
British Columbia Dept. of Education, Victoria.
Written for college entry-level travel agent training courses, this course outline can also be used for inservice training programs offered by travel agencies. The outline provides information on the work of a travel agent and gives clear statements on what learners must be able to do by the end of their training. Material is divided into eight…
ERIC Educational Resources Information Center
Roman, Harry T.
2007-01-01
Airplane travelers are dismayed by the long lines and seemingly chaotic activities that precede boarding a full airplane. Surely, the one who can solve this problem is going to make many travelers happy. This article describes the Jet Travel Challenge, an activity that challenges students to create some alternatives to this now frustrating…
Air Travel Health Tips. How can I improve plane travel? Most people don't have any problems when … and dosages of all of your medicines. The air in airplanes is dry, so drink nonalcoholic, decaffeinated …
NASA Astrophysics Data System (ADS)
Van Grootel, V.; Charpinet, S.; Fontaine, G.; Brassard, P.; Green, E. M.; Chayer, P.; Randall, S. K.
2008-09-01
Context: Balloon 090100001, the brightest of the known pulsating hot B subdwarfs, simultaneously exhibits both short- and long-period pulsation modes, and shows relatively large amplitudes for its dominant modes. For these reasons, it has been studied extensively over the past few years, including a successful experiment carried out at the Canada-France-Hawaii Telescope to pin down or constrain the value of the degree index ℓ of several pulsation modes through multicolor photometry. Aims: The primary goal of this paper is to take advantage of such partial mode identification to test the robustness of our standard approach to the asteroseismology of pulsating subdwarf B stars. The latter is based on the forward approach, whereby a model that best matches the observed periods is searched for in parameter space with no a priori assumption about mode identification. When successful, this method leads to the determination of the global structural parameters of the pulsator. As a bonus, it also leads, after the fact, to complete mode identification. For the first time, with the availability of partial mode identification for Balloon 090100001, we are able to evaluate the sensitivity of the inferred seismic model to possible uncertainty in mode identification. Methods: We carry out a number of exercises based on the double optimization technique that we developed within the framework of the forward modeling approach in asteroseismology. We use the set of ten periods corresponding to the independent pulsation modes for which values of ℓ have been either formally identified or constrained through multicolor photometry in Balloon 090100001. These exercises differ in that they assume different a priori mode identifications. Results: Our primary result is that the asteroseismic solution stands very robust, whether or not external constraints on the values of the degree ℓ are used. Although this may come as a small surprise, the test proves to be conclusive, and small
Borghesi, Giulio; Bellan, Josette
2015-03-15
A Direct Numerical Simulation (DNS) database was created representing mixing of species under high-pressure conditions. The configuration considered is that of a temporally evolving mixing layer. The database was examined and analyzed for the purpose of modeling some of the unclosed terms that appear in the Large Eddy Simulation (LES) equations. Several metrics are used to understand the LES modeling requirements. First, a statistical analysis of the DNS-database large-scale flow structures was performed to provide a metric for probing the accuracy of the proposed LES models, as the flow fields obtained from accurate LESs should contain structures of morphology statistically similar to those observed in the filtered-and-coarsened DNS (FC-DNS) fields. To characterize the morphology of the large-scale structures, the Minkowski functionals of the iso-surfaces were evaluated for two different fields: the second invariant of the rate of deformation tensor and the irreversible entropy production rate. To remove the presence of the small flow scales, both of these fields were computed using the FC-DNS solutions. It was found that the large-scale structures of the irreversible entropy production rate exhibit higher morphological complexity than those of the second invariant of the rate of deformation tensor, indicating that the burden of modeling will be on recovering the thermodynamic fields. Second, to evaluate the physical effects which must be modeled at the subfilter scale, an a priori analysis was conducted. This a priori analysis, conducted in the coarse-grid LES regime, revealed that standard closures for the filtered pressure, the filtered heat flux, and the filtered species mass fluxes, in which a filtered function of a variable is equal to the function of the filtered variable, may no longer be valid for the high-pressure flows considered in this study. The terms requiring modeling are the filtered pressure, the filtered heat flux, the filtered pressure work
Petyuk, Vladislav A.; Jaitly, Navdeep; Moore, Ronald J.; Ding, Jie; Metz, Thomas O.; Tang, Keqi; Monroe, Matthew E.; Tolmachev, Aleksey V.; Adkins, Joshua N.; Belov, Mikhail E.; Dabney, Alan R.; Qian, Weijun; Camp, David G.; Smith, Richard D.
2008-02-01
The high mass measurement accuracy and precision available with recently developed mass spectrometers is increasingly used in proteomics analyses to confidently identify tryptic peptides from complex mixtures of proteins, as well as post-translational modifications and peptides from non-annotated proteins. To take full advantage of high mass measurement accuracy instruments it is necessary to limit systematic mass measurement errors. It is well known that errors in the measurement of m/z can be affected by experimental parameters including, e.g., outdated calibration coefficients, ion intensity, and temperature changes during the measurement. Traditionally, these variations have been corrected through the use of internal calibrants (well-characterized standards introduced with the sample being analyzed). In this paper we describe an alternative approach in which the calibration is provided through the use of a priori knowledge of the sample being analyzed. Such an approach has previously been demonstrated based on the dependence of systematic error on m/z alone. To incorporate additional explanatory variables, we employed multidimensional, nonparametric regression models, which were evaluated using several commercially available instruments. The applied approach is shown to remove any noticeable biases from the overall mass measurement errors, and decreases the overall standard deviation of the mass measurement error distribution by 1.2- to 2-fold, depending on instrument type. Subsequent reduction of the random errors based on multiple measurements over consecutive spectra further improves accuracy and results in an overall decrease of the standard deviation by 1.8- to 3.7-fold. This new procedure will decrease the false discovery rates for peptide identifications using high accuracy mass measurements.
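A one-dimensional cut of the idea above can be sketched with binned medians: model the systematic mass error as a nonparametric function of m/z (the simplest explanatory variable) and subtract it from each measurement. This stands in for the paper's multidimensional regression; the function names and drift numbers are invented.

```python
import statistics

# Hypothetical sketch: estimate systematic mass error (ppm) as a binned-median
# function of m/z, then subtract it. A stand-in for the multidimensional
# nonparametric regression described above.
def fit_binned_correction(mz_values, errors_ppm, n_bins=10):
    """Return (lo, bin_width, per-bin median error in ppm)."""
    lo, hi = min(mz_values), max(mz_values)
    width = (hi - lo) / n_bins or 1.0
    bins = [[] for _ in range(n_bins)]
    for mz, err in zip(mz_values, errors_ppm):
        i = min(int((mz - lo) / width), n_bins - 1)
        bins[i].append(err)
    medians = [statistics.median(b) if b else 0.0 for b in bins]
    return lo, width, medians

def correct(mz, err, lo, width, medians):
    """Subtract the bin's systematic error estimate from a measurement."""
    i = min(max(int((mz - lo) / width), 0), len(medians) - 1)
    return err - medians[i]

# Invented data: systematic error drifts linearly with m/z (no noise).
mzs = [400 + 10 * k for k in range(100)]
errs = [2.0 + 0.004 * (mz - 400) for mz in mzs]   # ppm, 2.0 to ~6.0
lo, width, med = fit_binned_correction(mzs, errs)
resid = [correct(mz, e, lo, width, med) for mz, e in zip(mzs, errs)]
# residual systematic error is far smaller than the original 2-6 ppm drift
```

Extending the same scheme to bins over (m/z, intensity, scan time) gives a crude multidimensional version; the paper's regression models are the smooth analogue of this lookup table.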
Comprehensive care of travelers.
Pust, R E; Peate, W F; Cordes, D H
1986-12-01
Travel, especially if it is international, often means major changes for the family. Family physicians should assess the epidemiologic risk and psychosocial significance of travel or relocation in light of the family's life-cycle stage and antecedent health. Using core references, which are kept current in partnership with public health agencies, family physicians are able to provide comprehensive immunization, medications, and patient education for all travel risks. Families are given medical record summaries and recommended sources of care at their destination. Eight weeks after their return patients are reassessed for newly acquired illness and helped to integrate the perspectives gained during the travel into the family's future dynamics. Taking advantage of growing travel medicine opportunities, family medicine educators should base the care of travelers and teaching of residents on defined competence priorities. Travelers' health provides a mutually rewarding model of shared care with public health consultants in the community medicine curriculum. PMID:3537200
Student Travel: Policies - Regulations - Exhibits.
ERIC Educational Resources Information Center
Trujillo, Lorenzo A.; And Others
The Jefferson County (Colorado) Public Schools' regulations and policies concerning student travel cover these forms of travel: student activity travel, extended student travel, district sponsored student travel, district authorized student travel, student exchange, and bonus learning trips. Issues and items addressed include: (1) authorization…
NASA Astrophysics Data System (ADS)
Afonso, J. C.; Fullea, J.; Griffin, W. L.; Yang, Y.; Jones, A. G.; D. Connolly, J. A.; O'Reilly, S. Y.
2013-05-01
Traditional inversion techniques applied to the problem of characterizing the thermal and compositional structure of the upper mantle are not well suited to deal with the nonlinearity of the problem, the trade-off between temperature and compositional effects on wave velocities, the nonuniqueness of the compositional space, and the dissimilar sensitivities of physical parameters to temperature and composition. Probabilistic inversions, on the other hand, offer a powerful formalism to cope with all these difficulties, while allowing for an adequate treatment of the intrinsic uncertainties associated with both data and physical theories. This paper presents a detailed analysis of the two most important elements controlling the outputs of probabilistic (Bayesian) inversions for temperature and composition of the Earth's mantle, namely the a priori information on model parameters, ρ(m), and the likelihood function, L(m). The former is mainly controlled by our current understanding of lithosphere and mantle composition, while the latter conveys information on the observed data, their uncertainties, and the physical theories used to relate model parameters to observed data. The benefits of combining specific geophysical datasets (Rayleigh and Love dispersion curves, body wave tomography, magnetotelluric, geothermal, petrological, gravity, elevation, and geoid), and their effects on L(m), are demonstrated by analyzing their individual and combined sensitivities to composition and temperature as well as their observational uncertainties. The dependence of bulk density, electrical conductivity, and seismic velocities on major-element composition is systematically explored using Monte Carlo simulations. We show that the dominant source of uncertainty in the identification of compositional anomalies within the lithosphere is the intrinsic nonuniqueness in compositional space. A general strategy for defining ρ(m) is proposed based on statistical analyses of a large database
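The interplay of ρ(m) and L(m) can be shown with a toy one-parameter Bayesian inversion. The linear forward model, the Gaussian forms, and every number below are assumptions made for illustration only; they are not the paper's parameterization:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy version of posterior ∝ rho(m) * L(m): infer a mantle temperature T
# from one noisy shear-velocity datum via a made-up linear forward model.
def forward(T):                          # Vs (km/s) decreasing with temperature
    return 4.8 - 4e-4 * (T - 1000.0)

vs_obs, sigma_d = 4.55, 0.02             # observed Vs and its uncertainty
mu_T, sigma_T = 1300.0, 150.0            # a priori rho(m): Gaussian on T

def log_post(T):
    log_prior = -0.5 * ((T - mu_T) / sigma_T) ** 2       # rho(m)
    log_like = -0.5 * ((forward(T) - vs_obs) / sigma_d) ** 2  # L(m)
    return log_prior + log_like

# Metropolis random-walk sampling of the posterior
T, samples = 1300.0, []
for _ in range(20000):
    T_new = T + rng.normal(0.0, 30.0)
    if np.log(rng.uniform()) < log_post(T_new) - log_post(T):
        T = T_new
    samples.append(T)
post = np.array(samples[5000:])          # discard burn-in
print(round(post.mean()), round(post.std()))
```

With these Gaussian assumptions the posterior is itself Gaussian, so the sampler can be checked against the closed-form precision-weighted combination of prior and data; the sketch only illustrates how the prior pulls the data-only estimate toward a priori knowledge.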
NASA Astrophysics Data System (ADS)
Béthoux, Nicole; Theunissen, Thomas; Beslier, Marie-Odile; Font, Yvonne; Thouvenot, François; Dessa, Jean-Xavier; Simon, Soazig; Courrioux, Gabriel; Guillen, Antonio
2016-02-01
The region between the inner zones of the Alps and Corsica juxtaposes an overthickened crust to an oceanic domain, which makes it difficult to ascertain the focal depth of seismic events using routine location codes and average 1D velocity models. The aim of this article is to show that, even with a rather loose monitoring network, accurate routine locations can be achieved by using realistic 3D modelling and advanced location techniques. Previous earthquake tomography studies cover the whole region with spatial resolutions of several tens of kilometres on land, but they fail to resolve the marine domain due to the absence of station coverage and sparse seismicity. To overcome these limitations, we first construct a 3D a-priori P and S velocity model integrating known geophysical and geological information. Significant progress has been achieved in the 3D numerical modelling of complex geological structures by the development of dedicated software (e.g. 3D GeoModeller), capable both of elaborating a 3D structural model from geological and geophysical constraints and, possibly, of refining it by inversion processes (Calcagno et al., 2008). Then, we build an arrival-time catalogue of 1500 events recorded from 2000 to 2011. Hypocentres are then located in this model using a numerical code based on the maximum intersection method (Font et al., 2004), updated by Theunissen et al. (2012), as well as another 3D location technique, the NonLinLoc software (Lomax and Curtis, 2001). The reduction of arrival-time residuals and uncertainties (dh, dz) with respect to classical 1D locations demonstrates the improved accuracy allowed by our approach and confirms the coherence of the 3D geological model built and used in this study. Our results are also compared with previous works that benefitted from the installation of dense temporary networks surrounding the studied epicentre area. The resulting 3D location catalogue allows us to improve the regional seismic hazard assessment
Immunizations for foreign travel.
Hill, D. R.
1992-01-01
One of the most important aspects of preparing travelers for destinations throughout the world is providing them with immunizations. Before administering any vaccines, however, a careful health and immunization history and travel itinerary should be obtained in order to determine vaccine indications and contraindications. There are three categories of immunizations for foreign travel. The first category includes immunizations which are routinely recommended whether or not the individual is traveling. Many travelers are due for primary vaccination or boosting against tetanus-diphtheria, measles-mumps-rubella, pneumococcal pneumonia, and influenza, for example, and the pre-travel visit is an ideal time to administer these. The second category comprises immunizations which might be required by a country as a condition for entry; these are yellow fever and cholera. The final category contains immunizations which are recommended because there is a risk of acquiring a particular disease during travel. Typhoid fever, meningococcal disease, rabies, and hepatitis are some examples. Travelers who are pregnant or who are infected with the human immunodeficiency virus require special consideration. Provision of appropriate immunizations for foreign travel is an important aspect of preventing illness in travelers. PMID:1337807
Uniqueness and stability of traveling waves for cellular neural networks with multiple delays
NASA Astrophysics Data System (ADS)
Yu, Zhi-Xian; Mei, Ming
2016-01-01
In this paper, we investigate the properties of traveling waves to a class of lattice differential equations for cellular neural networks with multiple delays. Following the previous study [38] on the existence of the traveling waves, here we focus on the uniqueness and the stability of these traveling waves. First of all, by establishing the a priori asymptotic behavior of traveling waves and applying Ikehara's theorem, we prove the uniqueness (up to translation) of traveling waves ϕ(n - ct) with c ≤ c* for the cellular neural networks with multiple delays, where c* < 0 is the critical wave speed. Then, by the weighted energy method together with the squeezing technique, we further show the global stability of all non-critical traveling waves for this model, that is, for all monotone waves with the speed c
[Vaccination for international travelers].
Arrazola, M Pilar; Serrano, Almudena; López-Vélez, Rogelio
2016-05-01
Traveler's vaccination is one of the key strategies for the prevention of infectious diseases during international travel. The risk of acquiring an infectious disease is determined in each case by the characteristics of the traveler and the travel, so the pre-departure medical advice of the traveler must be individualized. The World Health Organization classifies travelers' vaccines into three groups. - Vaccines for routine use in national immunization programs: Haemophilus influenzae type b, hepatitis B, polio, measles-mumps-rubella, tetanus-diphtheria-whooping cough, and chickenpox. - Vaccinations required by law in certain countries prior to entry: yellow fever, meningococcal disease and poliomyelitis. - Vaccines recommended depending on the circumstances: cholera, Japanese encephalitis, tick-borne encephalitis, meningococcal disease, typhoid fever, influenza, hepatitis A, hepatitis B, rabies and BCG. This review is intended to introduce the reader to the field of international vaccination. PMID:26920587
[Fever in returning travelers].
Burchard, G
2014-03-01
Travel-related illness is most often due to gastrointestinal, febrile, and dermatologic diseases. Fever in a returned traveler demands prompt attention because it may be a manifestation of an infection that could be rapidly progressive and lethal. The approach to the febrile patient should be stepwise and consider travel and exposure history. Malaria is the most common cause of fever in patients returning from Sub-Saharan Africa, whereas dengue is more frequent in travelers from other tropical and subtropical areas. Other serious diseases are typhoid and paratyphoid fever, amebic liver abscess, visceral leishmaniasis, leptospirosis and-rarely-viral hemorrhagic fevers. PMID:24557143
[Physical exposure by travelling].
Lange, U
2011-06-01
Approximately 40 million Germans travel abroad every year. Air travel is the most frequently used means of transportation, followed by the automobile. During airplane flights rheumatic patients are subjected to numerous physical, biological and climatic factors which can cause stress and adverse effects on general health. Therefore, preventive strategies are helpful to protect against health damage, provided that there is general fitness for air travel. The present article focuses on physical and biological stress as well as psychological aspects during air travel and reviews prophylactic measures. PMID:21533614
Pre-Travel Medical Preparation of Business and Occupational Travelers
Khan, Nomana M.; Jentes, Emily S.; Brown, Clive; Han, Pauline; Rao, Sowmya R.; Kozarsky, Phyllis; Hagmann, Stefan H.F.; LaRocque, Regina C.; Ryan, Edward T.
2016-01-01
Objectives: The aim of the study was to understand more about pre-travel preparations and itineraries of business and occupational travelers. Methods: De-identified data from 18 Global TravEpiNet clinics from January 2009 to December 2012 were analyzed. Results: Of 23,534 travelers, 61% were non-occupational and 39% occupational. Business travelers were more likely to be men, had short times to departure and shorter trip durations, and commonly refused influenza, meningococcal, and hepatitis B vaccines. Most business travelers indicated that employers suggested the pre-travel health consultation, whereas non-occupational travelers sought consultations because of travel health concerns. Conclusions: Sub-groups of occupational travelers have characteristic profiles, with business travelers being particularly distinct. Employers play a role in encouraging business travelers to seek pre-travel consultations. Such consultations, even if scheduled immediately before travel, can identify vaccination gaps and increase coverage. PMID:26479857
Illness in Returned Travellers
Lawee, D.; Scappatura, P.; Gutman, E.
1989-01-01
Intercontinental travel is more common now than it has ever been before, and so are travel-related diseases. A thorough history and physical examination provide many clues to possible pathogens, particularly when combined with knowledge of the geographic distribution of specific diseases. Prompt diagnosis and proper treatment are imperative. PMID:21249095
... a cruise, it may not be the best time to go. Travel by sea may cause motion sickness or nausea. ... out of the country. Plan ahead to allow time for any shots or medicines you may need. When you travel, take a copy of your prenatal care record ...
Information for Travellers' Physicians
Allison, David J.; Blinco, Kimberley
1990-01-01
Physicians can obtain advice about international travel for their patients from many different sources of information. The authors review some of the most common sources based on their experience at the International Travellers' Clinic operated by the New Brunswick Department of Health and Community Services in Fredericton. They identify readily available handbooks and periodicals and compare two computer software programs. PMID:21233910
[Vaccinations for the travellers].
Gendrel, Dominique
2004-03-15
Immunisations for the traveller include, before any travel-specific vaccine, a correct immunisation schedule according to national recommendations, with appropriate boosters and hepatitis B immunisation. The yellow fever vaccine is required for entry into countries of the endemic area, and quadrivalent ACYW135 meningococcal vaccine for entry into Saudi Arabia. Hepatitis A immunisation can be performed from 1 year of age and is recommended for travellers to tropical areas; childhood vaccination controls the disease both in the patient and in contacts. Meningococcal A+C vaccines are required for travellers to meningitis-prone areas of tropical Africa during the dry season (December to June), and the quadrivalent ACYW135 vaccine is useful only in Burkina-Faso and Niger. Typhoid and rabies vaccines are required for ambulatory travellers in endemic areas, as is Japanese encephalitis vaccine in South-East Asia. In central Europe, tick-borne encephalitis vaccination is recommended for patients travelling in forest areas during spring and summer. PMID:15176511
Gallus, Alexander S; Goghlan, Douglas C
2002-09-01
Debate continues about whether and to what extent travel predisposes to venous thrombosis and pulmonary embolism (PE). Almost certainly, the strength of any association was greatly exaggerated in recent press reports. Conclusions from case-control studies vary, with some finding no excess of recent travel among patients with venous thromboembolism and others reporting a two- to four-fold excess. The strongest evidence that prolonged air travel predisposes to thrombosis comes from the travel history of people who present with PE immediately after landing. Two independent analyses suggest that the risk of early embolism increases exponentially with travel times beyond 6 hours and may reach 1:200,000 passengers traveling for more than 12 hours. The most likely explanation is venous stasis in the legs from prolonged sitting, and there is evidence (preliminary and controversial) that elastic support stockings may prevent deep vein thrombosis in people who travel long distances. There is an urgent need for more and better studies to define the absolute hazard from travel-related thrombosis and the personal risk factors that may contribute. Without these, it is difficult to give a balanced account to people who intend to travel or to consider definitive prevention trials. Case reports suggest that in most cases, travel-related thrombosis has affected people who were also at risk because of previous thrombosis, recent injury, or other predispositions. This makes it sensible to target such "at risk" people with advice about hazards and precautions, at least until formal study validates some other approach. PMID:12172438
Understanding taxi travel patterns
NASA Astrophysics Data System (ADS)
Cai, Hua; Zhan, Xiaowei; Zhu, Ji; Jia, Xiaoping; Chiu, Anthony S. F.; Xu, Ming
2016-09-01
Taxis play important roles in modern urban transportation systems, especially in mega cities. While providing necessary amenities, taxis also significantly contribute to traffic congestion, urban energy consumption, and air pollution. Understanding the travel patterns of taxis is thus important for addressing many urban sustainability challenges. Previous research has primarily focused on examining the statistical properties of passenger trips, which include only taxi trips occupied with passengers. However, unoccupied trips are also important for urban sustainability issues because they represent potential opportunities to improve the efficiency of the transportation system. Therefore, we need to understand the travel patterns of taxis as an integrated system, instead of focusing only on the occupied trips. In this study we examine GPS trajectory data of 11,880 taxis in Beijing, China for a period of three weeks. Our results show that taxi travel patterns share similar traits with travel patterns of individuals but also exhibit differences. The trip displacement distribution of taxi travel is statistically greater than the exponential distribution and smaller than the truncated power-law distribution. The distribution of short trips (less than 30 miles) can be best fitted with a power law, while long trips follow exponential decay. We use the radius of gyration to characterize each taxi's travel distance and find that it does not follow a truncated power law as observed in previous studies. Spatial and temporal regularities exist in taxi travel. However, with increasing spatial coverage, taxi trips can exhibit dual high-probability density centers.
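The two-regime finding (power-law body for short trips, exponential decay for long ones) can be illustrated with closed-form maximum-likelihood estimators on synthetic trips. The mixture, the 30-mile cutoff's use, and all parameter values below are made up for the sketch; only the power-law/exponential contrast mirrors the abstract:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a trip-length mixture: power-law body plus exponential tail.
short = rng.pareto(1.8, 8000) + 1.0          # power law, xmin = 1 (miles)
long_ = 30.0 + rng.exponential(15.0, 2000)   # exponential tail beyond 30 mi

# ML power-law exponent for x >= xmin: alpha = 1 + n / sum(ln(x / xmin))
x = short[short >= 1.0]
alpha = 1.0 + x.size / np.sum(np.log(x / 1.0))

# ML exponential rate for the tail: lambda = 1 / mean(x - cutoff)
lam = 1.0 / np.mean(long_ - 30.0)

print(round(alpha, 2), round(lam, 3))
```

The Pareto generator with shape 1.8 corresponds to a density proportional to x^-2.8, so the recovered alpha should sit near 2.8 and the recovered rate near 1/15; in a real analysis one would also compare candidate distributions with likelihood-ratio or goodness-of-fit tests rather than assume the regimes.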
Hu, Xiaowen; Cowl, Clayton T; Baqir, Misbah; Ryu, Jay H
2014-04-01
The number of medical emergencies onboard aircraft is increasing as commercial air traffic increases and the general population ages, becomes more mobile, and includes individuals with serious medical conditions. Travelers with respiratory diseases are at particular risk for in-flight events because exposure to lower atmospheric pressure in a pressurized cabin at cruising altitude may result in not only hypoxemia but also pneumothorax due to gas expansion within enclosed pulmonary parenchymal spaces based on Boyle's law. Risks of pneumothorax during air travel pertain particularly to those patients with cystic lung diseases, recent pneumothorax or thoracic surgery, and chronic pneumothorax. Currently available guidelines are admittedly based on sparse data and include recommendations to delay air travel for 1 to 3 weeks after thoracic surgery or resolution of the pneumothorax. One of these guidelines declares existing pneumothorax to be an absolute contraindication to air travel although there are reports of uneventful air travel for those with chronic stable pneumothorax. In this article, we review the available data regarding pneumothorax and air travel that consist mostly of case reports and retrospective surveys. There is clearly a need for additional data that will inform decisions regarding air travel for patients at risk for pneumothorax, including those with recent thoracic surgery and transthoracic needle biopsy. PMID:24687705
[Thromboembolism in travelers].
Bihari, I; Sándor, T
2001-11-11
The association between long-haul travel and the risk of venous thromboembolism has long been suspected. Series of air-travel-related thrombosis have mostly been reported in the literature. Risk factors can be classified as: 1. travel-related factors (coach position, immobilization, prolonged air travel, narrow seat and room, diuretic effect of alcohol, insufficient fluid intake, dehydration, direct pressure on leg veins, rare inspiration); 2. airplane-related risk factors (low humidity, relative hypoxia, stress); 3. patient-related factors (hereditary and acquired thrombophilia, previous deep venous thrombosis, age over 40, recent surgery or trauma, gravidity, puerperium, oestrogen-containing pills, varicosity, chronic heart disease, obesity, fever, diarrhoea, vomiting, smoking). No patient-related factors were found in some cases. To reduce the hazards, air travellers are rightly concerned to know the level of risk, and the airlines should be responsible for providing this information. People should discuss with their physician what prophylactic measures should be taken, such as compression stockings or low-molecular-weight heparin. Not only flight passengers but also car, bus and train travellers are at risk of developing venous thromboembolism. Long-haul travel alone is a separate risk factor for venous thromboembolism. PMID:11778354
ERIC Educational Resources Information Center
Instructor, 1981
1981-01-01
Describes the winners of the Space Traveler Project, a contest jointly sponsored by Rockwell International, NASA, and this magazine to identify worthwhile elementary science programs relating to the Space Shuttle. (SJL)
In an effort to inspire and motivate the next generation of space explorers, NASA's Ames Research Center teamed up with the Traveling Space Museum to teach students the way astronauts are taught…
Travelers' Health: Hepatitis C
Travelers' Health: Hepatitis E
Travelers' Health: Cryptosporidiosis
Travelers' Health: Leishmaniasis, Cutaneous
Travelers' Health: Japanese Encephalitis
Travelers' Health: Tickborne Encephalitis
Travelers' Health: Varicella (Chickenpox)
Travelers' Health: Hepatitis B
Travelers' Health: Coccidioidomycosis
Travelers' Health: Meningococcal Disease
Travelers' Health: Yellow Fever
Hietala, V.M.; Vawter, G.A.
1993-12-14
The traveling-wave photodetector of the present invention combines an absorptive optical waveguide and an electrical transmission line, in which optical absorption in the waveguide results in a photocurrent at the electrodes of the electrical transmission line. The optical waveguide and electrical transmission line of the electrically distributed traveling-wave photodetector are designed to achieve matched velocities between the light in the optical waveguide and electrical signal generated on the transmission line. This velocity synchronization provides the traveling-wave photodetector with a large electrical bandwidth and a high quantum efficiency, because of the effective extended volume for optical absorption. The traveling-wave photodetector also provides large power dissipation, because of its large physical size. 4 figures.
Hietala, Vincent M.; Vawter, Gregory A.
1993-01-01
The traveling-wave photodetector of the present invention combines an absorptive optical waveguide and an electrical transmission line, in which optical absorption in the waveguide results in a photocurrent at the electrodes of the electrical transmission line. The optical waveguide and electrical transmission line of the electrically distributed traveling-wave photodetector are designed to achieve matched velocities between the light in the optical waveguide and electrical signal generated on the transmission line. This velocity synchronization provides the traveling-wave photodetector with a large electrical bandwidth and a high quantum efficiency, because of the effective extended volume for optical absorption. The traveling-wave photodetector also provides large power dissipation, because of its large physical size.
... of pregnancy. If you are planning an international flight, the cutoff point for traveling with international airlines ... up and stretch your legs during a long flight. Avoid gas-producing foods and carbonated drinks before ...
Bomsztyk, Mayan; Arnold, Richard W
2013-07-01
Travel medicine continues to grow as international tourism and patient medical complexity increase. This article reflects the state of the current field, but new recommendations on immunizations, resistance patterns, and treatment modalities constantly change. The US Centers for Disease Control and Prevention and the World Health Organization maintain helpful Web sites for both patient and physician. With thoughtful preparation and prevention, risks can be minimized and travel can continue as safely as possible. PMID:23809721
Choice, changeover, and travel
Baum, William M.
1982-01-01
Since foraging in nature can be viewed as instrumental behavior, choice between sources of food, known as “patches,” can be viewed as choice between instrumental response alternatives. Whereas the travel required to change alternatives deters changeover in nature, the changeover delay (COD) usually deters changeover in the laboratory. In this experiment, pigeons were exposed to laboratory choice situations, concurrent variable-interval schedules, that were standard except for the introduction of a travel requirement for changeover. As the travel requirement increased, rate of changeover decreased and preference for a favored alternative strengthened. When the travel requirement was small, the relations between choice and relative reinforcement revealed the usual tendencies toward matching and undermatching. When the travel requirement was large, strong overmatching occurred. These results, together with those from experiments in which changeover was deterred by punishment or a fixed-ratio requirement, deviate from the matching law, even when a correction is made for cost of changeover. If one accepted an argument that the COD is analogous to travel, the results suggest that the norm in choice relations would be overmatching. This overmatching, however, might only be the sign of an underlying strategy approximating optimization. PMID:16812283
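The matching, undermatching, and overmatching described above are usually expressed through the generalized matching law, log(B1/B2) = s·log(r1/r2) + log b, where s = 1 is matching, s < 1 undermatching, and s > 1 overmatching. A minimal sketch of how s is estimated by log-log regression, with synthetic data and an assumed s of 1.6 (the specific values are invented, not Baum's):

```python
import numpy as np

rng = np.random.default_rng(3)

# Generalized matching law: log(B1/B2) = s * log(r1/r2) + log(b).
# Synthetic choice data with an assumed sensitivity s = 1.6 (overmatching,
# as reported under large travel requirements) plus small measurement noise.
log_r = np.log(np.array([0.25, 0.5, 1.0, 2.0, 4.0]))       # reinforcement ratios
s_true, log_b = 1.6, 0.0
log_B = s_true * log_r + log_b + rng.normal(0.0, 0.05, 5)  # behavior ratios

# Fit slope (sensitivity s) and intercept (bias log b) on log-log axes.
s_hat, log_b_hat = np.polyfit(log_r, log_B, 1)
print(round(s_hat, 2))   # slope above 1 indicates overmatching
```

The same regression recovers undermatching (s < 1) from typical laboratory concurrent-schedule data; the experiment above is notable precisely because large travel requirements pushed the fitted slope well above 1.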
Riddle, Mark S; Connor, Bradley A
2016-09-01
Given the recent interest in the human gut microbiome in health and disease, we have undertaken a review of the role of the gut microbiome as it relates to travel. We consider the microbiome as the traveler's interface with the external world, not only from the perspective of protection from enteric infection by colonization resistance but also for the possibility that a traveler's unique microbiome may place him or her at lesser or greater risk for enteric infection. We review available data on travel, travelers' diarrhea (TD), and the use of antibiotics as they relate to changes in the microbiome and the acquisition of multidrug-resistant bacteria, and explore the interplay of these factors in the development of dysbiosis and the post-infectious sequelae of TD, specifically post-infectious irritable bowel syndrome (PI-IBS). In addition, we explore whether dietary changes during travel affect the gut microbiome in a way that modulates gastrointestinal function and susceptibility to infection, and discuss whether pre- or probiotics have any meaningful role in the prevention or treatment of TD. Finally, important research gaps and opportunities in this area are identified. PMID:27447891
Ivanov, J.; Miller, R.D.; Xia, J.; Steeples, D.; Park, C.B.
2005-01-01
In a set of two papers we study the inverse problem of refraction travel times. The purpose of this work is to use the study as a basis for development of more sophisticated methods for finding more reliable solutions to the inverse problem of refraction travel times, which is known to be nonunique. The first paper, "Types of Geophysical Nonuniqueness through Minimization," emphasizes the existence of different forms of nonuniqueness in the realm of inverse geophysical problems. Each type of nonuniqueness requires a different type and amount of a priori information to acquire a reliable solution. Based on such coupling, a nonuniqueness classification is designed. Therefore, since most inverse geophysical problems are nonunique, each inverse problem must be studied to define what type of nonuniqueness it belongs to and thus determine what type of a priori information is necessary to find a realistic solution. The second paper, "Quantifying Refraction Nonuniqueness Using a Three-layer Model," serves as an example of such an approach. However, its main purpose is to provide a better understanding of the inverse refraction problem by studying the type of nonuniqueness it possesses. An approach for obtaining a realistic solution to the inverse refraction problem is planned to be offered in a third paper that is in preparation. The main goal of this paper is to redefine the existing generalized notion of nonuniqueness and a priori information by offering a classified, discriminate structure. Nonuniqueness is often encountered when trying to solve inverse problems. However, possible nonuniqueness diversity is typically neglected and nonuniqueness is regarded as a whole, as an unpleasant "black box" and is approached in the same manner by applying smoothing constraints, damping constraints with respect to the solution increment and, rarely, damping constraints with respect to some sparse reference information about the true parameters. In practice, when solving geophysical
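The damping constraints "with respect to some sparse reference information" mentioned above can be shown in miniature with a damped least-squares inversion of a single underdetermined datum: infinitely many models fit it, and the a priori reference model selects one. All numbers below are illustrative, not from the papers:

```python
import numpy as np

# Nonuniqueness in miniature: one travel-time-like datum, two model cells.
# G m = d has infinitely many solutions; damping toward a reference model
# m_ref (the a priori information) selects one.
G = np.array([[1.0, 1.0]])          # ray samples both cells equally
d = np.array([2.0])                 # observed travel-time anomaly
m_ref = np.array([1.0, 0.0])        # a priori reference model
eps = 0.1                           # damping weight

# Minimize ||G m - d||^2 + eps^2 * ||m - m_ref||^2  (damped least squares)
A = G.T @ G + eps**2 * np.eye(2)
m = np.linalg.solve(A, G.T @ d + eps**2 * m_ref)

print(m, G @ m)   # fits the datum closely while keeping the prior's m1 - m2
```

The component of m along (1, -1) is invisible to G, so the solution inherits it entirely from m_ref; this is exactly the sense in which the type and amount of a priori information determine which of the nonunique solutions is returned.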
Voortman, Trudy; Leermakers, Elisabeth T M; Franco, Oscar H; Jaddoe, Vincent W V; Moll, Henriette A; Hofman, Albert; van den Hooven, Edith H; Kiefte-de Jong, Jessica C
2016-08-01
Dietary patterns have been linked to obesity in adults; however, little is known about this association in early childhood. We examined associations of different types of dietary patterns in 1-year-old children with body composition at school age in 2026 children participating in a population-based cohort study. Dietary intake at the age of 1 year was assessed with a food-frequency questionnaire. At the children's age of 6 years, we measured their body composition with dual-energy X-ray absorptiometry and calculated body mass index, fat mass index (FMI), and fat-free mass index (FFMI). Three dietary pattern approaches were used: (1) an a priori-defined diet quality score; (2) dietary patterns based on variation in food intake, derived from principal-component analysis (PCA); and (3) dietary patterns based on variations in FMI and FFMI, derived with reduced-rank regression (RRR). Both the a priori-defined diet score and a 'Health-conscious' PCA pattern were characterized by a high intake of fruit, vegetables, grains, and vegetable oils; after adjustment for confounders, children with higher adherence to these patterns had a higher FFMI at 6 years [0.19 SD (95% CI 0.08-0.30) per SD increase in diet score] but no difference in FMI. One of the two RRR patterns was also positively associated with FFMI and was characterized by intake of whole grains, pasta and rice, and vegetable oils. Our results suggest that different a priori- and a posteriori-derived health-conscious dietary patterns in early childhood are associated with a higher fat-free mass, but not with fat mass, in later childhood. PMID:27384175
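The PCA step in approach (2) can be sketched as follows. The data are invented for illustration (four food groups, random intakes); the study used a full food-frequency questionnaire with many more items, and real analyses typically rotate and inspect several components rather than keeping only the first.

```python
import numpy as np

# Each row is a child, each column a standardized food-group intake.
# The leading principal component defines a data-driven dietary "pattern"
# (its loadings), and each child's projection onto it is a pattern-adherence
# score that can then be related to outcomes such as FFMI.

rng = np.random.default_rng(1)
intakes = rng.normal(size=(100, 4))                         # 100 children x 4 food groups
X = (intakes - intakes.mean(axis=0)) / intakes.std(axis=0)  # standardize each food group

# PCA via eigendecomposition of the covariance matrix
cov = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order
pattern = eigvecs[:, -1]                 # loadings of the first (largest-variance) pattern
scores = X @ pattern                     # per-child pattern-adherence scores

print(pattern.shape, scores.shape)       # (4,) (100,)
```

This is what distinguishes the a posteriori approaches: the pattern is derived from the data (PCA maximizes intake variance; RRR instead maximizes variance explained in the outcomes), whereas the a priori diet score is defined before looking at the data.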
Management of travellers' diarrhoea.
Caeiro, J P; DuPont, H L
1998-07-01
The most common health problem encountered by international travellers to tropical and subtropical areas is diarrhoea. Even though it is not a life-threatening condition, it may deeply affect the quality of a vacation or the success of a business trip. The majority of cases of travellers' diarrhoea are due to bacterial pathogens, but viruses have also been implicated in a minority of patients. Travellers with diarrhoea are advised to provide themselves with sources of salt (crackers or soup) and mineral water to prevent and treat dehydration. Otherwise, treatment recommendations follow illness severity. For mild cases, symptomatic relief alone can be recommended; loperamide is an effective agent for improving diarrhoea and associated symptoms. For moderate diarrhoea (requiring a forced change in itinerary), combination therapy is advised using a fluoroquinolone together with loperamide. Severe diarrhoea [fever > 38 degrees C, dysentery (bloody stools) or incapacitating symptoms] should prompt the voyager to take an antibiotic alone for 3 to 5 days; loperamide is relatively contraindicated in these cases. For the minority of patients receiving chemoprophylaxis to prevent travellers' diarrhoea, fluoroquinolones taken once a day while in the area at risk produce the highest protection rate (up to 95%). However, most authorities do not recommend routine prophylaxis for travellers. PMID:9664200
Hawkes, S J; Hart, G J
1993-01-01
This is a review of recent publications on the subject of travel (taken in its widest sense) and HIV/AIDS. As with all epidemics caused by transmissible pathogens, AIDS has been seen in many countries as an imported problem. What this perspective fails to recognize is that with the explosion of international travel in the past thirty years it is virtually impossible to prevent the spread of infectious disease across international frontiers. Here we highlight the relative paucity of studies that describe or investigate the context in which sexual risk behaviour of travellers takes place, and suggest areas of further research which could increase understanding of the nature of sexual risk taking, and help in the design of health education programmes. PMID:8329484
[Viral hepatitis in travellers].
Abreu, Cândida
2007-01-01
Considering the geographically asymmetric distribution of viral hepatitis A, B and E, which have a much higher prevalence in the less developed world, travellers from developed countries are exposed to a considerable and often underestimated risk of hepatitis infection. In fact, a significant percentage of viral hepatitis occurring in developed countries is travel related. This results from globalization and increased mobility through tourism, international work, humanitarian and religious missions, and other travel-related activities. Several studies published in Europe and North America have shown that more than 50% of reported cases of hepatitis A are travel related. On the other hand, frequent outbreaks of hepatitis A and E in specific geographic areas raise the risk of infection in these restricted zones, which should be clearly identified. Selected aspects related to the distribution of hepatitis A, B and E are reviewed, particularly the situation in Portugal according to the published studies, as well as relevant clinical manifestations and differential diagnosis of viral hepatitis. Basic prevention rules for enterically transmitted hepatitis (hepatitis A and hepatitis E) and parenterally transmitted hepatitis (hepatitis B) are reviewed, as well as hepatitis A and B immunoprophylaxis. Common clinical situations and daily practice "pre-travel" advice issues are discussed according to WHO/CDC recommendations and the Portuguese National Vaccination Program. Implications of the near-future availability of a hepatitis E vaccine, currently in a phase 2 trial, are highlighted. Potential indications for travellers to endemic countries like India, Nepal and some regions of China, where up to 30% of sporadic cases of acute viral hepatitis are caused by hepatitis E virus, are considered. Continued epidemiological surveillance for viral hepatitis is essential to recognize and control possible outbreaks, but also to identify new viral hepatitis agents that may emerge as important global health
Walentiny, C
2009-03-01
The second trimester is the safest time for travelling, because the pregnant woman generally feels most at ease and the risk of spontaneous abortion and pre-term labour is very low. Possible risks must be discussed with the obstetrician before travelling. If the pregnancy is uncomplicated, most airlines allow flying up to the 36th (domestic flights) and 35th (international flights) week of gestation. Unless the fetal oxygen supply is already impaired at ground level due to an underlying disease, flying does not pose a risk of fetal hypoxia. Radiation exposure during a long-distance flight is low compared with the average annual exposure dose, but the risk of thrombosis is increased. Altitudes up to 2,500 m pose no problem. Sufficient time to acclimatize must be taken when travelling to high altitudes, and exercise should be kept to a minimum. Scuba diving is contraindicated. Since only a few drugs are completely safe during pregnancy, a thorough risk/benefit evaluation is mandatory. Treatment of infections can be considerably complicated, but any necessary treatment should not be withheld for fear of potential fetal injury. Good knowledge of local medical resources is essential before travelling. Several personal protective measures minimize the risk of infection: food and water precautions, protection from insect bites, and avoidance of crowds, unsafe sex and, if need be, freshwater. Many vaccinations are recommended for travellers; however, live vaccines are contraindicated in pregnant women because of theoretical considerations. Exceptionally, a yellow fever vaccination may be given after the first trimester. Killed, inactivated or polysaccharide vaccines can be given after the first trimester after a thorough risk/benefit evaluation. Because of the potentially devastating effects of malaria on the mother and the child, travel to endemic malaria regions should be avoided. If the risk of infection is high, chemoprophylaxis with mefloquine is indicated. In low
NASA Astrophysics Data System (ADS)
Koski, Olivia; Rosin, Mark; Guerilla Science Team
2014-03-01
The Intergalactic Travel Bureau is an interactive theater outreach experience that engages the public in the incredible possibilities of space tourism. The Bureau is staffed by professional actors, who play the role of space travel agents, and professional astrophysicists, who play the role of resident scientists. Members of the public of all ages were invited to visit with Bureau staff to plan the vacation of their dreams: a trip to space. We describe the project's successful nine-day run in New York in August 2013. Funded by the American Physical Society Public Outreach and Informing the Public Grants.
Marra, Nicholas J; Eo, Soo Hyung; Hale, Matthew C; Waser, Peter M; DeWoody, J Andrew
2012-12-01
One common goal in evolutionary biology is the identification of genes underlying adaptive traits of evolutionary interest. Recently next-generation sequencing techniques have greatly facilitated such evolutionary studies in species otherwise depauperate of genomic resources. Kangaroo rats (Dipodomys sp.) serve as exemplars of adaptation in that they inhabit extremely arid environments, yet require no drinking water because of ultra-efficient kidney function and osmoregulation. As a basis for identifying water conservation genes in kangaroo rats, we conducted a priori bioinformatics searches in model rodents (Mus musculus and Rattus norvegicus) to identify candidate genes with known or suspected osmoregulatory function. We then obtained 446,758 reads via 454 pyrosequencing to characterize genes expressed in the kidney of banner-tailed kangaroo rats (Dipodomys spectabilis). We also determined candidates a posteriori by identifying genes that were overexpressed in the kidney. The kangaroo rat sequences revealed nine different a priori candidate genes predicted from our Mus and Rattus searches, as well as 32 a posteriori candidate genes that were overexpressed in kidney. Mutations in two of these genes, Slc12a1 and Slc12a3, cause human renal diseases that result in the inability to concentrate urine. These genes are likely key determinants of physiological water conservation in desert rodents. PMID:22841684
Travel health. Part 1: preparing the tropical traveller.
Carroll, Bernadette; Daniel, Amanda; Behrens, Ron H
The health threats of modern-day travel change as population, wealth and tourism increase across the world. A series of three articles has been written to describe the spectrum of health issues associated with travel. Pre-travel health advice has become more focused on risk assessment and educating the traveller about infectious disease and the more frequent non-infectious hazards associated with travel, while ensuring they are not unnecessarily exposed to injury from vaccines and drugs. In part one, the role of the health advisor and the needs of the traveller are examined. The importance of risk assessment during a consultation is described, and factors that influence recommendations and prescribing are explored. As most travel-associated morbidity and mortality is not vaccine preventable, the focus of the pre-travel consultation should be on educating the traveller and influencing behaviour change. The second article in this series deals with the highest-risk group of travellers--residents who visit friends and relatives. It highlights their specific problems and special needs and how to influence their risk of disease by addressing their health beliefs and their cultural dimension of risk. The third article explores the common, and not so common, clinical problems found in returned travellers. Nurses have to deal with a large range of clinical problems and diagnostic dilemmas when attending to the returned traveller. The review provides a perspective on the frequency and severity of problems and how nurses should manage travel-associated disease. PMID:19062458
Teachers and Gypsy Travellers.
ERIC Educational Resources Information Center
Lloyd, Gwynedd; Stead, Joan; Jordan, Elizabeth; Norris, Claire
1999-01-01
Interviews in 12 Scottish schools examined how teachers and staff perceived and responded to the culture and behavior of Traveller children--both Gypsies and occupational migrants. The findings raise issues about "difference" versus deviance and the extent to which schools can accommodate cultural diversity when it challenges norms of behavior and…
... specified risk materials from animal feed and human food chains as of October 1, 2000; such bans had already been instituted in most member states. To reduce any risk of acquiring vCJD from food, concerned travelers to Europe or other areas with ...
ERIC Educational Resources Information Center
Ambrosino, Roberta
2009-01-01
A junior faculty member arrives at an unfamiliar university for a new teaching assignment. She is poised for the adventure, but feels like a traveler at the edge of a long, unknown road. She does not know what obstacles or vistas may appear on the road, and wants to avoid major potholes. She takes a nervous look around and finds an experienced…
Interstellar Travel without 'Magic'
NASA Astrophysics Data System (ADS)
Woodcock, G.
The possibility of interstellar space travel has become a popular subject. Distances of light years are an entirely new realm for human space travel, and new means of propulsion are needed. Speculation about propulsion has included "magic" such as space warps and faster-than-light travel; known physics such as antimatter, for which no practical implementation is known; and physics for which current research offers at least a hint of implementation, i.e., fusion. Performance estimates are presented for the latter and used to create vehicle concepts. Fusion propulsion will mean travel times of hundreds of years, so we adopt the "space colony" concepts of O'Neill as a ship design that could support a small civilization indefinitely; this provides the technical means. Economic reasoning is presented, arguing that development and production of "space colony" habitats for relief of Earth's population, with the addition of fusion engines, will lead to vessels that can go interstellar. Scenarios are presented and a speculative estimate of a timetable is given.