NASA Astrophysics Data System (ADS)
Houdayer, Cyril; Isono, Yusuke
2016-12-01
We investigate the asymptotic structure of (possibly type III) crossed product von Neumann algebras M = B ⋊ Γ arising from arbitrary actions Γ ↷ B of bi-exact discrete groups (e.g. free groups) on amenable von Neumann algebras. We prove a spectral gap rigidity result for the central sequence algebra N′ ∩ M^ω of any nonamenable von Neumann subalgebra with normal expectation N ⊂ M. We use this result to show that for any strongly ergodic essentially free nonsingular action Γ ↷ (X, μ) of any bi-exact countable discrete group on a standard probability space, the corresponding group measure space factor L^∞(X) ⋊ Γ has no nontrivial central sequence. Using recent results of Boutonnet et al. (Local spectral gap in simple Lie groups and applications, 2015), we construct, for every 0 < λ ≤ 1, a type III_λ strongly ergodic essentially free nonsingular action F_∞ ↷ (X_λ, μ_λ) of the free group F_∞ on a standard probability space so that the corresponding group measure space type III_λ factor L^∞(X_λ, μ_λ) ⋊ F_∞ has no nontrivial central sequence by our main result. In particular, we obtain the first examples of group measure space type III factors with no nontrivial central sequence.
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
2014-09-01
In quantum mechanics, the self-adjoint Hilbert space operators play a triple role as observables, generators of the dynamical groups and statistical operators defining the mixed states. One might expect that this is typical of Hilbert space quantum mechanics, but it is not. The same triple role occurs for the elements of a certain ordered Banach space in a much more general theory based upon quantum logics and a conditional probability calculus (which is a quantum logical model of the Lüders-von Neumann measurement process). It is shown how positive groups, automorphism groups, Lie algebras and statistical operators emerge from one major postulate - the non-existence of third-order interference (third-order interference and its impossibility in quantum mechanics were discovered by R. Sorkin in 1994). This again underlines the power of the combination of the conditional probability calculus with the postulate that there is no third-order interference. In two earlier papers, its impact on contextuality and nonlocality had already been revealed.
A Comparison of Approaches to Group Counseling.
ERIC Educational Resources Information Center
Zimpfer, David G.; And Others
This panel is based on the assumptions that: (1) group counseling has a valuable contribution to make, (2) group counseling is feasible in terms of time and space at local institutions, (3) group counseling is particularly concerned with affective material, and (4) group counseling probably cannot be conducted effectively in groups as large as 30.…
On the inequivalence of the CH and CHSH inequalities due to finite statistics
NASA Astrophysics Data System (ADS)
Renou, M. O.; Rosset, D.; Martin, A.; Gisin, N.
2017-06-01
Different variants of a Bell inequality, such as CHSH and CH, are known to be equivalent when evaluated on nonsignaling outcome probability distributions. However, in experimental setups, the outcome probability distributions are estimated using a finite number of samples. Therefore the nonsignaling conditions are only approximately satisfied and the robustness of the violation depends on the chosen inequality variant. We explain that phenomenon using the decomposition of the space of outcome probability distributions under the action of the symmetry group of the scenario, and propose a method to optimize the statistical robustness of a Bell inequality. In the process, we describe the finite group composed of relabeling of parties, measurement settings and outcomes, and identify correspondences between the irreducible representations of this group and properties of outcome probability distributions such as normalization, signaling or having uniform marginals.
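To make the distinction concrete, here is a minimal Python sketch (not the authors' code; the probability-table layout and index conventions are assumptions) that evaluates the CHSH and CH forms of the inequality on an outcome probability table P(a, b | x, y); for exactly nonsignaling distributions the two are related by CH = (CHSH − 2)/4, which is the equivalence the abstract refers to.

```python
import numpy as np

# Sketch: evaluate CHSH and CH on a table p[a, b, x, y] = P(a, b | x, y),
# with binary outcomes a, b and binary settings x, y (index order assumed).

def correlator(p, x, y):
    """E(x, y) = sum_{a,b} (-1)^(a+b) P(a, b | x, y)."""
    signs = np.array([[1, -1], [-1, 1]])
    return float(np.sum(signs * p[:, :, x, y]))

def chsh(p):
    """CHSH = E(0,0) + E(0,1) + E(1,0) - E(1,1); local bound 2."""
    return (correlator(p, 0, 0) + correlator(p, 0, 1)
            + correlator(p, 1, 0) - correlator(p, 1, 1))

def ch(p):
    """CH = P(00|00) + P(00|01) + P(00|10) - P(00|11) - P_A(0|0) - P_B(0|0)."""
    pa0 = p[0, :, 0, 0].sum()  # marginal P_A(a=0 | x=0), computed at y=0
    pb0 = p[:, 0, 0, 0].sum()  # marginal P_B(b=0 | y=0), computed at x=0
    return (p[0, 0, 0, 0] + p[0, 0, 0, 1] + p[0, 0, 1, 0]
            - p[0, 0, 1, 1] - pa0 - pb0)

# Uniformly random uncorrelated outcomes: CHSH = 0 and CH = -0.5 = (0 - 2)/4.
p_uniform = np.full((2, 2, 2, 2), 0.25)
print(chsh(p_uniform), ch(p_uniform))
```

With finite statistics the estimated table is only approximately nonsignaling, so the two marginals used in ch() need no longer agree across settings and the CHSH/CH equivalence breaks down, which is the effect the paper analyzes.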
Use of uninformative priors to initialize state estimation for dynamical systems
NASA Astrophysics Data System (ADS)
Worthy, Johnny L.; Holzinger, Marcus J.
2017-10-01
The admissible region must be expressed probabilistically in order to be used in Bayesian estimation schemes. When treated as a probability density function (PDF), a uniform admissible region can be shown to have non-uniform probability density after a transformation. An alternative approach can be used to express the admissible region probabilistically according to the Principle of Transformation Groups. This paper uses a fundamental multivariate probability transformation theorem to show that regardless of which state space an admissible region is expressed in, the probability density must remain the same under the Principle of Transformation Groups. The admissible region can be shown to be analogous to an uninformative prior with a probability density that remains constant under reparameterization. This paper introduces requirements on how these uninformative priors may be transformed and used for state estimation and the difference in results when initializing an estimation scheme via a traditional transformation versus the alternative approach.
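The transformation behavior the abstract describes follows from the standard change-of-variables theorem for densities. A minimal sketch under assumed toy choices (a density uniform in polar coordinates, re-expressed in Cartesian coordinates):

```python
import numpy as np

# Sketch: if y = g(x) is invertible, p_Y(y) = p_X(g^{-1}(y)) |det J_{g^{-1}}(y)|.
# Toy example: a density uniform in polar coordinates (r, theta), re-expressed
# in Cartesian (x, y), where the Jacobian contributes a factor 1/r.

def uniform_polar_density(r, r_max=1.0):
    """Uniform density on (r, theta) in [0, r_max] x [0, 2*pi)."""
    return np.where((r >= 0) & (r <= r_max), 1.0 / (r_max * 2.0 * np.pi), 0.0)

def cartesian_density(x, y, r_max=1.0):
    """The same distribution expressed in (x, y) coordinates."""
    r = np.hypot(x, y)
    with np.errstate(divide="ignore"):
        return uniform_polar_density(r, r_max) / r

# Non-uniform in (x, y) even though uniform in (r, theta): the density grows
# toward the origin, illustrating the abstract's point about admissible regions.
print(cartesian_density(0.5, 0.0), cartesian_density(0.1, 0.0))
```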
NASA Astrophysics Data System (ADS)
Gürbüz, Ramazan
2010-09-01
The purpose of this study is to investigate and compare the effects of activity-based and traditional instruction on students' conceptual development of certain probability concepts. The study was conducted using a pretest-posttest control group design with 80 seventh graders. A developed 'Conceptual Development Test' comprising 12 open-ended questions was administered to both groups of students before and after the intervention. The data were analysed using analysis of covariance, with the pretest as covariate. The results revealed that activity-based instruction (ABI) outperformed the traditional counterpart in the development of probability concepts. Furthermore, ABI was found to contribute to students' conceptual development most for the concept of 'Probability of an Event' and least for the concept of 'Sample Space'. As a consequence, it can be deduced that the designed instructional process was effective in the instruction of probability concepts.
How MAG4 Improves Space Weather Forecasting
NASA Technical Reports Server (NTRS)
Falconer, David; Khazanov, Igor; Barghouty, Nasser
2013-01-01
Dangerous space weather is driven by solar flares and Coronal Mass Ejections (CMEs). Forecasting flares and CMEs is the first step to forecasting either dangerous space weather or All Clear. MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory), and automatically forecasts the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events.
NASA Technical Reports Server (NTRS)
Markley, F. Landis
2005-01-01
A new method is presented for the simultaneous estimation of the attitude of a spacecraft and an N-vector of bias parameters. This method uses a probability distribution function defined on the Cartesian product of SO(3), the group of rotation matrices, and the Euclidean space ℝ^N. The Fokker-Planck equation propagates the probability distribution function between measurements, and Bayes's formula incorporates measurement update information. This approach avoids all the issues of singular attitude representations or singular covariance matrices encountered in extended Kalman filters. In addition, the filter has a consistent initialization for a completely unknown initial attitude, owing to the fact that SO(3) is a compact space.
Algorithmic universality in F-theory compactifications
NASA Astrophysics Data System (ADS)
Halverson, James; Long, Cody; Sung, Benjamin
2017-12-01
We study universality of geometric gauge sectors in the string landscape in the context of F-theory compactifications. A finite time construction algorithm is presented for 4/3 × 2.96 × 10^755 F-theory geometries that are connected by a network of topological transitions in a connected moduli space. High probability geometric assumptions uncover universal structures in the ensemble without explicitly constructing it. For example, non-Higgsable clusters of seven-branes with intricate gauge sectors occur with a probability above 1 − 1.01 × 10^−755, and the geometric gauge group rank is above 160 with probability 0.999995. In the latter case there are at least 10 E8 factors, the structure of which fixes the gauge groups on certain nearby seven-branes. Visible sectors may arise from E6 or SU(3) seven-branes, which occur in certain random samples with probability ≃ 1/200.
An introduction to data reduction: space-group determination, scaling and intensity statistics.
Evans, Philip R
2011-04-01
This paper presents an overview of how to run the CCP4 programs for data reduction (SCALA, POINTLESS and CTRUNCATE) through the CCP4 graphical interface ccp4i and points out some issues that need to be considered, together with a few examples. It covers determination of the point-group symmetry of the diffraction data (the Laue group), which is required for the subsequent scaling step, examination of systematic absences, which in many cases will allow inference of the space group, putting multiple data sets on a common indexing system when there are alternatives, the scaling step itself, which produces a large set of data-quality indicators, estimation of |F| from intensity and finally examination of intensity statistics to detect crystal pathologies such as twinning. An appendix outlines the scoring schemes used by the program POINTLESS to assign probabilities to possible Laue and space groups.
Microbiology studies in the Space Shuttle
NASA Technical Reports Server (NTRS)
Taylor, G. R.
1976-01-01
Past space microbiology studies have evaluated three general areas: microbe detection in extraterrestrial materials; monitoring of autoflora and medically important species on crewmembers, equipment, and cabin air; and in vitro evaluations of isolated terrestrial species carried on manned and unmanned spaceflights. These areas are briefly reviewed to establish a basis for presenting probable experiment subjects applicable to the Space Shuttle era. Most extraterrestrial life detection studies involve visitations to other heavenly bodies. Although this is not applicable to the first series of Shuttle flights, attempts to capture meteors and spores in space could be important. Human pathogen and autoflora monitoring will become more important with increased variety among crewmembers. Inclusion of contaminated animal and plant specimens in the space lab will necessitate inflight evaluation of cross-contamination and infection potentials. The majority of Shuttle microbiology studies will doubtless fall into the third study area. Presence of a space lab will permit a whole range of experimentation under conditions similar to those experienced in earth-based laboratories. The recommendations of various study groups are analyzed, and probable inflight microbiological experiment areas are identified for the Life Sciences Shuttle Laboratory.
Transition probability spaces in loop quantum gravity
NASA Astrophysics Data System (ADS)
Guo, Xiao-Kan
2018-03-01
We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.
Shao, Jing-Yuan; Qu, Hai-Bin; Gong, Xing-Chu
2018-05-01
In this work, two algorithms for design space calculation (the overlapping method and the probability-based method) were compared using data collected from the extraction process of Codonopsis Radix as an example. In the probability-based method, experimental error was simulated to calculate the probability of reaching the standard. The effects of several parameters on the calculated design space were studied, including the number of simulations, the step length, and the acceptable probability threshold. For the extraction process of Codonopsis Radix, 10 000 simulations and a calculation step length of 0.02 led to a satisfactory design space. In general, the overlapping method is easy to understand and can be realized by several kinds of commercial software without coding, but the reliability of the process evaluation indexes when operating in the design space is not indicated. The probability-based method is computationally complex, but can provide the reliability to ensure that the process indexes reach the standard within the acceptable probability threshold. In addition, there is no abrupt change in probability at the edge of the design space with the probability-based method. Therefore, the probability-based method is recommended for design space calculation. Copyright © by the Chinese Pharmaceutical Association.
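As an illustration of the probability-based method, here is a minimal Python sketch with an invented process model and invented specification (the abstract does not give the actual model for the Codonopsis Radix extraction): each grid point is accepted into the design space when the simulated probability of meeting the standard exceeds the threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def yield_model(temp, time):
    """Hypothetical process model: predicted extraction yield (%)."""
    return 50 + 0.3 * (temp - 80) + 2.0 * time - 0.05 * time ** 2

def prob_meeting_standard(temp, time, spec=60.0, sigma=1.5, n_sim=10_000):
    """Monte Carlo probability that yield >= spec, given error sigma."""
    simulated = yield_model(temp, time) + rng.normal(0.0, sigma, n_sim)
    return float(np.mean(simulated >= spec))

# Scan a parameter grid (coarser than the paper's 0.02 step, for brevity)
# and keep points whose probability exceeds the acceptable threshold.
temps = np.arange(70, 95, 1.0)
times = np.arange(1, 10, 0.5)
threshold = 0.9
design_space = [(T, t) for T in temps for t in times
                if prob_meeting_standard(T, t) >= threshold]
print(f"{len(design_space)} grid points meet the 0.9 probability threshold")
```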
On Replacing "Quantum Thinking" with Counterfactual Reasoning
NASA Astrophysics Data System (ADS)
Narens, Louis
The probability theory used in quantum mechanics is currently being employed by psychologists to model the impact of context on decision making. Its event space consists of closed subspaces of a Hilbert space, and its probability function sometimes violates the law of finite additivity of probabilities. Results from the quantum mechanics literature indicate that such a "Hilbert space probability theory" cannot be extended in a useful way to standard, finitely additive, probability theory by the addition of new events with specific probabilities. This chapter presents a new kind of probability theory that shares many fundamental algebraic characteristics with Hilbert space probability theory but does extend to standard probability theory by adjoining new events with specific probabilities. The new probability theory arises from considerations about how psychological experiments are related through counterfactual reasoning.
Park, Wooram; Liu, Yan; Zhou, Yu; Moses, Matthew; Chirikjian, Gregory S
2008-04-11
A nonholonomic system subjected to external noise from the environment, or internal noise in its own actuators, will evolve in a stochastic manner described by an ensemble of trajectories. This ensemble of trajectories is equivalent to the solution of a Fokker-Planck equation that typically evolves on a Lie group. If the most likely state of such a system is to be estimated, and plans for subsequent motions from the current state are to be made so as to move the system to a desired state with high probability, then modeling how the probability density of the system evolves is critical. Methods for solving Fokker-Planck equations that evolve on Lie groups then become important. Such equations can be solved using the operational properties of group Fourier transforms in which irreducible unitary representation (IUR) matrices play a critical role. Therefore, we develop a simple approach for the numerical approximation of all the IUR matrices for two of the groups of most interest in robotics: the rotation group in three-dimensional space, SO(3), and the Euclidean motion group of the plane, SE(2). This approach uses the exponential mapping from the Lie algebras of these groups, and takes advantage of the sparse nature of the Lie algebra representation matrices. Other techniques for density estimation on groups are also explored. The computed densities are applied in the context of probabilistic path planning for a kinematic cart in the plane and flexible needle steering in three-dimensional space. In these examples the injection of artificial noise into the computational models (rather than noise in the actual physical systems) serves as a tool to search the configuration spaces and plan paths. Finally, we illustrate how density estimation problems arise in the characterization of physical noise in orientational sensors such as gyroscopes.
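One core ingredient named in the abstract is the exponential mapping from a Lie algebra to its group. A minimal sketch (not the paper's IUR code) of the map so(3) → SO(3), the rotation-group case:

```python
import numpy as np
from scipy.linalg import expm

# Sketch: the exponential map from the Lie algebra so(3) (sparse
# skew-symmetric matrices) to the rotation group SO(3). The same idea
# underlies the paper's numerical approximation of the IUR matrices.

def so3_hat(omega):
    """Map a vector omega in R^3 to a skew-symmetric matrix in so(3)."""
    wx, wy, wz = omega
    return np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])

def exp_so3(omega):
    """Exponential map: so(3) -> SO(3)."""
    return expm(so3_hat(omega))

R = exp_so3(np.array([0.0, 0.0, np.pi / 2]))  # 90-degree rotation about z
print(np.round(R, 6))
print("R^T R = I:", np.allclose(R.T @ R, np.eye(3)))
```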
Quantum mechanics of Klein-Gordon fields I: Hilbert Space, localized states, and chiral symmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mostafazadeh, A.; Zamani, F.
2006-09-15
We derive an explicit manifestly covariant expression for the most general positive-definite and Lorentz-invariant inner product on the space of solutions of the Klein-Gordon equation. This expression involves a one-parameter family of conserved current densities J_a^μ, with a ∈ (−1, 1), that are analogous to the chiral current density for spin-half fields. The conservation of J_a^μ is related to a global gauge symmetry of the Klein-Gordon fields whose gauge group is U(1) for rational a and the multiplicative group of positive real numbers for irrational a. We show that the associated gauge symmetry is responsible for the conservation of the total probability of the localization of the field in space. This provides a simple resolution of the paradoxical situation resulting from the fact that the probability current density for free scalar fields is neither covariant nor conserved. Furthermore, we discuss the implications of our approach for free real scalar fields offering a direct proof of the uniqueness of the relativistically invariant positive-definite inner product on the space of real Klein-Gordon fields. We also explore an extension of our results to scalar fields minimally coupled to an electromagnetic field.
Brick tunnel randomization and the momentum of the probability mass.
Kuznetsova, Olga M
2015-12-30
The allocation space of an unequal-allocation permuted block randomization can be quite wide. The development of unequal-allocation procedures with a narrower allocation space, however, is complicated by the need to preserve the unconditional allocation ratio at every step (the allocation ratio preserving (ARP) property). When the allocation paths are depicted on the K-dimensional unitary grid, where allocation to the l-th treatment is represented by a step along the l-th axis, l = 1 to K, the ARP property can be expressed in terms of the center of the probability mass after i allocations. Specifically, for an ARP allocation procedure that randomizes subjects to K treatment groups in w_1 : ⋯ : w_K ratio, w_1 + ⋯ + w_K = 1, the coordinates of the center of the mass are (w_1 i, …, w_K i). In this paper, the momentum with respect to the center of the probability mass (expected imbalance in treatment assignments) is used to compare ARP procedures in how closely they approximate the target allocation ratio. It is shown that the two-arm and three-arm brick tunnel randomizations (BTR) are the ARP allocation procedures with the tightest allocation space among all allocation procedures with the same allocation ratio; the two-arm BTR is the minimum-momentum two-arm ARP allocation procedure. Resident probabilities of two-arm and three-arm BTR are analytically derived from the coordinates of the center of the probability mass; the existence of the respective transition probabilities is proven. Probability of deterministic assignments with BTR is found generally acceptable. Copyright © 2015 John Wiley & Sons, Ltd.
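A minimal simulation sketch (a toy illustration; the paper derives these quantities analytically) of the expected imbalance of an unequal-allocation permuted block design about the ARP center of mass (w_1 i, …, w_K i):

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch: estimate the expected imbalance E||N(i) - i*w|| of the running
# treatment counts N(i) about the ARP center (w_1*i, ..., w_K*i), here for
# a 2:3 permuted block design (block size 5), which is ARP.

def simulate_counts(w_counts, n_blocks, n_paths):
    """Running treatment counts along simulated permuted-block paths."""
    k = len(w_counts)
    block = np.repeat(np.arange(k), w_counts)
    paths = np.empty((n_paths, n_blocks * block.size, k))
    for j in range(n_paths):
        seq = np.concatenate([rng.permutation(block) for _ in range(n_blocks)])
        one_hot = np.eye(k)[seq]               # one row per allocation
        paths[j] = np.cumsum(one_hot, axis=0)  # N(i) after each allocation
    return paths

w_counts = np.array([2, 3])                    # target ratio 2:3
w = w_counts / w_counts.sum()
paths = simulate_counts(w_counts, n_blocks=4, n_paths=2000)
steps = np.arange(1, paths.shape[1] + 1)
imbalance = np.linalg.norm(paths - steps[:, None] * w, axis=2).mean(axis=0)
print(np.round(imbalance, 3))                  # zero at each block boundary
```

Comparing this curve across procedures (permuted blocks versus BTR, say) is the kind of momentum comparison the paper formalizes.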
MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity
NASA Technical Reports Server (NTRS)
Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor
2014-01-01
MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory), and automatically forecasts the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events. MAG4 does not forecast that a flare will occur at a particular time in the next 24 or 48 hours; rather, it forecasts the probability of one occurring.
On quantum symmetries of compact metric spaces
NASA Astrophysics Data System (ADS)
Chirvasitu, Alexandru
2015-08-01
An action of a compact quantum group on a compact metric space (X , d) is (D)-isometric if the distance function is preserved by a diagonal action on X × X. In this study, we show that an isometric action in this sense has the following additional property: the corresponding action on the algebra of continuous functions on X by the convolution semigroup of probability measures on the quantum group contracts Lipschitz constants. In other words, it is isometric in another sense due to Li, Quaegebeur, and Sabbe, which partially answers a question posed by Goswami. We also introduce other possible notions of isometric quantum actions in terms of the Wasserstein p-distances between probability measures on X for p ≥ 1, which are used extensively in optimal transportation. Indeed, all of these definitions of quantum isometry belong to a hierarchy of implications, where the two described above lie at the extreme ends of the hierarchy. We conjecture that they are all equivalent.
2017-03-07
You probably don’t know what you’ll be doing six months from today, but there’s a group at NASA’s Marshall Space Flight Center in Huntsville, Alabama, that’s making just such a plan for scientific research on the International Space Station. Learn how these men and women map out science activity for the crew in space to support the cutting-edge research now underway that’s benefitting life on Earth. For more on ISS science, visit us online: https://www.nasa.gov/mission_pages/station/research/index.html
Working group report on advanced high-voltage high-power and energy-storage space systems
NASA Technical Reports Server (NTRS)
Cohen, H. A.; Cooke, D. L.; Evans, R. W.; Hastings, D.; Jongeward, G.; Laframboise, J. G.; Mahaffey, D.; Mcintyre, B.; Pfizer, K. A.; Purvis, C.
1986-01-01
Space systems in the future will probably include high-voltage, high-power energy-storage and -production systems. Two such technologies are high-voltage ac and dc systems and high-power electrodynamic tethers. The working group identified several plasma interaction phenomena that will occur in the operation of these power systems. The working group felt that building an understanding of these critical interaction issues required filling several gaps in our knowledge, although certain aspects of dc power systems have become fairly well understood; examples are current collection in quiescent plasmas and snapover effects. However, high-voltage dc and almost all ac phenomena are, at best, inadequately understood. In addition, there is major uncertainty in the knowledge of coupling between plasmas and large scale current flows in space plasmas. These gaps in knowledge are addressed.
Wong, Henry; Collins, Jill; Tinsley, David; Sandler, Jonathan; Benson, Philip
2013-01-01
Objective: To investigate the effect of bracket–ligature combination on the amount of orthodontic space closure over three months. Design: Randomized clinical trial with three parallel groups. Setting: A hospital orthodontic department (Chesterfield Royal Hospital, UK). Participants: Forty-five patients requiring upper first premolar extractions. Methods: Informed consent was obtained and participants were randomly allocated into one of three groups: (1) conventional pre-adjusted edgewise brackets and elastomeric ligatures; (2) conventional pre-adjusted edgewise brackets and Super Slick® low friction elastomeric ligatures; (3) Damon 3MX® passive self-ligating brackets. Space closure was undertaken on 0.019 × 0.025-inch stainless steel archwires with nickel–titanium coil springs. Participants were recalled at four weekly intervals. Upper alginate impressions were taken at each visit (maximum three). The primary outcome measure was the mean amount of space closure in a 3-month period. Results: A one-way ANOVA was undertaken [dependent variable: mean space closure (mm); independent variable: group allocation]. The amount of space closure was very similar between the three groups (1 mm per 28 days); however, there was a wide variation in the rate of space closure between individuals. The differences in the amount of space closure over three months between the three groups was very small and non-significant (P = 0.718). Conclusion: The hypothesis that reducing friction by modifying the bracket/ligature interface increases the rate of space closure was not supported. The major determinant of orthodontic tooth movement is probably the individual patient response. PMID:23794696
Alternative probability theories for cognitive psychology.
Narens, Louis
2014-01-01
Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.
Is the local linearity of space-time inherited from the linearity of probabilities?
NASA Astrophysics Data System (ADS)
Müller, Markus P.; Carrozza, Sylvain; Höhn, Philipp A.
2017-02-01
The appearance of linear spaces, describing physical quantities by vectors and tensors, is ubiquitous in all of physics, from classical mechanics to the modern notion of local Lorentz invariance. However, as natural as this seems to the physicist, most computer scientists would argue that something like a ‘local linear tangent space’ is not very typical and in fact a quite surprising property of any conceivable world or algorithm. In this paper, we take the perspective of the computer scientist seriously, and ask whether there could be any inherently information-theoretic reason to expect this notion of linearity to appear in physics. We give a series of simple arguments, spanning quantum information theory, group representation theory, and renormalization in quantum gravity, that supports a surprising thesis: namely, that the local linearity of space-time might ultimately be a consequence of the linearity of probabilities. While our arguments involve a fair amount of speculation, they have the virtue of being independent of any detailed assumptions on quantum gravity, and they are in harmony with several independent recent ideas on emergent space-time in high-energy physics.
VizieR Online Data Catalog: Catalog of Suspected Nearby Young Stars (Riedel+, 2017)
NASA Astrophysics Data System (ADS)
Riedel, A. R.; Blunt, S. C.; Lambrides, E. L.; Rice, E. L.; Cruz, K. L.; Faherty, J. K.
2018-04-01
LocAting Constituent mEmbers In Nearby Groups (LACEwING) is a frequentist observation space kinematic moving group identification code. Using the spatial and kinematic information available about a target object (α, δ, Dist, μα, μδ, and γ), it determines the probability that the object is a member of each of the known nearby young moving groups (NYMGs). As with other moving group identification codes, LACEwING is capable of estimating memberships for stars with incomplete kinematic and spatial information. (2 data files).
Applicability of the Moyers' Probability Tables in Adolescents with Different Facial Biotypes
Carrillo, Jorge J. Pavani; Rubial, Maria C.; Albornoz, Cristina; Villalba, Silvina; Damiani, Patricia; de Cravero, Marta Rugani
2017-01-01
Introduction: The Moyers' probability tables are used in mixed dentition analysis to estimate the extent of space required for the alignment of canines and premolars, by correlating the mesiodistal size of lower incisors with the size of permanent canines and premolars. Objective: This study intended to evaluate the applicability of the Moyers' probability tables for predicting the mesiodistal space needed for the correct location of non-erupted permanent canines and premolars, in adolescents of the city of Cordoba, Argentina, who show different facial biotypes. Materials and Methods: Models and tele-radiographies of 478 adolescents of both genders from 10 to 15 years of age were analyzed. The tele-radiographies were measured manually in order to determine the facial biotype. The models were scanned with a calibrated scanner (HP 3670) and measured by using Image Pro Plus 4.5 software. Results: According to this study, the comparison between the Moyers' probability table and the table created at the National University of Córdoba (UNC) (at 95%, 75%, and 50%) shows that, in both tables, a higher value of mesiodistal width of lower incisors corresponds to a bigger difference in the space needed for permanent canines and premolars, the need for space for permanent canines and premolars being greater in the UNC table. On the other hand, when contrasting the values of mesiodistal space for permanent canines and premolars associated with each facial biotype, the discrepancies between groups were not statistically significant (P > 0.05). However, we found differences in the size of the space required according to the mesiodistal width range of the lower incisors for each biotype: a) In the comparison of lower-range values, with a mesiodistal width of lower incisors less than 22 mm, the space required for permanent canines and premolars was smaller in patients with dolichofacial biotype than in patients with mesofacial and brachyfacial biotypes; the latter biotypes show meager differences between them. b) The comparison of mid-range values, with a mesiodistal width of lower incisors from 22 to 25 millimeters, shows that the values of required alignment space are similar in the three facial biotypes. c) Finally, the comparison of upper-range values, with a mesiodistal width of lower incisors greater than 25 millimeters, indicates that the space required for dolichofacial biotypes tends to be higher than in mesofacial and brachyfacial biotypes. Conclusion: The Moyers' probability tables should be created to meet the needs of the population under study, with no consideration of patients' facial biotypes. PMID:28567145
A probability space for quantum models
NASA Astrophysics Data System (ADS)
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, choosing the constraints and making the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
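A minimal sketch of the maximum entropy assignment described above, for an assumed four-outcome example with a single average constraint (the constraint choice is illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# Sketch: assign probabilities to a finite set of outcomes by maximizing
# entropy subject to normalization and a fixed average.

outcomes = np.array([0.0, 1.0, 2.0, 3.0])  # e.g., an occupation number
target_mean = 1.2                          # constraint: <n> = 1.2

def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},               # normalization
    {"type": "eq", "fun": lambda p: p @ outcomes - target_mean},  # average
]
res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(0, 1)] * 4, constraints=constraints)
print("maximum-entropy probabilities:", np.round(res.x, 4))
# The result is the exponential family p_i ∝ exp(-λ n_i), which is how
# Boltzmann-type distributions arise from constraints.
```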
Probability of satellite collision
NASA Technical Reports Server (NTRS)
Mccarter, J. W.
1972-01-01
A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
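The simplest version of such a computation (a standard flux-times-area Poisson model with invented numbers, not the memo's actual method) looks like:

```python
import numpy as np

# Sketch: collision probability for a spacecraft exposed to a debris flux F
# over mission time T with collision cross-section A, assuming
# Poisson-distributed encounters. All numbers below are hypothetical.

def collision_probability(flux_per_m2_yr, area_m2, years):
    """P(at least one collision) = 1 - exp(-F * A * T)."""
    return 1.0 - np.exp(-flux_per_m2_yr * area_m2 * years)

# Hypothetical values: flux of 1e-5 impacts / m^2 / yr, 1000 m^2 station
# cross-section, 10-year mission.
print(f"P(collision) = {collision_probability(1e-5, 1000.0, 10.0):.3f}")
```

The parametric study in the report varies quantities playing the roles of these inputs (altitude, inclination, miss distance, mission duration).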
THE SEMIGROUP OF METRIC MEASURE SPACES AND ITS INFINITELY DIVISIBLE PROBABILITY MEASURES
EVANS, STEVEN N.; MOLCHANOV, ILYA
2015-01-01
A metric measure space is a complete, separable metric space equipped with a probability measure that has full support. Two such spaces are equivalent if they are isometric as metric spaces via an isometry that maps the probability measure on the first space to the probability measure on the second. The resulting set of equivalence classes can be metrized with the Gromov–Prohorov metric of Greven, Pfaffelhuber and Winter. We consider the natural binary operation ⊞ on this space that takes two metric measure spaces and forms their Cartesian product equipped with the sum of the two metrics and the product of the two probability measures. We show that the metric measure spaces equipped with this operation form a cancellative, commutative, Polish semigroup with a translation invariant metric. There is an explicit family of continuous semicharacters that is extremely useful for, inter alia, establishing that there are no infinitely divisible elements and that each element has a unique factorization into prime elements. We investigate the interaction between the semigroup structure and the natural action of the positive real numbers on this space that arises from scaling the metric. For example, we show that for any given positive real numbers a, b, c the trivial space is the only space X that satisfies aX ⊞ bX = cX. We establish that there is no analogue of the law of large numbers: if X1, X2, … is an identically distributed independent sequence of random spaces, then no subsequence of (1/n) ⊞_{k=1}^{n} Xk converges in distribution unless each Xk is almost surely equal to the trivial space. We characterize the infinitely divisible probability measures and the Lévy processes on this semigroup, characterize the stable probability measures and establish a counterpart of the LePage representation for the latter class. PMID:28065980
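A minimal sketch of the operation ⊞ for finite metric measure spaces represented as a distance matrix plus a probability vector (the paper works with general spaces; this finite representation is an illustrative assumption):

```python
import numpy as np

# Sketch: X1 ⊞ X2 is the Cartesian product with the SUM of the two metrics
# and the PRODUCT of the two probability measures.

def boxplus(d1, p1, d2, p2):
    n, m = len(p1), len(p2)
    # d((x1, x2), (y1, y2)) = d1(x1, y1) + d2(x2, y2)
    d = d1[:, None, :, None] + d2[None, :, None, :]
    d = d.reshape(n * m, n * m)
    p = np.outer(p1, p2).reshape(n * m)  # product measure
    return d, p

# Two-point space with distance 1 and uniform measure:
d1 = np.array([[0.0, 1.0], [1.0, 0.0]])
p1 = np.array([0.5, 0.5])
d, p = boxplus(d1, p1, d1, p1)
print(d)  # 4x4 distance matrix of the sum-metric product space
print(p)  # uniform measure on 4 points
```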
Across Space and Time: Infants Learn from Backward and Forward Visual Statistics
ERIC Educational Resources Information Center
Tummeltshammer, Kristen; Amso, Dima; French, Robert M.; Kirkham, Natasha Z.
2017-01-01
This study investigates whether infants are sensitive to backward and forward transitional probabilities within temporal and spatial visual streams. Two groups of 8-month-old infants were familiarized with an artificial grammar of shapes, comprising backward and forward base pairs (i.e. two shapes linked by strong backward or forward transitional…
Sensitivity study of Space Station Freedom operations cost and selected user resources
NASA Technical Reports Server (NTRS)
Accola, Anne; Fincannon, H. J.; Williams, Gregory J.; Meier, R. Timothy
1990-01-01
The results of sensitivity studies performed to estimate probable ranges for four key Space Station parameters using the Space Station Freedom's Model for Estimating Space Station Operations Cost (MESSOC) are discussed. The variables examined are grouped into five main categories: logistics, crew, design, space transportation system, and training. The modification of these variables implies programmatic decisions in areas such as orbital replacement unit (ORU) design, investment in repair capabilities, and crew operations policies. The model utilizes a wide range of algorithms and an extensive trial logistics data base to represent Space Station operations. The trial logistics data base consists largely of a collection of the ORUs that comprise the mature station, and their characteristics based on current engineering understanding of the Space Station. A nondimensional approach is used to examine the relative importance of variables on parameters.
Quantum mechanics on phase space: The hydrogen atom and its Wigner functions
NASA Astrophysics Data System (ADS)
Campos, P.; Martins, M. G. R.; Fernandes, M. C. B.; Vianna, J. D. M.
2018-03-01
Symplectic quantum mechanics (SQM) considers a non-commutative algebra of functions on a phase space Γ and an associated Hilbert space HΓ, to construct a unitary representation for the Galilei group. From this unitary representation the Schrödinger equation is rewritten in phase space variables and the Wigner function can be derived without the use of the Liouville-von Neumann equation. In this article the Coulomb potential in three dimensions (3D) is solved completely by using the phase space Schrödinger equation. The Kustaanheimo-Stiefel (KS) transformation is applied and the Coulomb and harmonic oscillator potentials are connected. In this context we determine the energy levels, the probability amplitudes in phase space and the corresponding Wigner quasi-distribution functions of the 3D hydrogen atom described by the Schrödinger equation in phase space.
Statistical Short-Range Guidance for Peak Wind Speed Forecasts at Edwards Air Force Base, CA
NASA Technical Reports Server (NTRS)
Dreher, Joseph; Crawford, Winifred; Lafosse, Richard; Hoeth, Brian; Burns, Kerry
2008-01-01
The peak winds near the surface are an important forecast element for Space Shuttle landings. As defined in the Shuttle Flight Rules (FRs), there are peak wind thresholds that cannot be exceeded in order to ensure the safety of the shuttle during landing operations. The National Weather Service Spaceflight Meteorology Group (SMG) is responsible for weather forecasts for all shuttle landings. They indicate peak winds are a challenging parameter to forecast. To alleviate the difficulty in making such wind forecasts, the Applied Meteorology Unit (AMU) developed a personal computer-based graphical user interface (GUI) for displaying peak wind climatology and probabilities of exceeding peak-wind thresholds for the Shuttle Landing Facility (SLF) at Kennedy Space Center. However, the shuttle must land at Edwards Air Force Base (EAFB) in southern California when weather conditions at Kennedy Space Center in Florida are not acceptable, so SMG forecasters requested that a similar tool be developed for EAFB. Marshall Space Flight Center (MSFC) personnel archived and performed quality control of 2-minute average and 10-minute peak wind speeds at each tower adjacent to the main runway at EAFB from 1997-2004. They calculated wind climatologies and probabilities of peak wind occurrence based on the average speed. The climatologies were calculated for each tower and month, and were stratified by hour, direction, and direction/hour. For the probabilities of peak wind occurrence, MSFC calculated empirical and modeled probabilities of meeting or exceeding specific 10-minute peak wind speeds using probability density functions. The AMU obtained and reformatted the data into Microsoft Excel PivotTables, which allow users to display different values with point-click-drag techniques. The GUI was then created from the PivotTables using Visual Basic for Applications code. The GUI is run through a macro within Microsoft Excel and allows forecasters to quickly display and interpret peak wind climatology and likelihoods in a fast-paced operational environment. A summary of how the peak wind climatologies and probabilities were created and an overview of the GUI will be presented.
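The empirical part of such a climatology reduces to conditional exceedance frequencies. A minimal sketch with synthetic data (the real work used 1997-2004 EAFB tower records and fitted parametric probability density functions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Sketch with synthetic data: empirical probability that the 10-minute peak
# wind meets or exceeds a threshold, conditioned on the 2-minute average
# speed (real climatologies also stratify by month, hour and direction).

avg = rng.gamma(shape=4.0, scale=2.0, size=5000)              # 2-min avg, kt
peak = avg * rng.lognormal(mean=0.25, sigma=0.15, size=5000)  # 10-min peak, kt

def prob_peak_exceeds(avg_bin, threshold, avg_speeds, peak_speeds):
    """Empirical P(peak >= threshold | average speed in avg_bin)."""
    lo, hi = avg_bin
    in_bin = (avg_speeds >= lo) & (avg_speeds < hi)
    if not in_bin.any():
        return float("nan")
    return float(np.mean(peak_speeds[in_bin] >= threshold))

for bin_ in [(5, 10), (10, 15), (15, 20)]:
    p = prob_peak_exceeds(bin_, 20.0, avg, peak)
    print(f"P(peak >= 20 kt | avg in {bin_} kt) = {p:.3f}")
```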
Study of recreational land and open space using Skylab imagery
NASA Technical Reports Server (NTRS)
Sattinger, I. J. (Principal Investigator)
1975-01-01
The author has identified the following significant results. An analysis of the statistical uniqueness of each of the signatures of the Gratiot-Saginaw State Game Area was made by computing a matrix of probabilities of misclassification for all possible signature pairs. Within each data set, the 35 signatures were then aggregated into a smaller set of composite signatures by combining groups of signatures having high probabilities of misclassification. Computer separation of forest density classes was poor with multispectral scanner data collected on 5 August 1973. Signatures from the scanner data were further analyzed to determine the ranking of spectral channels for computer separation of the scene classes. Probabilities of misclassification were computed for composite signatures using four separate combinations of data source and channel selection.
Analysis and Assessment of Peak Lightning Current Probabilities at the NASA Kennedy Space Center
NASA Technical Reports Server (NTRS)
Johnson, D. L.; Vaughan, W. W.
1999-01-01
This technical memorandum presents a summary by the Electromagnetics and Aerospace Environments Branch at the Marshall Space Flight Center of lightning characteristics and lightning criteria for the protection of aerospace vehicles. Probability estimates are included for certain lightning strikes (peak currents of 200, 100, and 50 kA) applicable to the National Aeronautics and Space Administration Space Shuttle at the Kennedy Space Center, Florida, during rollout, on-pad, and boost/launch phases. Results of an extensive literature search to compile information on this subject are presented in order to answer key questions posed by the Space Shuttle Program Office at the Johnson Space Center concerning peak lightning current probabilities if a vehicle is hit by a lightning cloud-to-ground stroke. Vehicle-triggered lightning probability estimates for the aforementioned peak currents are still being worked. Section 4.5, however, does provide some insight on estimating these same peaks.
Spacecraft Robustness to Orbital Debris: Guidelines & Recommendations
NASA Astrophysics Data System (ADS)
Heinrich, S.; Legloire, D.; Tromba, A.; Tholot, M.; Nold, O.
2013-09-01
The ever increasing number of orbital debris has already led the space community to implement guidelines and requirements for "cleaner" and "safer" space operations, such as non-debris-generating missions and end-of-mission disposal, in order to keep protected orbits free of space junk. It is nowadays well known that man-made orbital debris impacts are a higher threat than natural micro-meteoroids and that recent events intentionally or accidentally generated so much new debris that a cascading chain effect, known as "the Kessler Syndrome", may be initiated, potentially jeopardizing the useful orbits. The main recommendation on satellite design is to demonstrate an acceptable Probability of Non-Penetration (PNP) with regard to the small (<5 cm) population of MMOD (Micro-Meteoroids and Orbital Debris). Compliance implies thinking about spacecraft robustness in terms of redundancies, segregations and shielding devices (as implemented in crewed missions, but in a more complex mass - cost - criticality trade-off). Consequently, the need is not only to demonstrate compliance with the PNP requirement but also the PNF (Probability of Non-Failure) per impact location on all parts of the vehicle, and to investigate the probabilities of the different fatal scenarios: loss of mission, loss of spacecraft (space environment critical) and spacecraft fragmentation (space environment catastrophic). The recent THALES experience on ESA Sentinel-3 of an increasing need for robustness has led the ALTRAN company to initiate an internal working group on these topics, whose conclusions may be attractive to their prime manufacturer customers. The intention of this paper is to present the status of this study:
* Regulations, requirements and tools available.
* Detailed FMECA studies dedicated specifically to the MMOD risks, with the introduction of new probability and criticality classification scales.
* Examples of design risk assessment with regard to the specific MMOD impact risks.
* Lessons learnt on robustness and survivability of systems (materials, shieldings, rules) coming from other industrial domains (automotive, military vehicles).
* Guidelines and recommendations implementable on satellite systems and mechanical architecture.
Electrodynamic Dust Shield for Space Applications
NASA Technical Reports Server (NTRS)
Mackey, Paul J.; Johansen, Michael R.; Olsen, Robert C.; Raines, Matthew G.; Phillips, James R., III; Cox, Rachel E.; Hogue, Michael D.; Pollard, Jacob R. S.; Calle, Carlos I.
2016-01-01
Dust mitigation technology has been highlighted by NASA and the International Space Exploration Coordination Group (ISECG) as a Global Exploration Roadmap (GER) critical technology need in order to reduce life cycle cost and risk, and increase the probability of mission success. The Electrostatics and Surface Physics Lab in Swamp Works at the Kennedy Space Center has developed an Electrodynamic Dust Shield (EDS) to remove dust from multiple surfaces, including glass shields and thermal radiators. Further development is underway to improve the operation and reliability of the EDS as well as to perform material and component testing outside of the International Space Station (ISS) on the Materials on International Space Station Experiment (MISSE). This experiment is designed to verify that the EDS can withstand the harsh environment of space and will look to closely replicate the solar environment experienced on the Moon.
Probabilities of good, marginal, and poor flying conditions for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Whiting, D. M.; Guttman, N. B.
1977-01-01
Empirical probabilities are provided for good, marginal, and poor flying weather for ferrying the Space Shuttle Orbiter from Edwards AFB, California, to Kennedy Space Center, Florida, and from Edwards AFB to Marshall Space Flight Center, Alabama. Results are given by month for each overall route plus segments of each route. The criteria for defining a day as good, marginal, or poor and the method of computing the relative frequencies and conditional probabilities for monthly reference periods are described.
Probable reasons for expressed agitation in persons with dementia.
Ragneskog, H; Gerdner, L A; Josefsson, K; Kihlgren, M
1998-05-01
Nursing home patients with dementia were videotaped in three previous studies. Sixty sequences of nine patients exhibiting agitated behaviors were examined to identify the most probable antecedents to agitation. Probable reasons were interpreted and applied to the Progressively Lowered Stress Threshold model, which suggests that agitation is stress related. Analysis suggests that agitation often serves as a form of communication. Two underlying reasons seem to be that the patient had lost control over the situation and lacked autonomy. The most common causes for expressed agitation were interpreted as discomfort, a wish to be served immediately, conflict between patients or with nursing staff, reactions to environmental noises or sounds, and invasion of personal space. It is recommended that nursing staff promote autonomy and independence for this group of patients whenever possible. By evaluating probable reasons for expressed agitation, the nursing staff can take steps to prevent or alleviate agitation.
Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future
NASA Technical Reports Server (NTRS)
Cates, Grant R.
2014-01-01
The Space Shuttle was launched 135 times and nearly half of those launches required 2 or more launch attempts. The Space Shuttle launch countdown historical data of 250 launch attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle launch countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions of future launch vehicles such as NASA's Space Shuttle derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars since the launch opportunities are relatively short in duration and one must wait for 2 years before a subsequent attempt can begin.
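Under the simplifying assumption (ours, not necessarily the paper's model) that attempts are independent with a fixed per-attempt success probability, the cumulative launch probability follows a geometric law; a minimal sketch:

```python
import numpy as np

# Sketch (independence is our simplifying assumption): with an empirical
# per-attempt launch probability p, the chance of having launched within k
# attempts is the geometric cumulative distribution 1 - (1 - p)^k.

p_launch = 135 / 250  # 135 launches in 250 recorded launch attempts
for k in np.arange(1, 6):
    print(f"P(launched within {k} attempt(s)) = {1 - (1 - p_launch) ** k:.3f}")
```

This kind of curve, built directly from the historical countdown record, is what makes the cumulative-probability planning for short Mars launch windows possible.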
Sadasivan, Chander; Brownstein, Jeremy; Patel, Bhumika; Dholakia, Ronak; Santore, Joseph; Al-Mufti, Fawaz; Puig, Enrique; Rakian, Audrey; Fernandez-Prada, Kenneth D; Elhammady, Mohamed S; Farhat, Hamad; Fiorella, David J; Woo, Henry H; Aziz-Sultan, Mohammad A; Lieber, Baruch B
2013-03-01
Endovascular coiling of cerebral aneurysms remains limited by coil compaction and associated recanalization. Recent coil designs which effect higher packing densities may be far from optimal because hemodynamic forces causing compaction are not well understood since detailed data regarding the location and distribution of coil masses are unavailable. We present an in vitro methodology to characterize coil masses deployed within aneurysms by quantifying intra-aneurysmal void spaces. Eight identical aneurysms were packed with coils by both balloon- and stent-assist techniques. The samples were embedded, sequentially sectioned and imaged. Empty spaces between the coils were numerically filled with circles (2D) in the planar images and with spheres (3D) in the three-dimensional composite images. The 2D and 3D void size histograms were analyzed for local variations and by fitting theoretical probability distribution functions. Balloon-assist packing densities (31±2%) were lower (p = 0.04) than the stent-assist group (40±7%). The maximum and average 2D and 3D void sizes were higher (p = 0.03 to 0.05) in the balloon-assist group as compared to the stent-assist group. None of the void size histograms were normally distributed; theoretical probability distribution fits suggest that the histograms are most probably exponentially distributed with decay constants of 6-10 mm. Significant (p ≤ 0.001 to p = 0.03) spatial trends were noted with the void sizes but correlation coefficients were generally low (|r| ≤ 0.35). The methodology we present can provide valuable input data for numerical calculations of hemodynamic forces impinging on intra-aneurysmal coil masses and be used to compare and optimize coil configurations as well as coiling techniques.
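A minimal sketch of the distribution-fitting step described above, with synthetic data standing in for the measured void sizes:

```python
import numpy as np
from scipy import stats

# Sketch: fit an exponential probability distribution to void-size data and
# test the fit, as in the histogram analysis described in the abstract.
# The data below are synthetic, not the paper's measurements.

rng = np.random.default_rng(3)
void_sizes = rng.exponential(scale=1 / 8.0, size=400)  # synthetic voids, mm

loc, scale = stats.expon.fit(void_sizes, floc=0.0)     # fix location at 0
decay_rate = 1.0 / scale                               # exponential decay rate
ks_stat, p_value = stats.kstest(void_sizes, "expon", args=(0.0, scale))
print(f"fitted decay rate = {decay_rate:.2f} per mm, KS p = {p_value:.3f}")
```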
Moments of the Particle Phase-Space Density at Freeze-out and Coincidence Probabilities
NASA Astrophysics Data System (ADS)
Bialas, A.; Czyż, W.; Zalewski, K.
2005-10-01
It is pointed out that the moments of phase-space particle density at freeze-out can be determined from the coincidence probabilities of the events observed in multiparticle production. A method to measure the coincidence probabilities is described and its validity examined.
Six new mechanics corresponding to further shape theories
NASA Astrophysics Data System (ADS)
Anderson, Edward
2016-02-01
In this paper, a suite of relational notions of shape is presented at the level of configuration space geometry, with corresponding new theories of shape mechanics and shape statistics. These further generalize two quite well known examples: (i) Kendall's (metric) shape space with his shape statistics and Barbour's mechanics thereupon. (ii) Leibnizian relational space alias metric scale-and-shape space, to which corresponds Barbour-Bertotti mechanics. This paper's new theories include, using the invariant and group namings, (iii) Angle alias conformal shape mechanics. (iv) Area ratio alias e shape mechanics. (v) Area alias e scale-and-shape mechanics. (iii)-(v) rest respectively on angle space, area-ratio space, and area space configuration spaces. Applications to probability and statistics are also outlined. (vi) Various supersymmetric counterparts of (i)-(v) are considered. Since supergravity differs considerably from GR-based conceptions of background independence, some of the new supersymmetric shape mechanics are compared with both. These reveal compatibility between supersymmetry and GR-based conceptions of background independence, at least within these simpler model arenas.
Probability Forecasting Using Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Duncan, M.; Frisbee, J.; Wysack, J.
2014-09-01
Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. So constructing a method for estimating how the collision probability will evolve improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
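A minimal sketch of the Monte Carlo forecasting idea (toy linear relative motion and invented covariances; operational conjunction assessment uses full orbital dynamics and correlated state uncertainties):

```python
import numpy as np

rng = np.random.default_rng(4)

# Sketch: sample perturbed relative states, propagate to the nominal time of
# closest approach, and estimate P(miss distance < hard-body radius). As the
# event nears and tracking shrinks the uncertainty, the forecast Pc changes.

def forecast_collision_probability(rel_pos, rel_vel, pos_sigma, t_ca,
                                   hard_body_radius, n_trials=100_000):
    """P(miss distance < hard-body radius) by Monte Carlo sampling."""
    pos_samples = rel_pos + rng.normal(0.0, pos_sigma, (n_trials, 3))
    miss = np.linalg.norm(pos_samples + rel_vel * t_ca, axis=1)
    return float(np.mean(miss < hard_body_radius))

rel_pos = np.array([-1000.0, 200.0, 50.0])  # m, relative position now
rel_vel = np.array([100.0, -20.0, -5.0])    # m/s, relative velocity
for sigma in (500.0, 200.0, 50.0):          # shrinking position uncertainty
    pc = forecast_collision_probability(rel_pos, rel_vel, sigma, t_ca=10.0,
                                        hard_body_radius=20.0)
    print(f"pos sigma = {sigma:5.0f} m -> Pc = {pc:.4f}")
```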
INFORMATION-THEORETIC INEQUALITIES ON UNIMODULAR LIE GROUPS
Chirikjian, Gregory S.
2010-01-01
Classical inequalities used in information theory such as those of de Bruijn, Fisher, Cramér, Rao, and Kullback carry over in a natural way from Euclidean space to unimodular Lie groups. These are groups that possess an integration measure that is simultaneously invariant under left and right shifts. All commutative groups are unimodular. And even in noncommutative cases unimodular Lie groups share many of the useful features of Euclidean space. The rotation and Euclidean motion groups, which are perhaps the most relevant Lie groups to problems in geometric mechanics, are unimodular, as are the unitary groups that play important roles in quantum computing. The extension of core information theoretic inequalities defined in the setting of Euclidean space to this broad class of Lie groups is potentially relevant to a number of problems relating to information gathering in mobile robotics, satellite attitude control, tomographic image reconstruction, biomolecular structure determination, and quantum information theory. In this paper, several definitions are extended from the Euclidean setting to that of Lie groups (including entropy and the Fisher information matrix), and inequalities analogous to those in classical information theory are derived and stated in the form of fifteen small theorems. In all such inequalities, addition of random variables is replaced with the group product, and the appropriate generalization of convolution of probability densities is employed. An example from the field of robotics demonstrates how several of these results can be applied to quantify the amount of information gained by pooling different sensory inputs. PMID:21113416
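To make the group-convolution idea concrete, here is a minimal finite-group analogue (an illustration of the general mechanism, not one of the paper's fifteen theorems verbatim): probability mass functions on the nonabelian, and hence trivially unimodular, symmetric group S3 are convolved with the group product replacing addition, and the entropy of the convolution is compared against that of its factors.

```python
import numpy as np
from itertools import permutations

# Elements of the symmetric group S3 (finite groups are unimodular).
elems = list(permutations(range(3)))
idx = {g: i for i, g in enumerate(elems)}

def mul(g, h):
    """Composition g o h of two permutations."""
    return tuple(g[h[i]] for i in range(3))

def inv(g):
    out = [0, 0, 0]
    for i, gi in enumerate(g):
        out[gi] = i
    return tuple(out)

def convolve(f1, f2):
    """(f1 * f2)(g) = sum_h f1(h) f2(h^{-1} g), counting measure on S3."""
    out = np.zeros(len(elems))
    for gi, g in enumerate(elems):
        out[gi] = sum(f1[idx[h]] * f2[idx[mul(inv(h), g)]] for h in elems)
    return out

def entropy(f):
    f = f[f > 0]
    return -np.sum(f * np.log(f))

rng = np.random.default_rng(0)
f1 = rng.random(6); f1 /= f1.sum()
f2 = rng.random(6); f2 /= f2.sum()
f12 = convolve(f1, f2)

print("convolution sums to 1:", np.isclose(f12.sum(), 1.0))
# Convolution is a mixture of translates, so on a unimodular group
# H(f1 * f2) >= max(H(f1), H(f2)).
print(f"H(f1)={entropy(f1):.3f}  H(f2)={entropy(f2):.3f}  H(f1*f2)={entropy(f12):.3f}")
```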
Cao, Youfang; Terebus, Anna; Liang, Jie
2016-01-01
The discrete chemical master equation (dCME) provides a general framework for studying stochasticity in mesoscopic reaction networks. Since its direct solution rapidly becomes intractable due to the increasing size of the state space, truncation of the state space is necessary for solving most dCMEs. It is therefore important to assess the consequences of state space truncations so errors can be quantified and minimized. Here we describe a novel method for state space truncation. By partitioning a reaction network into multiple molecular equivalence groups (MEGs), we truncate the state space by limiting the total molecular copy numbers in each MEG. We further describe a theoretical framework for analysis of the truncation error in the steady-state probability landscape using reflecting boundaries. By aggregating the state space based on the usage of a MEG and constructing an aggregated Markov process, we show that the truncation error of a MEG can be asymptotically bounded by the probability of states on the reflecting boundary of the MEG. Furthermore, truncating states of an arbitrary MEG will not undermine the estimated error of truncating any other MEGs. We then provide an overall error estimate for networks with multiple MEGs. To rapidly determine the appropriate size of an arbitrary MEG, we also introduce an a priori method to estimate the upper bound of its truncation error. This a priori estimate can be rapidly computed from reaction rates of the network, without the need of costly trial solutions of the dCME. As examples, we show results of applying our methods to the four stochastic networks of (1) the birth and death model, (2) the single gene expression model, (3) the genetic toggle switch model, and (4) the phage lambda bistable epigenetic switch model. We demonstrate how truncation errors and steady-state probability landscapes can be computed using different sizes of the MEG(s) and how the results validate our theories. Overall, the novel state space truncation and error analysis methods developed here can be used to ensure accurate direct solutions to the dCME for a large number of stochastic networks. PMID:27105653
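A minimal sketch of the boundary-probability idea on the simplest of the four examples, the birth and death model: the dCME is truncated at a maximum copy number with a reflecting boundary, the steady-state landscape is solved directly, and the probability on the boundary state is compared with the exact truncation error. Rates and the truncation size are illustrative, and the MEG aggregation machinery is omitted.

```python
import numpy as np
from scipy.linalg import null_space
from scipy.stats import poisson

k, gamma = 10.0, 1.0      # birth rate, per-molecule death rate (illustrative)
N = 25                    # truncation: copy numbers 0..N, reflecting boundary

# Rate matrix A of the truncated dCME, dp/dt = A p, columns summing to zero.
A = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        A[n + 1, n] += k           # birth n -> n+1
        A[n, n] -= k
    if n > 0:
        A[n - 1, n] += gamma * n   # death n -> n-1
        A[n, n] -= gamma * n

p = null_space(A)[:, 0]
p /= p.sum()              # steady-state landscape on the truncated space

# Without truncation the exact steady state is Poisson(k/gamma); the error on
# the kept states tracks the probability sitting on the reflecting boundary.
exact = poisson.pmf(np.arange(N + 1), k / gamma)
print("boundary probability p(N):        ", p[N])
print("max |p - Poisson| on kept states: ", np.abs(p - exact).max())
```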
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...
14 CFR 417.224 - Probability of failure analysis.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Probability of failure analysis. 417.224 Section 417.224 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION... phase of normal flight or when any anomalous condition exhibits the potential for a stage or its debris...
NASA Astrophysics Data System (ADS)
Liland, Kristian Hovde; Snipen, Lars
When a series of Bernoulli trials occur within a fixed time frame or limited space, it is often interesting to assess if the successful outcomes have occurred completely at random, or if they tend to group together. One example, in genetics, is detecting grouping of genes within a genome. Approximations of the distribution of successes are possible, but they become inaccurate for small sample sizes. In this article, we describe the exact distribution of time between random, non-overlapping successes in discrete time of fixed length. A complete description of the probability mass function, the cumulative distribution function, mean, variance and recurrence relation is included. We propose an associated test for the over-representation of short distances and illustrate the methodology through relevant examples. The theory is implemented in an R package including probability mass, cumulative distribution, quantile function, random number generator, simulation functions, and functions for testing.
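A small Monte Carlo sketch of the setting (the paper derives the exact distribution, with their implementation in the R package mentioned above): gaps between successes are tabulated under the completely-at-random null and compared with a hypothetical clustered pattern to flag over-represented short distances.

```python
import numpy as np

rng = np.random.default_rng(42)
n, p, reps = 1000, 0.02, 5000   # sequence length, success probability, replicates

def gaps(seq):
    """Distances between consecutive successes in a boolean sequence."""
    return np.diff(np.flatnonzero(seq))

# Null model: successes placed completely at random.
null_gaps = np.concatenate([gaps(rng.random(n) < p) for _ in range(reps)])

# A hypothetical observed pattern (e.g. gene positions along a genome),
# deliberately clustered for illustration.
observed = np.zeros(n, dtype=bool)
observed[[10, 14, 19, 500, 505, 900]] = True
obs_gaps = gaps(observed)

short = 10
print(f"null P(gap <= {short}): {np.mean(null_gaps <= short):.4f}")
print(f"observed gaps: {obs_gaps}, fraction short: {np.mean(obs_gaps <= short):.2f}")
```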
Probabilistic 3D data fusion for multiresolution surface generation
NASA Technical Reports Server (NTRS)
Manduchi, R.; Johnson, A. E.
2002-01-01
In this paper we present an algorithm for adaptive resolution integration of 3D data collected from multiple distributed sensors. The input to the algorithm is a set of 3D surface points and associated sensor models. Using a probabilistic rule, a surface probability function is generated that represents the probability that a particular volume of space contains the surface. The surface probability function is represented using an octree data structure; regions of space with samples of large covariance are stored at a coarser level than regions of space containing samples with smaller covariance. The algorithm outputs an adaptive resolution surface generated by connecting points that lie on the ridge of surface probability with triangles scaled to match the local discretization of space given by the octree. We present results from 3D data generated by scanning lidar and structure from motion.
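A deliberately simplified sketch of the fusion step on a flat grid (the paper uses an octree; the points, sensor noise levels, and the probed column below are invented): each 3D sample contributes a Gaussian kernel sized by its uncertainty, and the surface estimate is the ridge of the accumulated probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical input: 3D surface samples of a noisy plane z = 0.5, each with
# an isotropic per-point standard deviation from its sensor model.
pts = np.column_stack([rng.uniform(0, 1, 200),
                       rng.uniform(0, 1, 200),
                       0.5 + 0.01 * rng.standard_normal(200)])
sigmas = rng.uniform(0.01, 0.05, 200)

# Surface probability along one (x, y) column: a sum of Gaussian kernels,
# wider for less certain points (a flat-grid stand-in for the octree).
res = 32
zs = np.linspace(0, 1, res)
prob = np.zeros(res)
x0, y0 = 0.5, 0.5
for p, s in zip(pts, sigmas):
    d2 = (x0 - p[0])**2 + (y0 - p[1])**2 + (zs - p[2])**2
    prob += np.exp(-0.5 * d2 / s**2) / s**3

# The surface estimate in this column is the ridge (maximum) of probability.
print("estimated surface z at (0.5, 0.5):", zs[np.argmax(prob)])
```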
Excluding joint probabilities from quantum theory
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
NASA Technical Reports Server (NTRS)
Lambert, Winifred; Roeder, William
2007-01-01
This conference presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.
A Framework to Understand Extreme Space Weather Event Probability.
Jonas, Seth; Fronczyk, Kassandra; Pratt, Lucas M
2018-03-12
An extreme space weather event has the potential to disrupt or damage infrastructure systems and technologies that many societies rely on for economic and social well-being. Space weather events occur regularly, but extreme events are less frequent, with a small number of historical examples over the last 160 years. During the past decade, published works have (1) examined the physical characteristics of the extreme historical events and (2) discussed the probability or return rate of select extreme geomagnetic disturbances, including the 1859 Carrington event. Here we present initial findings on a unified framework approach to visualize space weather event probability, using a Bayesian model average, in the context of historical extreme events. We present disturbance storm time (Dst) probability (a proxy for geomagnetic disturbance intensity) across multiple return periods and discuss parameters of interest to policymakers and planners in the context of past extreme space weather events. We discuss the current state of these analyses, their utility to policymakers and planners, the current limitations when compared to other hazards, and several gaps that need to be filled to enhance space weather risk assessments. © 2018 Society for Risk Analysis.
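For a feel of the quantities involved (the numbers below are placeholders, not the paper's Bayesian-model-average results), a crude Poisson model turns an assumed count of extreme events in the historical record into exceedance probabilities over planning horizons:

```python
import numpy as np

# Assumed inputs: n_events Carrington-class storms inferred over a
# record_years-long observational record (illustrative values only).
n_events, record_years = 1, 160
rate = n_events / record_years          # events per year, crude estimate

for horizon in (10, 50, 100):
    p = 1.0 - np.exp(-rate * horizon)   # P(at least one event within horizon)
    print(f"P(extreme event within {horizon:3d} yr) ~ {p:.2f}")
```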
Geometrization of quantum physics
NASA Astrophysics Data System (ADS)
Ol'Khov, O. A.
2009-12-01
It is shown that the Dirac equation for a free particle can be considered as a description of a specific distortion of the Euclidean geometry of space (a space topological defect). This approach is based on the possibility of interpreting the wave function as a vector realizing a representation of the fundamental group of the closed topological space-time 4-manifold. Mass and spin appear to be topological invariants. Such a concept explains all the so-called “strange” properties of the quantum formalism: probabilities, wave-particle duality, nonlocal instantaneous correlation between noninteracting particles (the EPR paradox) and so on. Acceptance of the suggested geometrical concept means rejection of the atomistic concept, where all matter is considered to consist of ever smaller elementary particles. There are no particles a priori, before measurement: the notion of a particle appears as a result of the classical interpretation of the contact between a region of curved space and a device.
Modeling utilization distributions in space and time
Keating, K.A.; Cherry, S.
2009-01-01
W. Van Winkle defined the utilization distribution (UD) as a probability density that gives an animal's relative frequency of occurrence in a two-dimensional (x, y) plane. We extend Van Winkle's work by redefining the UD as the relative frequency distribution of an animal's occurrence in all four dimensions of space and time. We then describe a product kernel model estimation method, devising a novel kernel from the wrapped Cauchy distribution to handle circularly distributed temporal covariates, such as day of year. Using Monte Carlo simulations of animal movements in space and time, we assess estimator performance. Although not unbiased, the product kernel method yields models highly correlated (Pearson's r = 0.975) with true probabilities of occurrence and successfully captures temporal variations in density of occurrence. In an empirical example, we estimate the expected UD in three dimensions (x, y, and t) for animals belonging to each of two distinct bighorn sheep (Ovis canadensis) social groups in Glacier National Park, Montana, USA. Results show the method can yield ecologically informative models that successfully depict temporal variations in density of occurrence for a seasonally migratory species. Some implications of this new approach to UD modeling are discussed. © 2009 by the Ecological Society of America.
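A compact sketch of the product-kernel idea, assuming a Gaussian spatial kernel combined with the wrapped Cauchy temporal kernel named above; the bandwidths, data, and evaluation points are invented for illustration.

```python
import numpy as np

def wrapped_cauchy(theta, mu, rho):
    """Wrapped Cauchy kernel on the circle (theta, mu in radians)."""
    return (1 - rho**2) / (2 * np.pi * (1 + rho**2 - 2 * rho * np.cos(theta - mu)))

def ud_estimate(x, y, t_doy, grid_xy, grid_doy, h_xy=1.0, rho=0.9):
    """Product-kernel UD in (x, y, day-of-year): Gaussian in space,
    wrapped Cauchy in circular time. A sketch of the general idea only."""
    ang = 2 * np.pi * np.asarray(t_doy) / 365.0
    gx, gy = grid_xy[:, 0], grid_xy[:, 1]
    gd = 2 * np.pi * np.asarray(grid_doy) / 365.0
    dens = np.zeros(len(grid_xy))
    for xi, yi, ai in zip(x, y, ang):
        spatial = np.exp(-((gx - xi)**2 + (gy - yi)**2) / (2 * h_xy**2)) \
                  / (2 * np.pi * h_xy**2)
        dens += spatial * wrapped_cauchy(gd, ai, rho)
    return dens / len(x)

rng = np.random.default_rng(3)
x, y = rng.normal(0, 1, 300), rng.normal(0, 1, 300)
t = rng.integers(120, 180, 300)                    # relocations clustered in summer
grid = np.zeros((4, 2))                            # evaluate at the origin...
doys = np.array([30, 150, 240, 330])               # ...on four days of the year
print(ud_estimate(x, y, t, grid, doys).round(4))   # density peaks near day 150
```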
Data analysis using scale-space filtering and Bayesian probabilistic reasoning
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter
1991-01-01
This paper describes a program for the analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the mineral in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlations to infer qualitative features outperforms a domain-independent algorithm that does not.
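The scale-space step can be illustrated with one-dimensional Gaussian smoothing at increasing scales, locating curve extrema as sign changes of the first derivative (the contour tracking, grouping heuristics, and Bayes tree classifier are omitted; the synthetic curve is an invented stand-in for DTA data).

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(7)
t = np.linspace(0, 10, 1000)
# Synthetic DTA-like curve: two thermal events plus measurement noise.
curve = np.exp(-(t - 3)**2 / 0.05) - 0.6 * np.exp(-(t - 7)**2 / 0.1) \
        + 0.05 * rng.standard_normal(t.size)

# Scale-space image: smooth with Gaussians of increasing scale and locate
# zero crossings of the first derivative (peak/valley candidates). Spurious
# noise extrema die out as the scale grows, leaving the two real events.
for sigma in (2, 8, 32):
    smoothed = gaussian_filter1d(curve, sigma)
    d = np.diff(smoothed)
    crossings = np.flatnonzero(np.sign(d[:-1]) != np.sign(d[1:]))
    print(f"sigma={sigma:3d}: {len(crossings)} extrema, first few at t ~",
          np.round(t[crossings[:6]], 2))
```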
Sides, Marian B; Vernikos, Joan; Convertino, Victor A; Stepanek, Jan; Tripp, Lloyd D; Draeger, Jorg; Hargens, Alan R; Kourtidou-Papadeli, Chrysoula; Pavy-LeTraon, Anne; Russomano, Thais; Wong, Julielynn Y; Buccello, Regina R; Lee, Peter H; Nangalia, Vishal; Saary, M Joan
2005-09-01
Long-duration space missions, as well as emerging civilian tourist space travel activities, prompted review and assessment of data available to date focusing on cardiovascular risk and available risk mitigation strategies. The goal was the creation of tools for risk priority assessment, taking into account the probability of occurrence of an adverse cardiovascular event, the published literature and spaceflight data, and available risk mitigation strategies. An international group of scientists convened in Bellagio, Italy, in 2004 under the auspices of the Aerospace Medical Association to review the available literature for cardiac risks identified in the Bioastronautics Critical Path Roadmap (versions 2000, 2004). This effort led to the creation of a priority assessment framework to allow for an objective assessment of the hazard, the probability of its occurrence, the mission impact, and available risk mitigation measures. Spaceflight data are presented regarding evidence/no evidence of cardiac dysrhythmias, cardiovascular disease, and cardiac function, as well as orthostatic intolerance, exercise capacity, and peripheral resistance in presyncopal astronauts compared to non-presyncopal astronauts. Assessment of the priority of different countermeasures was achieved with a tabular framework focusing on probability of occurrence, mission impact, compliance, practicality, and effectiveness of countermeasures. Special operational settings and circumstances related to sensitive portions of any mission and the impact of environmental influences on mission effectiveness are addressed. The need for development of diagnostic tools, techniques, and countermeasure devices, food preparation and preservation technologies, and medication, as well as an infrastructure to support these operations, is stressed. Selected countermeasure options, including artificial gravity and pharmacological countermeasures, need to be systematically evaluated and validated in flight, especially after long-duration exposures. Data need to be collected regarding the emerging field of suborbital and orbital civilian space travel, to allow for sound risk assessment.
NASA Technical Reports Server (NTRS)
Soneira, R. M.; Bahcall, J. N.
1981-01-01
Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.
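As a toy version of the acquisition calculation (the star density, field area, and two-guide-star requirement below are assumptions for illustration, not the paper's numbers), a Poisson model gives the probability of finding enough suitable stars in the accessible field:

```python
from scipy.stats import poisson

# Assumed inputs: mean surface density of stars brighter than the limiting
# magnitude, and the usable fine-guidance field of view.
density = 50.0      # suitable stars per square degree (assumed)
area = 0.05         # accessible field area in square degrees (assumed)
lam = density * area

# Guiding is taken to need one primary and one secondary guide star, so we
# want the probability of at least two candidates in the field.
p_at_least_2 = 1.0 - poisson.cdf(1, lam)
print(f"lambda = {lam:.2f}, P(>=2 guide stars) = {p_at_least_2:.3f}")
```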
Target intersection probabilities for parallel-line and continuous-grid types of search
McCammon, R.B.
1977-01-01
The expressions for calculating the probability of intersection of hidden targets of different sizes and shapes for parallel-line and continuous-grid types of search can be formulated by using the concept of conditional probability. When the prior probability of the orientation of a hidden target is represented by a uniform distribution, the calculated posterior probabilities are identical with the results obtained by the classic methods of probability. For hidden targets of different sizes and shapes, the following generalizations about the probability of intersection can be made: (1) to a first approximation, the probability of intersection of a hidden target is proportional to the ratio of the greatest dimension of the target (viewed in plane projection) to the minimum line spacing of the search pattern; (2) the shape of the hidden target does not greatly affect the probability of intersection when the largest dimension of the target is small relative to the minimum spacing of the search pattern; (3) the probability of intersecting a target twice for a particular type of search can be used as a lower bound if there is an element of uncertainty of detection for a particular type of tool; (4) the geometry of the search pattern becomes more critical when the largest dimension of the target equals or exceeds the minimum spacing of the search pattern; (5) for elongate targets, the probability of intersection is greater for parallel-line search than for an equivalent continuous square-grid search when the largest dimension of the target is less than the minimum spacing of the search pattern, whereas the opposite is true when the largest dimension exceeds the minimum spacing; (6) the probability of intersection for nonorthogonal continuous-grid search patterns is not greatly different from the probability of intersection for the equivalent orthogonal continuous-grid pattern when the orientation of the target is unknown. The probability of intersection for an elliptically shaped target can be approximated by treating the ellipse as intermediate between a circle and a line. A search conducted along a continuous rectangular grid can be represented as intermediate between a search along parallel lines and along a continuous square grid. On this basis, an upper and lower bound for the probability of intersection of an elliptically shaped target for a continuous rectangular grid can be calculated. Charts have been constructed that permit the values for these probabilities to be obtained graphically. The use of conditional probability allows the explorationist greater flexibility in considering alternate search strategies for locating hidden targets. © 1977 Plenum Publishing Corp.
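A Monte Carlo check of generalizations (1) and (2) for an elliptical target and parallel-line search; the semi-axes and line spacing below are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(11)

def intersects_parallel_lines(a, b, spacing, n=200_000):
    """Monte Carlo P(hidden elliptical target crosses a parallel-line grid).
    Lines are x = k * spacing; the target has a uniformly random center
    offset and orientation."""
    theta = rng.uniform(0, np.pi, n)
    # Half-width of the ellipse projected onto the axis normal to the lines.
    half_w = np.sqrt((a * np.cos(theta))**2 + (b * np.sin(theta))**2)
    x = rng.uniform(0, spacing, n)           # center offset from nearest line
    hit = (x < half_w) | (spacing - x < half_w)
    return hit.mean()

a, b, s = 0.3, 0.1, 1.0                      # semi-axes and line spacing
p = intersects_parallel_lines(a, b, s)
# Generalization (1): p scales roughly with (greatest dimension)/(spacing).
print(f"P(intersect) = {p:.3f}; greatest-dimension ratio 2a/s = {2*a/s:.3f}")
```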
Temperature-resolved study of three [M(M'O4)4(TBPO)4] complexes (MM' = URe, ThRe, ThTc).
Helliwell, Madeleine; Collison, David; John, Gordon H; May, Iain; Sarsfield, Mark J; Sharrad, Clint A; Sutton, Andrew D
2006-02-01
The crystal structures of the title complexes were measured at several temperatures between room temperature and 100 K. Each sample shows reversible crystal-to-crystal phase transitions as the temperature is varied. The behaviour of [U(ReO4)4(TBPO)4] (I) and [Th(ReO4)4(TBPO)4] (II) (TBPO = tri-n-butylphosphine oxide) is very similar; at room temperature, crystals of (I) and (II) are isostructural, with space group I-42m, and reducing the temperature to 100 K causes a lowering of the space-group symmetry to C-centred cells, space groups Cc for (I) and Cmc2(1) for (II). The variation of lattice symmetry of [Th(TcO4)4(TBPO)4] (III) was found to be somewhat different, with the body-centred cubic space group I-43m occurring at 293 K, a reduction of symmetry at 230 K to the C-centred orthorhombic space group Cmc2(1), and a further transition to the primitive orthorhombic space group Pbc2(1) below 215 K. Elucidation of the correct space-group symmetry and the subsequent refinement was complicated in some cases by the twinning by pseudo-merohedry that arises from the lowering of the space-group symmetry occurring as the temperature is reduced. All three of the crystal structures determined at room temperature have high atomic displacement parameters, particularly of the nBu groups, and (III) shows disorder of some of the O atoms. The structures in the space group Cmc2(1) show some disorder of nBu groups, but are otherwise reasonably well ordered; the structures of (I) in Cc and (III) in Pbc2(1) are ordered, even to the ends of the alkyl chains. Inter-comparison of the structures measured below 293 K, using the program OFIT from the SHELXTL package, showed that generally they are remarkably alike, with weighted r.m.s. deviations of the M, M' and P atoms of less than 0.1 Å, as are the 293 K structures of (I) and (II) with their low-temperature counterparts. However, the structure of (III) measured in the space group Cmc2(1) is significantly different from both the structure of (III) at 293 K and that found below 215 K, with weighted r.m.s. deviations of the Th, Tc and P atoms of 0.40 and 0.37 Å, respectively. An extensive network of weak intra- and intermolecular C-H...O hydrogen bonds found between the atoms of the nBu and [M'O4] groups probably influences the packing and the overall geometry of the molecules.
Metabolic changes in rats subjected to space flight for 18.5 days in the biosatellite Cosmos 936
NASA Astrophysics Data System (ADS)
Németh, Š.; Macho, L.; Palkovič, M.; Škottová, N.; Tigranyan, R. A.
From an investigation of the activity of six glucocorticoid-dependent liver enzymes, the existence of a chronic, transient, stress-induced hypercorticosteronaemia during flight is probable. This hypercorticosteronaemia arises from weightlessness and induces gluconeogenesis. Weightlessness also caused substantial increases in liver glycogen level. The increased lipolytic activity and that of lipoprotein lipase in several groups of animals could be interpreted as enhancement of fat mobilization and utilization under the influence of stress. As this latter enhancement was also found in ground-based controls, it may have been due to the stress of handling rather than to space flight per se.
Urban green space and obesity in older adults: Evidence from Ireland.
Dempsey, Seraphim; Lyons, Seán; Nolan, Anne
2018-04-01
We examine the association between living in an urban area with more or less green space and the probability of being obese. This work involves the creation of a new dataset which combines geo-coded data at the individual level from the Irish Longitudinal Study on Ageing with green space data from the European Urban Atlas 2012. We find evidence suggestive of a U-shaped relationship between green space in urban areas and obesity; those living in areas with the lowest and highest shares of green space within a 1.6 km buffer zone have a higher probability of being classified as obese (BMI ≥ 30). The unexpected result that persons in areas with both the lowest and highest shares of green space have a higher probability of being obese than those in areas with intermediate shares suggests that other characteristics of urban areas may be mediating this relationship.
Simple gain probability functions for large reflector antennas of JPL/NASA
NASA Technical Reports Server (NTRS)
Jamnejad, V.
2003-01-01
Simple models for the patterns, as well as the cumulative gain probability and probability density functions, of the Deep Space Network antennas are developed. These are needed for the study and evaluation of interference from unwanted sources, such as the emerging terrestrial High Density Fixed Service, with the Ka-band receiving antenna systems at the Goldstone station of the Deep Space Network.
Investigating prior probabilities in a multiple hypothesis test for use in space domain awareness
NASA Astrophysics Data System (ADS)
Hardy, Tyler J.; Cain, Stephen C.
2016-05-01
The goal of this research effort is to improve Space Domain Awareness (SDA) capabilities of current telescope systems through improved detection algorithms. Ground-based optical SDA telescopes are often spatially under-sampled, or aliased. This fact negatively impacts the detection performance of traditionally proposed binary and correlation-based detection algorithms. A Multiple Hypothesis Test (MHT) algorithm has been previously developed to mitigate the effects of spatial aliasing. This is done by testing potential Resident Space Objects (RSOs) against several sub-pixel shifted Point Spread Functions (PSFs). An MHT has been shown to increase detection performance for the same false alarm rate. In this paper, the assumption of a priori probability used in an MHT algorithm is investigated. First, an analysis of the pixel decision space is completed to determine alternate hypothesis prior probabilities. These probabilities are then implemented into an MHT algorithm, and the algorithm is then tested against previous MHT algorithms using simulated RSO data. Results are reported with Receiver Operating Characteristic (ROC) curves and probability of detection, Pd, analysis.
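A stripped-down sketch of testing sub-pixel-shifted PSF hypotheses (uniform priors, a Gaussian PSF, and an arbitrary detection threshold are assumed here; the paper's prior analysis and ROC machinery are omitted):

```python
import numpy as np

rng = np.random.default_rng(5)

def psf_template(dx, dy, size=5, fwhm=2.0):
    """Unit-norm pixel grid of a Gaussian PSF at sub-pixel offset (dx, dy)."""
    s = fwhm / 2.355
    yy, xx = np.mgrid[:size, :size] - size // 2
    t = np.exp(-((xx - dx)**2 + (yy - dy)**2) / (2 * s**2))
    return t / np.linalg.norm(t)

# Hypothesis bank: the PSF shifted to a grid of sub-pixel positions.
shifts = [(dx, dy) for dx in (-0.25, 0, 0.25) for dy in (-0.25, 0, 0.25)]
bank = np.stack([psf_template(dx, dy) for dx, dy in shifts])

# Simulated 5x5 cutout: faint point source at (+0.25, -0.25) plus unit noise.
cutout = 3.0 * psf_template(0.25, -0.25) + rng.standard_normal((5, 5))

# Matched-filter statistic per hypothesis; the MHT declares a detection when
# the best hypothesis exceeds a threshold set by the desired false-alarm rate.
scores = np.tensordot(bank, cutout, axes=([1, 2], [0, 1]))
best = int(np.argmax(scores))
print("best shift hypothesis:", shifts[best], "score:", round(float(scores[best]), 2))
print("detected:", bool(scores[best] > 2.5))   # illustrative threshold
```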
Cluster-based control of a separating flow over a smoothly contoured ramp
NASA Astrophysics Data System (ADS)
Kaiser, Eurika; Noack, Bernd R.; Spohn, Andreas; Cattafesta, Louis N.; Morzyński, Marek
2017-12-01
The ability to manipulate and control fluid flows is of great importance in many scientific and engineering applications. The proposed closed-loop control framework addresses a key issue of model-based control: The actuation effect often results from slow dynamics of strongly nonlinear interactions which the flow reveals at timescales much longer than the prediction horizon of any model. Hence, we employ a probabilistic approach based on a cluster-based discretization of the Liouville equation for the evolution of the probability distribution. The proposed methodology frames high-dimensional, nonlinear dynamics into low-dimensional, probabilistic, linear dynamics which considerably simplifies the optimal control problem while preserving nonlinear actuation mechanisms. The data-driven approach builds upon a state space discretization using a clustering algorithm which groups kinematically similar flow states into a low number of clusters. The temporal evolution of the probability distribution on this set of clusters is then described by a control-dependent Markov model. This Markov model can be used as predictor for the ergodic probability distribution for a particular control law. This probability distribution approximates the long-term behavior of the original system on which basis the optimal control law is determined. We examine how the approach can be used to improve the open-loop actuation in a separating flow dominated by Kelvin-Helmholtz shedding. For this purpose, the feature space, in which the model is learned, and the admissible control inputs are tailored to strongly oscillatory flows.
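The pipeline of the cluster-based approach can be sketched in a few lines, assuming snapshots have already been reduced to a low-dimensional feature space (here a synthetic noisy limit cycle stands in for real flow data, and the control dependence is omitted):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Stand-in for flow snapshots: a noisy limit cycle in a 2-D feature space
# (e.g. the first two POD coefficients of the velocity field).
t = np.linspace(0, 40 * np.pi, 4000)
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.1 * rng.standard_normal((4000, 2))

# 1) Group kinematically similar states into a small number of clusters.
k = 6
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# 2) Estimate the cluster transition matrix of the induced Markov chain.
P = np.zeros((k, k))
for i, j in zip(labels[:-1], labels[1:]):
    P[i, j] += 1
P /= P.sum(axis=1, keepdims=True)

# 3) Ergodic (stationary) distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print("stationary cluster probabilities:", pi.round(3))
```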
Structural, microstructural and vibrational analyses of the monoclinic tungstate BiLuWO{sub 6}
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ait Ahsaine, H.; Taoufyq, A.; Institut Matériaux Microélectronique et Nanosciences de Provence, IM2NP, UMR CNRS 7334, Université de Toulon, BP 20132, 83957 La Garde Cedex
2014-10-15
The bismuth lutetium tungstate phase BiLuWO6 has been prepared using a solid state route with stoichiometric mixtures of oxide precursors. The obtained polycrystalline phase has been characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and Raman spectroscopy. In the first step, the crystal structure was refined using the Rietveld method: the crystal cell was resolved using the monoclinic system (parameters a, b, c, β) with space group A2/m. SEM images showed the presence of large crystallites with a constant local nominal composition (BiLuW). TEM analyses showed that the actual local structure could be better represented by a superlattice (a, 2b, c, β) associated with space groups P2 or P2/m. The Raman spectroscopy showed the presence of vibrational bands similar to those observed in the compounds BiREWO6 with RE=Y, Gd, Nd. However, these vibrational bands were characterized by large full width at half maximum, probably resulting from the long-range Bi/Lu disorder and local WO6 octahedron distortions in the structure. - Graphical abstract: The average structure of BiLuWO6 determined from X-ray diffraction data can be represented by the A2/m space group. Experimental electron diffraction patterns along the [0vw] zone axes of the monoclinic structure and associated simulated patterns show the existence of a monoclinic superstructure with space group P2 or P2/m. - Highlights: • A new monoclinic BiLuWO6 phase has been elaborated from solid-state reaction. • The space group of the monoclinic disordered average structure should be A2/m. • Transmission electron microscopy leads to a superlattice with P2/m space group. • Raman spectroscopy suggests existence of local disorder.
Combined-probability space and certainty or uncertainty relations for a finite-level quantum system
NASA Astrophysics Data System (ADS)
Sehrawat, Arun
2017-08-01
The Born rule provides a probability vector (distribution) for a measurement setting given a quantum state. For two settings, we have a pair of vectors from the same quantum state. Each pair forms a combined-probability vector that obeys certain quantum constraints, which are triangle inequalities in our case. Such a restricted set of combined vectors, called the combined-probability space, is presented here for a d-level quantum system (qudit). The combined space is a compact convex subset of a Euclidean space, and all its extreme points come from a family of parametric curves. Considering a suitable concave function on the combined space to estimate the uncertainty, we deliver an uncertainty relation by finding its global minimum on the curves for a qudit. If one chooses an appropriate concave (or convex) function, then there is no need to search for the absolute minimum (maximum) over the whole space; it will be on the parametric curves. So these curves are quite useful for establishing an uncertainty (or a certainty) relation for a general pair of settings. We also demonstrate that many known tight certainty or uncertainty relations for a qubit can be obtained with the triangle inequalities.
NASA Astrophysics Data System (ADS)
Papaioannou, Athanasios; Mavromichalaki, Helen; Souvatzoglou, George; Paschalis, Pavlos; Sarlanis, Christos; Dimitroulakos, John; Gerontidou, Maria
2013-04-01
High-energy particles released at the Sun during a solar flare or a very energetic coronal mass ejection result in a significant intensity increase in neutron monitor measurements, known as a Ground Level Enhancement (GLE). Due to their space weather impact (i.e., risks and failures in communication and navigation systems, spacecraft electronics and operations, space power systems, manned space missions, and commercial aircraft operations), it is crucial to establish a real-time operational system that can issue reliable and timely GLE Alerts. Currently, the Cosmic Ray group of the National and Kapodistrian University of Athens is working towards the establishment of a Neutron Monitor Service that will be made available via the Space Weather Portal operated by the European Space Agency (ESA), under the Space Situational Awareness (SSA) Program. To this end, a web interface providing data from multiple Neutron Monitor stations as well as an upgraded GLE Alert will be provided. Both services are now under testing and validation, and they will probably enter an operational phase next year. The core of this Neutron Monitor Service is the GLE Alert software; therefore, the main goal of this research effort is to upgrade the existing GLE Alert software, to minimize the probability of a false alarm, and to enhance the usability of the corresponding results. The ESA Neutron Monitor Service builds upon the infrastructure made available with the implementation of the High-Resolution Neutron Monitor Database (NMDB). In this work, the structure of the Neutron Monitor Service for the ESA SSA Program and the impact of the novel GLE Alert Service that will be made available to future users via the ESA SSA web portal are presented and further discussed.
Positive phase space distributions and uncertainty relations
NASA Technical Reports Server (NTRS)
Kruger, Jan
1993-01-01
In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.
Group Task Force on Satellite Rescue and Repair
NASA Astrophysics Data System (ADS)
1992-09-01
The Group Task Force was chartered by the Administrator of NASA to recommend 'a policy outlining the criteria, the design standards, and the pricing model to guide NASA in assessing the responsibilities for government and nongovernment Satellite Rescue and Repair Missions.' Criteria for accepting such missions, risks and benefits to all sectors of our economy involved in satellite services, adequacy of planning and training, and the impact on NASA's primary mission were reviewed. The Group began by asking a more fundamental question: is satellite rescue and repair a logical element of NASA's mission? Factors considered were: (1) the probability of rescue or repair opportunities arising; (2) the economic justification for such attempts; (3) the benefits to NASA, both from such ad hoc learning experiences in space operations and the impact on the public perception of NASA; (4) the effect of such unanticipated missions on NASA's scheduled activities; (5) any potential effect on NASA's technical capability to work in space; and (6) any potential effect on U.S. economic competitiveness.
NASA Astrophysics Data System (ADS)
D'silva, Oneil; Kerrison, Roger
2013-09-01
A key feature for the increased utilization of space robotics is to automate extra-vehicular manned space activities and thus significantly reduce the potential for catastrophic hazards while simultaneously minimizing the overall costs associated with manned space. The principal scope of the paper is to evaluate the use of industry-standard probabilistic risk/safety assessment (PRA/PSA) methodologies and hazard risk frequency criteria as a hazard control. This paper illustrates the applicability of combining the selected probabilistic risk assessment methodology and hazard risk frequency criteria, in order to apply the necessary safety controls that allow for the increased use of the Mobile Servicing System (MSS) robotic system on the International Space Station. This document considers factors such as component failure rate reliability, software reliability, periods of operation and dormancy, and fault tree analyses, and their effects on the probabilistic risk assessments. The paper concludes with suggestions for the incorporation of existing industry risk/safety plans to create an applicable safety process for future activities and programs.
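To illustrate the kind of calculation such an assessment combines (the gates, failure rates, and exposure time below are hypothetical placeholders, not MSS values), a two-gate fault tree composes per-component failure probabilities into a top-event probability:

```python
import math

# Minimal fault-tree sketch with hypothetical gates and rates:
# top event = (both redundant drive units fail) OR
#             (critical software fault AND operator error).

def p_or(*ps):
    """Probability of the union of independent events."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def p_and(*ps):
    """Probability of the intersection of independent events."""
    out = 1.0
    for p in ps:
        out *= p
    return out

mission_hours = 1000.0
rate_drive = 1e-5        # per-hour failure rate of one drive unit (assumed)
rate_software = 2e-6     # per-hour rate of a critical software fault (assumed)
p_drive = 1 - math.exp(-rate_drive * mission_hours)
p_sw = 1 - math.exp(-rate_software * mission_hours)
p_operator = 1e-3        # per-mission human error probability (assumed)

p_top = p_or(p_and(p_drive, p_drive), p_and(p_sw, p_operator))
print(f"P(top event over {mission_hours:.0f} h) = {p_top:.2e}")
```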
Interference in the classical probabilistic model and its representation in complex Hilbert space
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei Yu.
2005-10-01
The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as the Hilbert space projection of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy: conservation of probabilities. In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
Preprocessed Consortium for Neuropsychiatric Phenomics dataset.
Gorgolewski, Krzysztof J; Durnez, Joke; Poldrack, Russell A
2017-01-01
Here we present preprocessed MRI data of 265 participants from the Consortium for Neuropsychiatric Phenomics (CNP) dataset. The preprocessed dataset includes minimally preprocessed data in the native, MNI and surface spaces, accompanied by potential confound regressors, tissue probability masks, brain masks and transformations. In addition, the preprocessed dataset includes unthresholded group-level and single-subject statistical maps from all tasks included in the original dataset. We hope that the availability of this dataset will greatly accelerate research.
NASA Astrophysics Data System (ADS)
Koorehdavoudi, Hana; Bogdan, Paul
2016-06-01
Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that natural groups of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.
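The free-energy and missing-information quantities can be illustrated once states have been discretized (the state sequence below is synthetic, and the clustering and transition analysis steps are omitted):

```python
import numpy as np

rng = np.random.default_rng(9)

# Suppose each observed spatio-temporal state of the group has been assigned
# to one of K discrete states (e.g. by clustering alignment/cohesion features).
K = 5
state_seq = rng.choice(K, size=10_000, p=[0.05, 0.1, 0.15, 0.2, 0.5])

# Occupation probabilities and the corresponding free energy landscape
# F_i = -ln p_i (in units of kT); the most probable state sits lowest.
p = np.bincount(state_seq, minlength=K) / state_seq.size
F = -np.log(p)
print("state probabilities:", p.round(3))
print("free energies (kT): ", F.round(2))

# Missing information (Shannon entropy) of the collective state.
H = -np.sum(p * np.log(p))
print(f"missing information H = {H:.3f} nats")
```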
Short-term capture of the Earth-Moon system
NASA Astrophysics Data System (ADS)
Qi, Yi; de Ruiter, Anton
2018-06-01
In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose an STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.
Integrating resource selection information with spatial capture-recapture
Royle, J. Andrew; Chandler, Richard B.; Sun, Catherine C.; Fuller, Angela K.
2013-01-01
4. Finally, we find that SCR models using standard symmetric and stationary encounter probability models may not fully explain variation in encounter probability due to space usage, and therefore produce biased estimates of density when animal space usage is related to resource selection. Consequently, it is important that space usage be taken into consideration, if possible, in studies focused on estimating density using capture–recapture methods.
Exploring the Structure of Spatial Representations
Madl, Tamas; Franklin, Stan; Chen, Ke; Trappl, Robert; Montaldi, Daniela
2016-01-01
It has been suggested that the map-like representations that support human spatial memory are fragmented into sub-maps with local reference frames, rather than being unitary and global. However, the principles underlying the structure of these ‘cognitive maps’ are not well understood. We propose that the structure of the representations of navigation space arises from clustering within individual psychological spaces, i.e. from a process that groups together objects that are close in these spaces. Building on the ideas of representational geometry and similarity-based representations in cognitive science, we formulate methods for learning dissimilarity functions (metrics) characterizing participants’ psychological spaces. We show that these learned metrics, together with a probabilistic model of clustering based on the Bayesian cognition paradigm, allow prediction of participants’ cognitive map structures in advance. Apart from insights into spatial representation learning in human cognition, these methods could facilitate novel computational tools capable of using human-like spatial concepts. We also compare several features influencing spatial memory structure, including spatial distance, visual similarity and functional similarity, and report strong correlations between these dimensions and the grouping probability in participants’ spatial representations, providing further support for clustering in spatial memory. PMID:27347681
NASA Astrophysics Data System (ADS)
Mikami, Masato; Saputro, Herman; Seo, Takehiko; Oyagi, Hiroshi
2018-03-01
Stable operation of liquid-fueled combustors requires the group combustion of fuel spray. Our study employs a percolation approach to describe unsteady group-combustion excitation based on findings obtained from microgravity experiments on the flame spread of fuel droplets. We focus on droplet clouds distributed randomly in three-dimensional square lattices with a low-volatility fuel, such as n-decane in room-temperature air, where the pre-vaporization effect is negligible. We also focus on the flame spread in dilute droplet clouds near the group-combustion-excitation limit, where the droplet interactive effect is assumed negligible. The results show that the occurrence probability of group combustion decreases sharply as the mean droplet spacing increases past a specific value, which is termed the critical mean droplet spacing. If the lattice size is at least about ten times as large as the flame-spread limit distance, the flame-spread characteristics are similar to those over an infinitely large cluster. The number density of unburned droplets remaining after completion of burning attains a maximum around the critical mean droplet spacing. Therefore, the critical mean droplet spacing is a good index for stable combustion and unburned hydrocarbons. In the critical condition, the flame spreads through complicated paths, and thus the characteristic time scale of flame spread over droplet clouds has a very large value. The overall flame-spread rate of randomly distributed droplet clouds is almost the same as the flame-spread rate of a linear droplet array, except near the flame-spread limit.
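A site-percolation sketch of the occurrence probability (nearest-neighbor spread on a cubic lattice stands in for the flame-spread limit distance; the lattice size, occupation levels, and the "group combustion" fraction are illustrative choices, not the paper's model):

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(4)

def group_combustion_prob(occupation, size=20, trials=200, frac=0.5):
    """Fraction of random droplet clouds in which the cluster containing a
    randomly ignited droplet consumes at least `frac` of all droplets.
    Flame spread is modeled as 6-connected nearest-neighbor contact."""
    hits = 0
    for _ in range(trials):
        occ = rng.random((size, size, size)) < occupation
        lab, n = label(occ)                  # connected droplet clusters
        if n == 0:
            continue
        sites = np.argwhere(occ)
        ignite = tuple(sites[rng.integers(len(sites))])
        cluster = lab == lab[ignite]
        if cluster.sum() >= frac * occ.sum():
            hits += 1
    return hits / trials

# Mean droplet spacing grows as occupation falls; the occurrence probability
# drops sharply near the site-percolation threshold (~0.31 for this lattice).
for occ in (0.40, 0.32, 0.25):
    print(f"occupation {occ:.2f}: P(group combustion) ~ "
          f"{group_combustion_prob(occ):.2f}")
```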
Pollitz, F.F.; Schwartz, D.P.
2008-01-01
We construct a viscoelastic cycle model of plate boundary deformation that includes the effect of time-dependent interseismic strain accumulation, coseismic strain release, and viscoelastic relaxation of the substrate beneath the seismogenic crust. For a given fault system, time-averaged stress changes at any point (not on a fault) are constrained to zero; that is, kinematic consistency is enforced for the fault system. The dates of last rupture, mean recurrence times, and the slip distributions of the (assumed) repeating ruptures are key inputs into the viscoelastic cycle model. This simple formulation allows construction of stress evolution at all points in the plate boundary zone for purposes of probabilistic seismic hazard analysis (PSHA). Stress evolution is combined with a Coulomb failure stress threshold at representative points on the fault segments to estimate the times of their respective future ruptures. In our PSHA we consider uncertainties in a four-dimensional parameter space: the rupture periodicities, slip distributions, time of last earthquake (for prehistoric ruptures) and Coulomb failure stress thresholds. We apply this methodology to the San Francisco Bay region using a recently determined fault chronology of area faults. Assuming single-segment rupture scenarios, we find that future rupture probabilities of area faults in the coming decades are the highest for the southern Hayward, Rodgers Creek, and northern Calaveras faults. This conclusion is qualitatively similar to that of the Working Group on California Earthquake Probabilities, but the probabilities derived here are significantly higher. Given that fault rupture probabilities are highly model-dependent, no single model should be used to assess time-dependent rupture probabilities. We suggest that several models, including the present one, be used in a comprehensive PSHA methodology, as was done by the Working Group on California Earthquake Probabilities.
A stochastic differential equation model for the foraging behavior of fish schools.
Tạ, Tôn Việt; Nguyen, Linh Thi Hoai
2018-03-15
Constructing models of living organisms locating food sources has important implications for understanding animal behavior and for the development of distribution technologies. This paper presents a novel simple model of stochastic differential equations for the foraging behavior of fish schools in a space including obstacles. The model is studied numerically. Three configurations of space with various food locations are considered. In the first configuration, fish swim in free but limited space. All individuals can find food with large probability while keeping their school structure. In the second and third configurations, they move in limited space with one and two obstacles, respectively. Our results reveal that the probability of foraging success is highest in the first configuration, and smallest in the third one. Furthermore, when school size increases up to an optimal value, the probability of foraging success tends to increase. When it exceeds an optimal value, the probability tends to decrease. The results agree with experimental observations.
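A minimal Euler-Maruyama sketch in the spirit of this model (the drift terms, noise level, and success radius are invented, and no obstacles are included): each fish feels a pull toward the food location and toward the school centroid, plus Brownian noise, and a run succeeds if the school centroid reaches the food.

```python
import numpy as np

def forage(n_fish=8, T=50.0, dt=0.01, sigma=0.4, rng=None):
    """One stochastic run; returns True if the school centroid reaches the food."""
    rng = rng or np.random.default_rng()
    x = rng.normal(0.0, 1.0, (n_fish, 2))                 # initial positions
    food = np.array([6.0, 6.0])
    for _ in range(int(T / dt)):
        centroid = x.mean(axis=0)
        drift = 0.5 * (food - x) + 1.0 * (centroid - x)   # food pull + schooling pull
        x += drift * dt + sigma * np.sqrt(dt) * rng.normal(size=x.shape)
        if np.linalg.norm(x.mean(axis=0) - food) < 0.5:   # success radius (assumed)
            return True
    return False

rng = np.random.default_rng(2)
for n_fish in (2, 8, 32):
    p = np.mean([forage(n_fish=n_fish, rng=rng) for _ in range(100)])
    print(f"school of {n_fish}: P(success) ~ {p:.2f}")
```

Repeating runs across school sizes gives a crude estimate of how foraging-success probability varies with school size, the quantity studied above.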
NASA Astrophysics Data System (ADS)
Sarantopoulou, E.; Gomoiu, I.; Kollia, Z.; Cefalas, A. C.
2011-01-01
This work is part of the ESA/EU SURE project, which aims to quantify the survival probability of fungal spores in space under solar irradiation in the vacuum ultraviolet (VUV) (110-180 nm) spectral region. The contribution and impact of VUV photons, vacuum, low temperature and their synergies on the survival probability of Aspergillus terreus spores are measured at simulated space conditions on Earth. To simulate the solar VUV irradiation, the spores are irradiated with a continuous-discharge VUV hydrogen photon source and a molecular fluorine laser, at low and high photon intensities of 10^15 photons m^-2 s^-1 and 3.9×10^27 photons pulse^-1 m^-2 s^-1, respectively. The survival probability of spores is independent of the intensity and the fluence of photons, within certain limits, in agreement with previous studies. The spores are shielded by a thin carbon layer, which forms quickly on the external surface of the proteinaceous membrane at higher photon intensities at the start of the VUV irradiation. Extrapolating the results to space conditions, for an interplanetary direct transfer orbit from Mars to Earth, the spores will be irradiated with 3.3×10^21 solar VUV photons m^-2. This photon fluence is equivalent to the irradiation of spores on Earth with 54 laser pulses, with an experimental ~92% survival probability, disregarding the contribution of space vacuum and low temperature, or to continuous solar VUV irradiation for 38 days in space near the Earth, with an extrapolated ~61% survival probability. The experimental results indicate that the damage to spores is mainly from dehydration stress in vacuum. The high survival probability after 4 days in vacuum (~34%) is due to the exudation of proteins on the external membrane, which prevents further dehydration of the spores. In addition, the survival probability increases to ~54% at 10 K with 0.12 K/s cooling and heating rates.
Coincidence probability as a measure of the average phase-space density at freeze-out
NASA Astrophysics Data System (ADS)
Bialas, A.; Czyz, W.; Zalewski, K.
2006-02-01
It is pointed out that the average semi-inclusive particle phase-space density at freeze-out can be determined from the coincidence probability of the events observed in multiparticle production. The method of measurement is described and its accuracy examined.
Linking of uniform random polygons in confined spaces
NASA Astrophysics Data System (ADS)
Arsuaga, J.; Blackstone, T.; Diao, Y.; Karadayi, E.; Saito, M.
2007-03-01
In this paper, we study the topological entanglement of uniform random polygons in a confined space. We derive the formula for the mean squared linking number of such polygons. For a fixed simple closed curve in the confined space, we rigorously show that the linking probability between this curve and a uniform random polygon of n vertices is at least $1-O\left(\frac{1}{\sqrt{n}}\right)$. Our numerical study also indicates that the linking probability between two uniform random polygons (in a confined space), of m and n vertices respectively, is bounded below by $1-O\left(\frac{1}{\sqrt{mn}}\right)$. In particular, the linking probability between two uniform random polygons, both of n vertices, is bounded below by $1-O\left(\frac{1}{n}\right)$.
Statistical hydrodynamics and related problems in spaces of probability measures
NASA Astrophysics Data System (ADS)
Dostoglou, Stamatios
2017-11-01
A rigorous theory of statistical solutions of the Navier-Stokes equations, suitable for exploring Kolmogorov's ideas, has been developed by M.I. Vishik and A.V. Fursikov, culminating in their monograph "Mathematical problems of Statistical Hydromechanics." We review some progress made in recent years following this approach, with emphasis on problems concerning the correlation of velocities and corresponding questions in the space of probability measures on Hilbert spaces.
Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars
NASA Technical Reports Server (NTRS)
Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Licatta, Angelo; Griffin, Devon
2007-01-01
The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.
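A hedged sketch of the Monte Carlo logic described for the BFxRM, with placeholder distributions rather than NASA's values: sample a baseline bone strength, degrade it by a mission-length BMD loss, sample an activity-induced load, and declare fracture when the load exceeds the degraded strength.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
mission_days = 180.0
bmd_loss_rate = rng.normal(0.01, 0.003, n)      # fractional BMD loss per month (assumed)
strength0 = rng.normal(10_000.0, 1_500.0, n)    # baseline femoral strength, N (assumed)
strength = strength0 * (1.0 - bmd_loss_rate * mission_days / 30.0)
load = rng.lognormal(np.log(6_000.0), 0.3, n)   # fall-induced load, N (assumed)

p_fracture = np.mean(load > strength)
print(f"P(fracture | fall, {mission_days:.0f}-day mission) ~ {p_fracture:.4f}")
```

Re-running with longer mission lengths shows the monotone increase in fracture probability that the abstract reports.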
Williams, David M; Dechen Quinn, Amy C; Porter, William F
2014-01-01
Contacts between hosts are essential for transmission of many infectious agents. Understanding how contacts, and thus transmission rates, occur in space and time is critical to effectively responding to disease outbreaks in free-ranging animal populations. Contacts between animals in the wild are often difficult to observe or measure directly. Instead, one must infer contacts from metrics such as proximity in space and time. Our objective was to examine how contacts between white-tailed deer (Odocoileus virginianus) vary in space and among seasons. We used GPS movement data from 71 deer in central New York State to quantify potential direct contacts between deer and indirect overlap in space use across time and space. Daily probabilities of direct contact decreased from winter (0.05-0.14), to low levels post-parturition through summer (0.00-0.02), and increased during the rut to winter levels. The cumulative distribution for the spatial structure of direct and indirect contact probabilities around a hypothetical point of occurrence increased rapidly with distance for deer pairs separated by 1,000 m-7,000 m. Ninety-five percent of the probabilities of direct contact occurred among deer pairs within 8,500 m of one another, and 99% within 10,900 m. Probabilities of indirect contact accumulated across greater spatial extents: 95% at 11,900 m and 99% at 49,000 m. Contacts were spatially consistent across seasons, indicating that although contact rates differ seasonally, they occur proportionally across similar landscape extents. Distributions of contact probabilities across space can inform management decisions for assessing risk and allocating resources in response.
Searching for new young stars in the Northern hemisphere: the Pisces moving group
NASA Astrophysics Data System (ADS)
Binks, A. S.; Jeffries, R. D.; Ward, J. L.
2018-01-01
Using the kinematically unbiased technique described in Binks, Jeffries & Maxted (2015), we present optical spectra for a further 122 rapidly rotating (rotation periods <6 d), X-ray active FGK stars, selected from the SuperWASP survey. We identify 17 new examples of young, probably single stars with ages of <200 Myr and provide additional evidence for a new Northern hemisphere kinematic association: the Pisces moving group (MG). The group consists of 14 lithium-rich G- and K-type stars that have a dispersion of only ∼3 km s-1 in each Galactic space velocity coordinate. The group members are approximately coeval in the colour-magnitude diagram, with an age of 30-50 Myr, and have similar, though not identical, kinematics to the Octans-Near MG.
Dynamical Correspondence in a Generalized Quantum Theory
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
2015-05-01
In order to figure out why quantum physics needs the complex Hilbert space, many attempts have been made to distinguish the C*-algebras and von Neumann algebras in more general classes of abstractly defined Jordan algebras (JB- and JBW-algebras). One particularly important distinguishing property was identified by Alfsen and Shultz and is the existence of a dynamical correspondence. It reproduces the dual role of the selfadjoint operators as observables and generators of dynamical groups in quantum mechanics. In the paper, this concept is extended to another class of nonassociative algebras, arising from recent studies of the quantum logics with a conditional probability calculus and particularly of those that rule out third-order interference. The conditional probability calculus is a mathematical model of the Lüders-von Neumann quantum measurement process, and third-order interference is a property of the conditional probabilities which was discovered by Sorkin (Mod Phys Lett A 9:3119-3127, 1994) and which is ruled out by quantum mechanics. It is shown then that the postulates that a dynamical correspondence exists and that the square of any algebra element is positive still characterize, in the class considered, those algebras that emerge from the selfadjoint parts of C*-algebras equipped with the Jordan product. Within this class, the two postulates thus result in ordinary quantum mechanics using the complex Hilbert space or, vice versa, a genuine generalization of quantum theory must omit at least one of them.
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, the conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
Three sets of meteorological criteria were analyzed to determine the probabilities of favorable launch and landing conditions. Probabilities were computed for every 3 hours on a yearly basis using 14 years of weather data. These temporal probability distributions, applicable to the three sets of weather criteria encompassing benign, moderate and severe weather conditions, were computed for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also, for KSC, the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed have been computed so that mission probabilities may be more accurately computed for those time periods when persistence strongly correlates weather conditions. Moreover, the probabilities and conditional probabilities of the occurrence of both favorable and unfavorable events for each individual criterion were computed to indicate the significance of each weather element to the overall result.
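The launch/landing conditionals described here reduce to simple counting on a boolean time series. The sketch below uses synthetic persistent weather flags (the real study used 14 years of 3-hourly observations) to estimate the unconditional launch probability and the landing probability 24 hours after a favorable launch; the persistence values are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 14 * 365 * 8                                 # 14 years of 3-hourly flags
fav = np.empty(n, dtype=bool)
fav[0] = True
for i in range(1, n):
    # Persistence: favorable weather tends to stay favorable (synthetic values).
    fav[i] = rng.random() < (0.9 if fav[i - 1] else 0.3)

lag = 8                                          # landing 24 h after launch
p_launch = fav.mean()
p_landing_given_launch = fav[lag:][fav[:-lag]].mean()
print(f"P(favorable launch) ~ {p_launch:.2f}")
print(f"P(favorable landing 24 h later | favorable launch) ~ {p_landing_given_launch:.2f}")
```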
Álvarez, Yanaisis; Esteban-Torres, María; Acebrón, Iván; de las Rivas, Blanca; Muñoz, Rosario; Martínez-Ripoll, Martín; Mancheño, José M.
2011-01-01
Q88Y25_Lacpl is an esterase produced by the lactic acid bacterium Lactobacillus plantarum WCFS1 that shows amino-acid sequence similarity to carboxylesterases from the hormone-sensitive lipase family, in particular the AFEST esterase from the archaeon Archaeoglobus fulgidus and the hyperthermophilic esterase EstEI isolated from a metagenomic library. N-terminally His6-tagged Q88Y25_Lacpl has been overexpressed in Escherichia coli BL21 (DE3) cells, purified and crystallized at 291 K using the hanging-drop vapour-diffusion method. Mass spectrometry was used to determine the purity and homogeneity of the enzyme. Crystals of His6-tagged Q88Y25_Lacpl were prepared in a solution containing 2.8 M sodium acetate trihydrate pH 7.0. X-ray diffraction data were collected to 2.24 Å resolution on beamline ID29 at the ESRF. The apparent crystal point group was 422; however, initial global analysis of the intensity statistics (data processed with high symmetry in space group I422) and subsequent tests on data processed with low symmetry (space group I4) showed that the crystals were almost perfectly merohedrally twinned. Most probably, the true space group is I4, with unit-cell parameters a = 169.05, b = 169.05, c = 183.62 Å. PMID:22102251
Sato, Hiroshi
1984-08-01
Technical report. Let μ and μ1 be probability measures on a locally convex Hausdorff real topological linear space E. C.R. Baker [1] posed the …
Realistic Covariance Prediction for the Earth Science Constellation
NASA Technical Reports Server (NTRS)
Duncan, Matthew; Long, Anne
2006-01-01
Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
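The probability-density-integration component mentioned above is commonly formulated, in the short-encounter limit, as a 2-D Gaussian integral over the hard-body circle in the encounter plane. A sketch with made-up covariances, miss vector, and hard-body radius:

```python
import numpy as np

def collision_probability(miss, cov, hbr, n=400):
    """Integrate a 2-D Gaussian N(miss, cov) over a disk of radius hbr at the origin."""
    xs = np.linspace(-hbr, hbr, n)
    X, Y = np.meshgrid(xs, xs)
    inside = X**2 + Y**2 <= hbr**2                       # mask of the hard-body disk
    inv = np.linalg.inv(cov)
    d = np.stack([X - miss[0], Y - miss[1]], axis=-1)
    maha = np.einsum('...i,ij,...j->...', d, inv, d)     # Mahalanobis distance squared
    pdf = np.exp(-0.5 * maha) / (2 * np.pi * np.sqrt(np.linalg.det(cov)))
    cell = (xs[1] - xs[0]) ** 2
    return float((pdf * inside).sum() * cell)

cov1 = np.diag([200.0**2, 100.0**2])     # object 1 covariance in encounter plane, m^2
cov2 = np.diag([150.0**2, 150.0**2])     # object 2 covariance, m^2
pc = collision_probability(miss=np.array([300.0, 100.0]), cov=cov1 + cov2, hbr=20.0)
print(f"Pc ~ {pc:.2e}")
```

The covariances are summed because the relative-position uncertainty of two independent objects is the sum of their individual uncertainties, which is why realistic covariance inputs dominate the quality of the result.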
Armstrong, Graeme; Phillips, Ben
2012-01-01
Wildfire is a fundamental disturbance process in many ecological communities, and is critical in maintaining the structure of some plant communities. In the past century, changes in global land use practices have led to changes in fire regimes that have radically altered the composition of many plant communities. As the severe biodiversity impacts of inappropriate fire management regimes are recognized, attempts are being made to manage fires within a more 'natural' regime. To this end, the focus has typically been on determining the fire regime to which the community has adapted. Here we take a subtly different approach and focus on the probability of a patch being burnt. We hypothesize that competing sympatric taxa from different plant functional groups are able to coexist owing to the stochasticity of the fire regime, which creates opportunities in both time and space that are exploited differentially by each group. We exploit this situation to find the fire probability at which three sympatric grasses, from different functional groups, are able to co-exist. We do this by parameterizing a spatio-temporal simulation model with the life-history strategies of the three species and then searching for the fire frequency and scale at which they are able to coexist when in competition. The simulation gives a clear result: these species coexist only across a very narrow range of fire probabilities, centred at 0.2. Conversely, fire scale was found to be important only at very large scales. Our work demonstrates the efficacy of using competing sympatric species with different regeneration niches to determine the probability of fire in any given patch. Estimating this probability allows us to construct an expected historical distribution of fire return intervals for the community; a critical resource for managing fire-driven biodiversity in the face of a growing carbon economy and ongoing climate change. PMID:22363670
Pometti, Carolina L; Bessega, Cecilia F; Saidman, Beatriz O; Vilardi, Juan C
2014-03-01
Bayesian clustering as implemented in the STRUCTURE or GENELAND software is widely used to form genetic groups of populations or individuals. On the other hand, in order to satisfy the need for less computer-intensive approaches, multivariate analyses are specifically devoted to extracting information from large datasets. In this paper, we report the use of a dataset of AFLP markers from 15 sampling sites of Acacia caven for studying the genetic structure and comparing the consistency of three methods: STRUCTURE, GENELAND and DAPC. Of these methods, DAPC was the fastest and was accurate in inferring the number of populations K (K = 12 using the find.clusters option and K = 15 with a priori information on populations). GENELAND, in turn, provides information on the spatial distribution of membership probabilities for individuals or populations when coordinates are specified (K = 12). STRUCTURE also inferred the number of populations K and the membership probabilities of individuals based on ancestry, giving K = 11 without prior information on populations and K = 15 using the LOCPRIOR option. Finally, in this work all three methods showed high consistency in estimating the population structure, inferring similar numbers of populations and similar membership probabilities of individuals to each group, with high correlations among the methods.
Anvil Forecast Tool in the Advanced Weather Interactive Processing System, Phase II
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III
2008-01-01
Meteorologists from the 45th Weather Squadron (45 WS) and the Spaceflight Meteorology Group have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Flight Rules. As a result, the Applied Meteorology Unit (AMU) created a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) to indicate the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input.
Crystal structure of alpha poly-p-xylylene.
NASA Technical Reports Server (NTRS)
Kubo, S.; Wunderlich, B.
1971-01-01
A crystal structure of alpha poly-p-xylylene is proposed with the help of data from oriented crystals grown during polymerization. The unit cell is monoclinic with the parameters a = 8.57 Å, b = 10.62 Å, c = 6.54 Å (chain axis), and β = 101.3°. Four repeating units per cell lead to a calculated density of 1.185 g/cu cm and a packing density of 0.71. The probable space group is P2₁/m.
Spaced-retrieval effects on name-face recognition in older adults with probable Alzheimer's disease.
Hawley, Karri S; Cherry, Katie E
2004-03-01
Six older adults with probable Alzheimer's disease (AD) were trained to recall a name-face association using the spaced-retrieval method. We administered six training sessions over a 2-week period. On each trial, participants selected the target photograph from among eight other photographs and stated the target name, at increasingly longer retention intervals. Results yielded a positive effect of spaced-retrieval training for name-face recognition. All participants were able to select the target photograph and state the target's name for longer periods of time within and across training sessions. A live-person transfer task was administered to determine whether the name-face association, trained by spaced retrieval, would transfer to a live person. Half of the participants were able to call the live person by the correct name. These data provide initial evidence that spaced-retrieval training can aid older adults with probable AD in recalling a name-face association and in transferring that association to an actual person.
Guédon, Yann; d'Aubenton-Carafa, Yves; Thermes, Claude
2006-03-01
The most commonly used models for analysing local dependencies in DNA sequences are (high-order) Markov chains. Incorporating knowledge about possible groupings of the nucleotides makes it possible to define dedicated sub-classes of Markov chains. The problem of formulating lumpability hypotheses for a Markov chain is therefore addressed. In the classical approach to lumpability, this problem can be formulated as the determination of an appropriate state space (smaller than the original state space) such that the lumped chain defined on this state space retains the Markov property. We propose a different perspective on lumpability in which the state space is fixed and the partitioning of this state space is represented by a one-to-many probabilistic function within a two-level stochastic process. Three nested classes of lumped processes can be defined in this way as sub-classes of first-order Markov chains. These lumped processes enable parsimonious reparameterizations of Markov chains that help to reveal relevant partitions of the state space. Characterizations of the lumped processes in terms of the original transition probability matrix are derived. Different model selection methods relying either on hypothesis testing or on penalized log-likelihood criteria are presented, as well as extensions to lumped processes constructed from high-order Markov chains. The relevance of the proposed approach to lumpability is illustrated by the analysis of DNA sequences. In particular, the use of lumped processes makes it possible to highlight differences between intronic sequences and gene untranslated region sequences.
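For the classical notion of lumpability referenced above, there is a simple matrix test: a partition is (strongly) lumpable iff, for every target block, the row sums into that block are constant within each source block. A toy check on a nucleotide chain, with an invented transition matrix:

```python
import numpy as np

def is_lumpable(P, blocks, tol=1e-12):
    """Strong lumpability test for transition matrix P and a state partition."""
    for target in blocks:
        col = P[:, target].sum(axis=1)          # mass each state sends into this block
        for source in blocks:
            if np.ptp(col[source]) > tol:       # must be constant on each source block
                return False
    return True

# 4-state chain over nucleotides A, C, G, T; partition into purines/pyrimidines.
P = np.array([[0.4, 0.2, 0.3, 0.1],
              [0.1, 0.5, 0.2, 0.2],
              [0.4, 0.2, 0.3, 0.1],
              [0.1, 0.5, 0.2, 0.2]])
blocks = [[0, 2], [1, 3]]                       # {A, G}, {C, T}
print(is_lumpable(P, blocks))                   # True for this constructed P
```

The paper's two-level probabilistic formulation relaxes exactly this hard condition, replacing the deterministic state-to-block map with a one-to-many probabilistic function.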
NASA Astrophysics Data System (ADS)
Malpathak, Shreyas; Ma, Xinyou; Hase, William L.
2018-04-01
In a previous UB3LYP/6-31G* direct dynamics simulation, non-Rice-Ramsperger-Kassel-Marcus (RRKM) unimolecular dynamics was found for vibrationally excited 1,2-dioxetane (DO) [R. Sun et al., J. Chem. Phys. 137, 044305 (2012)]. In the work reported here, these dynamics are studied in more detail using the same direct dynamics method. Vibrational modes of DO were divided into 4 groups, based on their characteristic motions, and each group was excited with the same energy. To compare with the dynamics of these groups, an additional group of trajectories comprising a microcanonical ensemble was also simulated. The results of these simulations are consistent with the previous study. The dissociation probabilities, N(t)/N(0), for these excitation groups were all different. Groups A, B, and C, without initial excitation in the O-O stretch reaction coordinate, had a time lag t0 of 0.25-1.0 ps before the first dissociation occurred. Somewhat surprisingly, the C-H stretch Group A and out-of-plane motion Group C excitations had exponential dissociation probabilities after t0, with a rate constant ~2 times smaller than the anharmonic RRKM value. Groups B and D, with excitation of the H-C-H bend and wag, and ring bend and stretch modes, respectively, had bi-exponential dissociation probabilities. For Group D, with excitation localized in the reaction coordinate, the initial rate constant is ~7 times larger than the anharmonic RRKM value, substantial apparent non-RRKM dynamics. N(t)/N(0) for the random excitation trajectories was non-exponential, indicating intrinsic non-RRKM dynamics. For the trajectory integration time of 13.5 ps, 9% of these trajectories did not dissociate, in comparison to the RRKM prediction of 0.3%. Classical power spectra for these trajectories indicate that they have regular intramolecular dynamics. The N(t)/N(0) for the excitation groups are well described by a two-state coupled phase-space model. From the intercept of N(t)/N(0) with random excitation, the anharmonic correction to the RRKM rate constant is approximately a factor of 1.5.
Behavioral and biological interactions with small groups in confined microsocieties
NASA Technical Reports Server (NTRS)
Brady, J. V.; Emurian, H. H.
1982-01-01
Requirements for high levels of human performance in the unfamiliar and stressful environments associated with space missions necessitate the development of research-based technological procedures for maximizing the probability of effective functioning at all levels of personnel participation. Where the successful accomplishment of such missions requires the coordinated contributions of several individuals collectively identified with the achievement of a common objective, the conditions for characterizing a team, crew, or functional group are operationally defined. For the most part, studies of group performances under operational conditions which emphasize relatively long exposure to extended mission environments have been limited by the constraints imposed on experimental manipulations to identify critical effectiveness factors. On the other hand, laboratory studies involving relatively brief exposures to contrived task situations have been considered of questionable generality to operational settings requiring realistic group objectives.
ERIC Educational Resources Information Center
Cherry, Katie E.; Walvoord, Ashley A. G.; Hawley, Karri S.
2010-01-01
The authors trained 4 older adults with probable Alzheimer's disease to recall a name-face-occupation association using the spaced retrieval technique. Six training sessions were administered over a 2-week period. On each trial, participants selected a target photograph and stated the target name and occupation at increasingly longer retention…
Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment
NASA Technical Reports Server (NTRS)
Carpenter, James R.; Markley, F Landis
2014-01-01
This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
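The test reduces to comparing a running log-likelihood ratio against two thresholds set by the targeted false-alarm rate alpha and missed-detection rate beta. The sketch below uses generic Gaussian likelihoods rather than the paper's collision-probability densities, so the hypotheses and parameters are stand-ins:

```python
import numpy as np
from scipy.stats import norm

def sprt(samples, mu0, mu1, sigma, alpha=1e-4, beta=1e-3):
    """Wald SPRT: return 'H1' (e.g. maneuver), 'H0' (no action), or 'continue'."""
    A = np.log((1 - beta) / alpha)       # upper threshold: accept H1
    B = np.log(beta / (1 - alpha))       # lower threshold: accept H0
    llr = 0.0
    for x in samples:
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr >= A:
            return "H1"
        if llr <= B:
            return "H0"
    return "continue"                    # undecided within the data window

rng = np.random.default_rng(5)
print(sprt(rng.normal(1.0, 1.0, 200), mu0=0.0, mu1=1.0, sigma=1.0))  # typically 'H1'
```

The appeal in an operational timeline is that the decision usually arrives after far fewer samples than a fixed-sample test of the same error rates would require.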
NASA Technical Reports Server (NTRS)
Yunis, Isam S.; Carney, Kelly S.
1993-01-01
A new aerospace application of structural reliability techniques is presented, in which the applied forces depend on many probabilistic variables. This application is the plume impingement loading of the Space Station Freedom Photovoltaic Arrays. When the space shuttle berths with Space Station Freedom, it must brake and maneuver towards the berthing point using its primary jets. The jet exhaust, or plume, may cause high loads on the photovoltaic arrays. The many parameters governing this problem are highly uncertain and random. An approach using techniques from structural reliability, as opposed to the accepted deterministic methods, is presented that assesses the probability of failure of the array mast due to plume impingement loading. A Monte Carlo simulation of the berthing approach is used to determine the probability distribution of the loading. A probability distribution is also determined for the strength of the array. Structural reliability techniques are then used to assess the array mast design. These techniques are found to be superior to the standard deterministic dynamic transient analysis for this class of problem. The results show that the probability of failure of the current array mast design, during its 15-year life, is minute.
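The load-versus-strength comparison described here has a standard first-order shorthand when both distributions are modeled as normal: Pf = Phi(-beta) with beta = (mu_R - mu_L) / sqrt(sigma_R^2 + sigma_L^2). A sketch with invented numbers (not the study's load or strength distributions):

```python
from math import sqrt
from scipy.stats import norm

mu_R, sigma_R = 1200.0, 120.0     # mast strength, N·m (placeholder values)
mu_L, sigma_L = 700.0, 150.0      # plume-induced load, N·m (placeholder values)

# Reliability index: distance from the mean safety margin to failure, in
# standard deviations of the margin R - L.
beta = (mu_R - mu_L) / sqrt(sigma_R**2 + sigma_L**2)
print(f"reliability index beta = {beta:.2f}, Pf = {norm.cdf(-beta):.2e}")
```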
Impact of high-risk conjunctions on Active Debris Removal target selection
NASA Astrophysics Data System (ADS)
Lidtke, Aleksander A.; Lewis, Hugh G.; Armellin, Roberto
2015-10-01
Space debris simulations show that if current space launches continue unchanged, spacecraft operations might become difficult in the congested space environment. It has been suggested that Active Debris Removal (ADR) might be necessary in order to prevent such a situation. Selection of objects to be targeted by ADR is considered important because removal of non-relevant objects will unnecessarily increase the cost of ADR. One of the factors to be used in this ADR target selection is the collision probability accumulated by every object. This paper shows the impact of high-probability conjunctions on the collision probability accumulated by individual objects as well as the probability of any collision occurring in orbit. Such conjunctions cannot be predicted far in advance and, consequently, not all the objects that will be involved in such dangerous conjunctions can be removed through ADR. Therefore, a debris remediation method that would address such events at short notice, and thus help prevent likely collisions, is suggested.
Pinpointing clusters of apparently sporadic cases of Legionnaires' disease.
Bhopal, R. S.; Diggle, P.; Rowlingson, B.
1992-01-01
OBJECTIVES--To test the hypothesis that many non-outbreak cases of legionnaires' disease are not sporadic and to attempt to pinpoint cases clustering in space and time. DESIGN--Descriptive study of a case series, 1978-86. SETTING--15 health boards in Scotland. PATIENTS--203 probable cases of non-outbreak, non-travel, community acquired legionnaires' disease in patients resident in Scotland. MAIN MEASURES--Date of onset of disease and postcode and health board of residence of cases. RESULTS--Space-time clustering was present and numerous groups of cases were identified, all but two being newly recognised. Nine cases occurred during three months within two postcodes in Edinburgh, and an outbreak was probably missed. In several places cases occurred in one area over a prolonged period--for example, nine cases in postcode districts G11.5 and G12.8 in Glasgow during five years (estimated mean annual incidence of community acquired, non-outbreak, non-travel legionnaires' disease of 146 per million residents v 4.8 per million for Scotland). Statistical analysis showed that the space time clustering of cases in the Glasgow and Edinburgh areas was unusual (p = 0.036, p = 0.068 respectively). CONCLUSION--Future surveillance requires greater awareness that clusters can be overlooked; case searching whenever a case is identified; collection of complete information particularly of date of onset of the disease and address or postcode; ongoing analysis for space-time clustering; and an accurate yet workable definition of sporadic cases. Other researchers should re-examine their data on apparently sporadic infection. PMID:1586784
Survival probability of a truncated radial oscillator subject to periodic kicks
NASA Astrophysics Data System (ADS)
Tanabe, Seiichi; Watanabe, Shinichi; Saif, Farhan; Matsuzawa, Michio
2002-03-01
Classical and quantum survival probabilities are compared for a truncated radial oscillator undergoing impulsive interactions with periodic laser pulses represented here as kicks. The system is truncated in the sense that the harmonic potential is valid only within a finite range; the rest of the space is treated as a perfect absorber. Exploring extended values of the parameters of this model [Phys. Rev. A 63, 052721 (2001)], we supplement discussions of classical and quantum features near resonances. The classical system proves to be quasi-integrable and preserves phase-space area despite the momentum transferred by the kicks, exhibiting simple yet rich phase-space features. A geometrical argument reveals quantum-classical correspondence in the locations of minima in the paired survival probabilities, while the "ionization" rates differ due to quantum tunneling.
NASA Technical Reports Server (NTRS)
Massey, J. L.
1976-01-01
Virtually all previously suggested rate 1/2 binary convolutional codes with KE = 24 are compared. Their distance properties are given, and their performance, both in computation and in error probability, with sequential decoding on the deep-space channel is determined by simulation. Recommendations are made both for the choice of a specific KE = 24 code and for codes to be included in future coding standards for the deep-space channel. A new result given in this report is a method for determining the statistical significance of error probability data when the error probability is so small that it is not feasible to perform enough decoding simulations to obtain more than a very small number of decoding errors.
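For the small-error-count problem raised in the last sentence, one standard construction (not necessarily the report's method) is the exact one-sided Clopper-Pearson bound on the error probability from the observed number of decoding errors:

```python
from scipy.stats import beta

def upper_bound(errors, trials, conf=0.95):
    """One-sided Clopper-Pearson upper bound on an error probability."""
    if errors == trials:
        return 1.0
    return beta.ppf(conf, errors + 1, trials - errors)

# With zero observed errors this reproduces the familiar "rule of three": ~3/n.
print(f"0 errors in 1e6 trials: p <= {upper_bound(0, 10**6):.2e} (95%)")
print(f"2 errors in 1e6 trials: p <= {upper_bound(2, 10**6):.2e} (95%)")
```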
ERIC Educational Resources Information Center
Maher, Nicole; Muir, Tracey
2014-01-01
This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…
Reducing the Risk of Human Space Missions with INTEGRITY
NASA Technical Reports Server (NTRS)
Jones, Harry W.; Dillon-Merill, Robin L.; Tri, Terry O.; Henninger, Donald L.
2003-01-01
The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology to decompose the system into subsystems and components, to quantify the failure risk as a function of the design elements and their corresponding probability of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priority. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.
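A toy version of the PRA decomposition described, with invented component probabilities and an independence assumption: subsystem failure probabilities combine with series logic (the system fails if any element fails) and parallel logic (a redundant group fails only if all units fail).

```python
def series(*ps):
    """System fails if ANY element fails (independent components)."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def parallel(*ps):
    """Redundant group fails only if ALL units fail (independent components)."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# All numbers below are illustrative placeholders, not INTEGRITY values.
life_support = parallel(1e-3, 1e-3)          # two redundant strings
propulsion = 5e-4
power = series(2e-4, parallel(1e-2, 1e-2))   # bus in series with redundant batteries
mission_loss = series(life_support, propulsion, power)
print(f"P(mission loss) ~ {mission_loss:.2e}")
```

In a full PRA each input probability would itself carry an uncertainty distribution, propagated (e.g., by Monte Carlo) to display the uncertainty surrounding the system-level estimate.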
NASA Technical Reports Server (NTRS)
Lambert, Winfred; Wheeler, Mark; Roeder, William
2005-01-01
The 45th Weather Squadron (45 WS) at Cape Canaveral Air Force Station (CCAFS) in Florida issues a probability of lightning occurrence in their daily 24-hour and weekly planning forecasts. This information is used for general planning of operations at CCAFS and Kennedy Space Center (KSC). These facilities are located in east-central Florida at the east end of a corridor known as 'Lightning Alley', an indication that lightning has a large impact on space-lift operations. Much of the current lightning probability forecast is based on a subjective analysis of model and observational data and an objective forecast tool developed over 30 years ago. The 45 WS requested that a new lightning probability forecast tool, based on statistical analysis of more recent historical warm-season (May-September) data, be developed in order to increase the objectivity of the daily thunderstorm probability forecast. The resulting tool is a set of statistical lightning forecast equations, one for each month of the warm season, that provide a lightning occurrence probability for the day by 1100 UTC (0700 EDT) during the warm season.
Space Object Collision Probability via Monte Carlo on the Graphics Processing Unit
NASA Astrophysics Data System (ADS)
Vittaldev, Vivek; Russell, Ryan P.
2017-09-01
Fast and accurate collision probability computations are essential for protecting space assets. Monte Carlo (MC) simulation is the most accurate but computationally intensive method. A Graphics Processing Unit (GPU) is used to parallelize the computation and reduce the overall runtime. Using MC techniques to compute the collision probability is common in the literature as the benchmark. An optimized implementation on the GPU, however, is a challenging problem and is the main focus of the current work. The MC simulation takes samples from the uncertainty distributions of the Resident Space Objects (RSOs) at any time during a time window of interest and outputs the separations at closest approach. Therefore, any uncertainty propagation method may be used and the collision probability is automatically computed as a function of RSO collision radii. Integration using a fixed time step and a quartic interpolation after every Runge-Kutta step ensures that no close approaches are missed. Two orders of magnitude speedups over a serial CPU implementation are shown, and speedups improve moderately with higher-fidelity dynamics. The tool makes the MC approach tractable on a single workstation, and can be used as a final product, or for verifying surrogate and analytical collision probability methods.
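A CPU-bound, vectorized stand-in for the GPU computation described (NumPy broadcasting in place of CUDA; the means, covariances, and radii are made up, and propagation to closest approach is skipped): sample both objects' positions and count samples whose separation falls below the combined collision radius.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 2_000_000
mean1, mean2 = np.zeros(3), np.array([150.0, 40.0, 0.0])     # positions at TCA, m
cov1 = np.diag([120.0, 80.0, 60.0]) ** 2                     # 1-sigma uncertainties, m
cov2 = np.diag([100.0, 100.0, 50.0]) ** 2
r_combined = 15.0                                            # combined collision radius, m

x1 = rng.multivariate_normal(mean1, cov1, n)
x2 = rng.multivariate_normal(mean2, cov2, n)
hits = np.linalg.norm(x1 - x2, axis=1) < r_combined
pc = hits.mean()
print(f"Pc ~ {pc:.1e} +/- {hits.std(ddof=1) / np.sqrt(n):.1e}")
```

The standard error reported alongside the estimate shows why MC needs so many samples at small Pc, and hence why the parallel GPU implementation pays off.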
A proposed physical analog for a quantum probability amplitude
NASA Astrophysics Data System (ADS)
Boyd, Jeffrey
What is the physical analog of a probability amplitude? All quantum mathematics, including quantum information, is built on amplitudes. Every other science uses probabilities; QM alone uses their square root. Why? This question has been asked for a century, but no one previously has proposed an answer. We will present cylindrical helices moving toward a particle source, which particles follow backwards. Consider Feynman's book QED. He speaks of amplitudes moving through space like the hand of a spinning clock. His hand is a complex vector. It traces a cylindrical helix in Cartesian space. The Theory of Elementary Waves reverses the direction, so that Feynman's clock faces move toward the particle source. Particles follow amplitudes (quantum waves) backwards. This contradicts wave-particle duality. We will present empirical evidence that wave-particle duality is wrong about the direction of particles versus waves. This involves a paradigm shift, and paradigm shifts are always controversial. We believe that our model is the only proposal ever made for the physical foundations of probability amplitudes. We will show that our "probability amplitudes" in physical nature form a Hilbert vector space with adjoints and an inner product, and support both linear algebra and Dirac notation.
Neutron diffraction studies of some rare earth-transition metal deuterides
NASA Astrophysics Data System (ADS)
James, W. J.
1984-04-01
Neutron diffraction studies of the ternary alloy system Y6(Fe1-xMnx)23 reveal that the unusual magnetic behavior upon substitution of Mn or Fe into the end members is a consequence of atomic ordering in which there is a strong site preference of Mn for the f2 sites and of Fe for the f1 sites. In the Mn-rich compositions, Fe is found to have no spontaneous moments; therefore, the long-range magnetic ordering arises solely from Mn-Mn interactions. Upon substitution of Mn into the Fe-rich ternaries, the Fe moments are considerably reduced. Neutron diffraction studies of Y6Mn23D23 show that a transition occurs below 180 K from an fcc structure to a primitive tetragonal structure, space group P4/mmm, with the onset of antiferromagnetic ordering. The Mn moments are directed along the c-axis. The transition probably results from atomic ordering of the D atoms at low temperature, which induces c-axis magnetic ordering. The question of the appropriate space group of LaNi4.5Al0.5D4.5, P6/mmm or P3/m, has been resolved by a careful refinement and analysis of neutron diffraction data. The preferred space group is P6/mmm. Neutron powder diffraction and thermal magnetization measurements on small single crystals of ErNi3, ErCo3, and ErFe3 (space group R3m) show that the magnetocrystalline properties are a consequence of competing local site anisotropies between the two non-equivalent crystallographic sites of Er and two of the three non-equivalent sites of the 3d transition metal.
Eye light flashes on the Mir space station.
Avdeev, S; Bidoli, V; Casolino, M; De Grandis, E; Furano, G; Morselli, A; Narici, L; De Pascale, M P; Picozza, P; Reali, E; Sparvoli, R; Boezio, M; Carlson, P; Bonvicini, W; Vacchi, A; Zampa, N; Castellini, G; Fuglesang, C; Galper, A; Khodarovich, A; Ozerov, Yu; Popov, A; Vavilov, N; Mazzenga, G; Ricci, M; Sannita, W G; Spillantini, P
2002-04-01
The phenomenon of light flashes (LF) in the eyes of people in space has been investigated onboard Mir. Data on particles hitting the eye have been collected with the SilEye detectors and correlated with human observations. It is found that a nucleus in the radiation environment of Mir has roughly a 1% probability of causing an LF, whereas the probability for a proton is almost three orders of magnitude smaller. As a function of LET, the LF probability increases above 10 keV/micrometer, reaching about 5% at around 50 keV/micrometer. © 2002 Elsevier Science Ltd. All rights reserved.
ERIC Educational Resources Information Center
Cherry, Katie E.; Hawley, Karri S.; Jackson, Erin M.; Boudreaux, Emily O.
2009-01-01
Six older adults with probable Alzheimer's disease (AD) were trained to recall a name-face association using the spaced retrieval technique. In this study, we retested these persons in a 6-month follow-up program. For half of the participants, three booster sessions were administered at 6, 12, and 18 weeks after original training to promote…
McMeekin, D. Scott; Tritchler, David L.; Cohn, David E.; Mutch, David G.; Lankes, Heather A.; Geller, Melissa A.; Powell, Matthew A.; Backes, Floor J.; Landrum, Lisa M.; Zaino, Richard; Broaddus, Russell D.; Ramirez, Nilsa; Gao, Feng; Ali, Shamshad; Darcy, Kathleen M.; Pearl, Michael L.; DiSilvestro, Paul A.; Lele, Shashikant B.
2016-01-01
Purpose: The clinicopathologic significance of mismatch repair (MMR) defects in endometrioid endometrial cancer (EEC) has not been definitively established. We undertook tumor typing to classify MMR defects to determine if MMR status is prognostic or predictive. Methods: Primary EECs from NRG/GOG0210 patients were assessed for microsatellite instability (MSI), MLH1 methylation, and MMR protein expression. Each tumor was assigned to one of four MMR classes: normal, epigenetic defect, probable mutation (MMR defect not attributable to MLH1 methylation), or MSI-low. The relationships between MMR classes and clinicopathologic variables were assessed using contingency table tests and Cox proportional hazard models. Results: A total of 1,024 tumors were assigned to MMR classes. Epigenetic and probable mutations in MMR were significantly associated with higher grade and more frequent lymphovascular space invasion. Epigenetic defects were more common in patients with higher International Federation of Gynecology and Obstetrics stage. Overall, there were no differences in outcomes. Progression-free survival was, however, worse for women whose tumors had epigenetic MMR defects compared with the MMR normal group (hazard ratio, 1.37; P < .05; 95% CI, 1.00 to 1.86). An exploratory analysis of interaction between MMR status and adjuvant therapy showed a trend toward improved progression-free survival for probable MMR mutation cases. Conclusion: MMR defects in EECs are associated with a number of well-established poor prognostic indicators. Women with tumors that had MMR defects were likely to have higher-grade cancers and more frequent lymphovascular space invasion. Surprisingly, outcomes in these patients were similar to patients with MMR normal tumors, suggesting that MMR defects may counteract the effects of negative prognostic factors. Altered immune surveillance of MMR-deficient tumors, and other host/tumor interactions, is likely to determine outcomes for patients with MMR-deficient tumors. PMID:27325856
Communicating the Threat of a Tropical Cyclone to the Eastern Range
NASA Technical Reports Server (NTRS)
Winters, Katherine A.; Roeder, William P.; McAleenan, Mike; Belson, Brian L.; Shafer, Jaclyn A.
2012-01-01
The 45th Weather Squadron (45 WS) has developed a tool to help visualize the Wind Speed Probability product from the National Hurricane Center (NHC) and to help communicate that information to space launch customers and decision makers at the 45th Space Wing (45 SW) and Kennedy Space Center (KSC), located in east-central Florida. This paper reviews previous work and presents the new visualization tool, including initial feedback as well as the pros and cons. The NHC began issuing their Wind Speed Probability product for tropical cyclones publicly in 2006. The 45 WS uses this product to provide a threat assessment to 45 SW and KSC leadership for risk evaluations with an approaching tropical cyclone. Although the wind speed probabilities convey the uncertainty of a tropical cyclone well, communicating this information to customers is a challenge. The 45 WS continually strives to provide the wind speed probability information to customers in a context that clearly communicates the threat of a tropical cyclone. First, an intern from the Florida Institute of Technology (FIT) Atmospheric Sciences department, sponsored by Scitor Corporation, independently evaluated the NHC wind speed probability product. This work was later extended into an M.S. thesis at FIT, partially funded by Scitor Corporation and KSC. A second thesis at FIT further extended the evaluation, partially funded by KSC. Using this analysis, the 45 WS categorized the probabilities into five probability interpretation categories: Very Low, Low, Moderate, High, and Very High. These categories convert the forecast probability and forecast interval into easily understood categories that are consistent across all ranges of probabilities and forecast intervals. As a follow-on project, KSC funded a summer intern to evaluate the human factors of the probability interpretation categories, which ultimately refined some of the thresholds. The 45 WS created a visualization tool to express the timing and risk for multiple locations in a single graphic. Preliminary results of an ongoing project by FIT will be included in this paper. This project is developing a new method of assigning the probability interpretation categories and updating the evaluation of the performance of the NHC wind speed probability analysis.
Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta
2017-02-15
Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one-dimensional projections, heatmaps of distributions over 2D projections enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.
Granchi, Simona; Vannacci, Enrico; Biagi, Elena
2017-04-22
To evaluate the capability of the HyperSPACE (Hyper SPectral Analysis for Characterization in Echography) method in tissue characterization, in order to provide information for the laser treatment of benign thyroid nodules beyond what conventional B-mode imaging and elastography offer. The method, based on the spectral analysis of the raw radiofrequency ultrasonic signal, was applied to characterize the nodule before and after laser treatment. Thirty patients (25 females and 5 males, aged between 37 and 81 years) with a benign thyroid nodule at cytology (Thyr 2) were evaluated by conventional ultrasonography, elastography, and HyperSPACE, before and after laser ablation. The images processed by HyperSPACE exhibit different color distributions that correspond to different tissue features. By calculating the percentages of the color coverages, the analysed nodules were subdivided into 3 groups. Each nodule belonging to the same group experienced, on average, similar necrosis extension. The nodules exhibit different Configuration (color) distributions that could be indicative of the response of nodular tissue to the laser treatment. Conclusions: HyperSPACE can characterize benign nodules by providing additional information with respect to conventional ultrasound and elastography, which is useful for supporting the laser treatment of nodules in order to increase the probability of success.
Conservative Analytical Collision Probabilities for Orbital Formation Flying
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2004-01-01
The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
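The abstract does not reproduce the approximation itself, but the flavor of a conservative bound can be sketched: bounding the encounter-plane Gaussian density by its global maximum gives P ≤ πR² · 1/(2π√det Σ) = R²/(2√det Σ), a crude but rigorous upper bound on the collision probability. The covariance values below are hypothetical.

    # Generic conservative bound on collision probability in the encounter plane,
    # shown only for illustration; not necessarily the paper's approximation.
    import numpy as np

    def collision_prob_bound(cov, radius):
        """Crude bound: P <= pi*R^2 * max pdf = R^2 / (2*sqrt(det(cov)))."""
        det = np.linalg.det(cov)
        return min(1.0, radius**2 / (2.0 * np.sqrt(det)))

    cov = np.diag([50.0**2, 200.0**2])   # hypothetical 2D miss covariance, m^2
    print(collision_prob_bound(cov, radius=10.0))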
(abstract) Infrared Cirrus and Future Space Based Astronomy
NASA Technical Reports Server (NTRS)
Gautier, T. N.
1993-01-01
A review of the known properties of the distribution of infrared cirrus is followed by a discussion of the implications of cirrus for observations from space, including the probable limitations that IR cirrus places on space observations.
Intermittent collective dynamics emerge from conflicting imperatives in sheep herds
Ginelli, Francesco; Peruani, Fernando; Pillot, Marie-Helène; Chaté, Hugues; Theraulaz, Guy; Bon, Richard
2015-01-01
Among the many fascinating examples of collective behavior exhibited by animal groups, some species are known to alternate slow group dispersion in space with rapid aggregation phenomena induced by a sudden behavioral shift at the individual level. We study this phenomenon quantitatively in large groups of grazing Merino sheep under controlled experimental conditions. Our analysis reveals strongly intermittent collective dynamics consisting of fast, avalanche-like regrouping events distributed on all experimentally accessible scales. As a proof of principle, we introduce an agent-based model with individual behavioral shifts, which we show to account faithfully for all collective properties observed. This, in turn, offers insight into the individual stimulus/response functions that can generate such intermittent behavior. In particular, the intensity of sheep allelomimetic behavior plays a key role in the group’s ability to increase the per capita grazing surface while minimizing the time needed to regroup into a tightly packed configuration. We conclude that the emergent behavior reported probably arises from the necessity to balance two conflicting imperatives: (i) the exploration of foraging space by individuals and (ii) the protection from predators offered by being part of large, cohesive groups. We discuss our results in the context of the current debate about criticality in biology. PMID:26417082
NASA Astrophysics Data System (ADS)
Giovanis, D. G.; Shields, M. D.
2018-07-01
This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
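The refinement step hinges on distances between projected solutions; a minimal sketch of one common choice, the Grassmann geodesic distance computed from principal angles, is given below (the paper leaves the metric choice open, and the solution bases here are random stand-ins).

    # Sketch: geodesic distance between two points on the Grassmann manifold,
    # computed from principal angles between subspaces spanned by the columns
    # of two (tall) full-field solution snapshot bases.
    import numpy as np
    from scipy.linalg import subspace_angles

    rng = np.random.default_rng(1)
    U1 = np.linalg.qr(rng.standard_normal((1000, 5)))[0]   # hypothetical bases
    U2 = np.linalg.qr(rng.standard_normal((1000, 5)))[0]

    theta = subspace_angles(U1, U2)   # principal angles between the subspaces
    d = np.linalg.norm(theta)         # Grassmann geodesic distance
    print(d)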
Controlling quantum interference in phase space with amplitude.
Xue, Yinghong; Li, Tingyu; Kasai, Katsuyuki; Okada-Shudo, Yoshiko; Watanabe, Masayoshi; Zhang, Yun
2017-05-23
We experimentally show quantum interference in phase space by interrogating the photon number probabilities (n = 2, 3, and 4) of a displaced squeezed state, which is generated by an optical parametric amplifier and whose displacement is controlled by the amplitude of injected coherent light. It is found that the probabilities exhibit interference oscillations depending upon the amplitude of the controlling light field. This phenomenon is attributed to quantum interference in phase space and indicates the capability of controlling quantum interference using amplitude. This contrasts sharply with classical optics, where interference oscillations are usually controlled by relative phase.
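The photon number probabilities of a displaced squeezed state can be reproduced numerically in a truncated Fock basis, as sketched below; the squeezing and displacement values are arbitrary illustrations, not the experimental parameters.

    # Photon-number probabilities of |alpha, r> = D(alpha) S(r) |0>,
    # built from truncated Fock-space operators.
    import numpy as np
    from scipy.linalg import expm

    N = 60                                     # Fock-space truncation
    a = np.diag(np.sqrt(np.arange(1, N)), 1)   # annihilation operator
    adag = a.conj().T

    r, alpha = 0.5, 1.2                        # hypothetical squeezing, displacement
    S = expm(0.5 * r * (a @ a - adag @ adag))  # squeezing operator S(r), real r
    D = expm(alpha * adag - np.conj(alpha) * a)

    vac = np.zeros(N); vac[0] = 1.0
    psi = D @ (S @ vac)

    for n in (2, 3, 4):                        # photon numbers probed in the experiment
        print(n, abs(psi[n])**2)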
Objective Lightning Probability Forecast Tool Phase II
NASA Technical Reports Server (NTRS)
Lambert, Winnie
2007-01-01
This presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.
Structural evolution of calcite at high temperatures: Phase V unveiled
Ishizawa, Nobuo; Setoguchi, Hayato; Yanagisawa, Kazumichi
2013-01-01
The calcite form of calcium carbonate CaCO3 undergoes a reversible phase transition between R-3c and R-3m at ~1240 K under a CO2 atmosphere of ~0.4 MPa. The joint probability density function obtained from the single-crystal X-ray diffraction data revealed that the oxygen triangles of the CO3 group in the high temperature form (Phase V) do not sit still at specified positions in the space group R-3m, but migrate along an undulated circular orbit about carbon. The present study also shows how the room temperature form (Phase I) develops into Phase V through an intermediate form (Phase IV) in the temperature range between ~985 K and ~1240 K. PMID:24084871
Interword Spacing and Landing Position Effects during Chinese Reading in Children and Adults
ERIC Educational Resources Information Center
Zang, Chuanli; Liang, Feifei; Bai, Xuejun; Yan, Guoli; Liversedge, Simon P.
2013-01-01
The present study examined children's and adults' eye movement behavior when reading word-spaced and unspaced Chinese text. The results showed that interword spacing reduced children's and adults' first-pass reading times and refixation probabilities, indicating that spaces between words facilitated word identification. Word spacing effects occurred to a…
Historical Study of Radiation Exposures and the Incidence of Cataracts in Astronauts
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Manuel, F. K.; Iszard, G.; Feiveson, A.; Peterson, L. E.; Hardy, D.; Marak, L.; Tung, W.; Wear, M.; Chylack, L. T., Jr.
2004-01-01
For over 35 years, astronauts in low Earth orbit or on missions to the moon have been exposed to space radiation comprised of high-energy protons, heavy ions, and secondary neutrons. We reviewed the radiation exposures received by astronauts in space and on Earth, and presented results from the first epidemiological study of cataract incidence in the astronauts. Our data suggested an increased risk for cataracts from space radiation exposures. Using parametric survival analysis and the maximum likelihood method, we estimated the dose-response and age distribution for cataract incidence in astronauts exposed to space radiation. Consideration of the high-LET dose contributions on specific space missions, as well as data from animal studies with neutrons and heavy ions, suggested a linear response with no dose threshold for cataracts. However, there are unanswered questions related to the importance and the definition of clinically significant cataracts commonly used in radiation protection, especially in light of epidemiological data suggesting that the probability that sub-clinical cataracts will progress is highly dependent on the age at which cataracts appear. We briefly describe a new study that will address the measurement of cataract progression rates in astronauts and a ground-based comparison group.
Rare Event Simulation in Radiation Transport
NASA Astrophysics Data System (ADS)
Kollman, Craig
This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities are chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous state space is made. This involves partitioning the space into a finite number of cells. There is a tradeoff between additional computation per iteration and variance reduction per iteration that arises in determining the optimal grid size. All versions of this algorithm can be thought of as a compromise between deterministic and Monte Carlo methods, capturing advantages of both techniques.
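The core importance-sampling identity, estimating a small probability by sampling from a tilted distribution and reweighting by the likelihood ratio, can be sketched on a toy Gaussian tail problem (not the neutron-transport model of the dissertation):

    # Importance sampling for a rare event: p = P(X > a), X ~ N(0,1),
    # sampled from the shifted proposal N(a,1) and reweighted.
    import numpy as np
    from math import erf, sqrt

    rng = np.random.default_rng(2)
    a, n = 5.0, 100_000
    y = rng.normal(loc=a, scale=1.0, size=n)    # proposal draws
    w = np.exp(-a * y + 0.5 * a * a)            # N(0,1)/N(a,1) likelihood ratio
    est = np.mean((y > a) * w)                  # unbiased estimate of p

    exact = 0.5 * (1.0 - erf(a / sqrt(2.0)))    # true tail probability
    print(est, exact)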
The Laplace method for probability measures in Banach spaces
NASA Astrophysics Data System (ADS)
Piterbarg, V. I.; Fatalov, V. R.
1995-12-01
Contents
§1. Introduction
Chapter I. Asymptotic analysis of continual integrals in Banach space, depending on a large parameter
§2. The large deviation principle and logarithmic asymptotics of continual integrals
§3. Exact asymptotics of Gaussian integrals in Banach spaces: the Laplace method
  3.1. The Laplace method for Gaussian integrals taken over the whole Hilbert space: isolated minimum points ([167], I)
  3.2. The Laplace method for Gaussian integrals in Hilbert space: the manifold of minimum points ([167], II)
  3.3. The Laplace method for Gaussian integrals in Banach space ([90], [174], [176])
  3.4. Exact asymptotics of large deviations of Gaussian norms
§4. The Laplace method for distributions of sums of independent random elements with values in Banach space
  4.1. The case of a non-degenerate minimum point ([137], I)
  4.2. A degenerate isolated minimum point and the manifold of minimum points ([137], II)
§5. Further examples
  5.1. The Laplace method for the local time functional of a Markov symmetric process ([217])
  5.2. The Laplace method for diffusion processes, a finite number of non-degenerate minimum points ([116])
  5.3. Asymptotics of large deviations for Brownian motion in the Hölder norm
  5.4. Non-asymptotic expansion of a strong stable law in Hilbert space ([41])
Chapter II. The double sum method - a version of the Laplace method in the space of continuous functions
§6. Pickands' method of double sums
  6.1. General situations
  6.2. Asymptotics of the distribution of the maximum of a Gaussian stationary process
  6.3. Asymptotics of the probability of a large excursion of a Gaussian non-stationary process
§7. Probabilities of large deviations of trajectories of Gaussian fields
  7.1. Homogeneous fields and fields with constant dispersion
  7.2. Finitely many maximum points of dispersion
  7.3. Manifold of maximum points of dispersion
  7.4. Asymptotics of distributions of maxima of Wiener fields
§8. Exact asymptotics of large deviations of the norm of Gaussian vectors and processes with values in the spaces L_k^p and l^2. Gaussian fields with the set of parameters in Hilbert space
  8.1. Exact asymptotics of the distribution of the l_k^p-norm of a Gaussian finite-dimensional vector with dependent coordinates, p > 1
  8.2. Exact asymptotics of probabilities of high excursions of trajectories of processes of type χ²
  8.3. Asymptotics of the probabilities of large deviations of Gaussian processes with a set of parameters in Hilbert space [74]
  8.4. Asymptotics of distributions of maxima of the norms of l^2-valued Gaussian processes
  8.5. Exact asymptotics of large deviations for the l^2-valued Ornstein-Uhlenbeck process
Bibliography
The Camassa-Holm equation as an incompressible Euler equation: A geometric point of view
NASA Astrophysics Data System (ADS)
Gallouët, Thomas; Vialard, François-Xavier
2018-04-01
The group of diffeomorphisms of a compact manifold endowed with the L2 metric acting on the space of probability densities gives a unifying framework for the incompressible Euler equation and the theory of optimal mass transport. Recently, several authors have extended optimal transport to the space of positive Radon measures, where the Wasserstein-Fisher-Rao distance is a natural extension of the classical L2-Wasserstein distance. In this paper, we show a similar relation between this unbalanced optimal transport problem and the Hdiv right-invariant metric on the group of diffeomorphisms, which corresponds to the Camassa-Holm (CH) equation in one dimension. Geometrically, we present an isometric embedding of the group of diffeomorphisms endowed with this right-invariant metric in the automorphism group of the fiber bundle of half densities endowed with an L2 type of cone metric. This leads to a new formulation of the (generalized) CH equation as a geodesic equation on an isotropy subgroup of this automorphism group; on S^1, solutions to the standard CH equation thus give radially 1-homogeneous solutions of the incompressible Euler equation on R^2 that preserve a radial density with a singularity at 0. Another application consists in proving that smooth solutions of the Euler-Arnold equation for the Hdiv right-invariant metric are length-minimizing geodesics for sufficiently short times.
Average fidelity between random quantum states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zyczkowski, Karol; Centrum Fizyki Teoretycznej, Polska Akademia Nauk, Aleja Lotnikow 32/44, 02-668 Warsaw; Perimeter Institute, Waterloo, Ontario, N2L 2Y5
2005-03-01
We analyze mean fidelity between random density matrices of size N, generated with respect to various probability measures in the space of mixed quantum states: the Hilbert-Schmidt measure, the Bures (statistical) measure, the measure induced by the partial trace, and the natural measure on the space of pure states. In certain cases explicit probability distributions for the fidelity are derived. The results obtained may be used to gauge the quality of quantum-information-processing schemes.
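A minimal numerical sketch of this computation, assuming the standard Ginibre construction for the Hilbert-Schmidt measure and the Uhlmann fidelity formula F(ρ,σ) = (Tr √(√ρ σ √ρ))²:

    # Mean fidelity between random density matrices drawn from the
    # Hilbert-Schmidt measure: rho = G G^dagger / Tr(G G^dagger), G Ginibre.
    import numpy as np
    from scipy.linalg import sqrtm

    def hs_random_state(N, rng):
        G = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
        rho = G @ G.conj().T
        return rho / np.trace(rho).real

    def fidelity(rho, sigma):
        s = sqrtm(rho)
        return np.real(np.trace(sqrtm(s @ sigma @ s)))**2

    rng = np.random.default_rng(3)
    N, trials = 4, 200
    vals = [fidelity(hs_random_state(N, rng), hs_random_state(N, rng))
            for _ in range(trials)]
    print(np.mean(vals))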
Scale Dependence of Spatiotemporal Intermittence of Rain
NASA Technical Reports Server (NTRS)
Kundu, Prasun K.; Siddani, Ravi K.
2011-01-01
It is a common experience that rainfall is intermittent in space and time. This is reflected by the fact that the statistics of area- and/or time-averaged rain rate is described by a mixed distribution with a nonzero probability of having a sharp value zero. In this paper we have explored the dependence of the probability of zero rain on the averaging space and time scales in large multiyear data sets based on radar and rain gauge observations. A stretched exponential formula fits the observed scale dependence of the zero-rain probability. The proposed formula makes it apparent that the space-time support of the rain field is not quite a set of measure zero as is sometimes supposed. We also give an explanation of the observed behavior in terms of a simple probabilistic model based on the premise that the rainfall process has an intrinsic memory.
NASA Astrophysics Data System (ADS)
Staehle, Robert L.; Burke, James D.; Snyder, Gerald C.; Dowling, Richard; Spudis, Paul D.
1993-12-01
Speculation with regard to a permanent lunar base has been with us since Robert Goddard was working on the first liquid-fueled rockets in the 1920's. With the infusion of data from the Apollo Moon flights, a once speculative area of space exploration has become an exciting possibility. A Moon base is not only a very real possibility, but is probably a critical element in the continuation of our piloted space program. This article, originally drafted by World Space Foundation volunteers in conjunction with various academic and research groups, examines some of the strategies involved in selecting an appropriate site for such a lunar base. Site selection involves a number of complex variables, including raw materials for possible rocket propellant generation, hot and cold cycles, view of the sky (for astronomical considerations, among others), geological makeup of the region, and more. This article summarizes the key base siting considerations and suggests some alternatives. Availability of specific resources, including energy and certain minerals, is critical to success.
Dark energy in systems of galaxies
NASA Astrophysics Data System (ADS)
Chernin, A. D.
2013-11-01
The precise observational data of the Hubble Space Telescope have been used to study nearby galaxy systems. The main result is the detection of dark energy in groups, clusters, and flows of galaxies on a spatial scale of about 1-10 Mpc. The local density of dark energy in these systems, which is determined by various methods, is close to the global value or even coincides with it. A theoretical model of the nearby Universe has been constructed, which describes the Local Group of galaxies with the flow of dwarf galaxies receding from this system. The key physical parameter of the group-flow system is the zero-gravity radius, which is the distance at which the gravity of dark matter is compensated by dark-energy antigravity. The model predicts the existence of local regions of space where Einstein antigravity is stronger than Newton gravity. Six such regions have been revealed in the data of the Hubble Space Telescope. The nearest of these regions is at a distance of 1-3 Mpc from the center of the Milky Way. Antigravity in this region is several times stronger than gravity. Quasiregular flows of receding galaxies, which are accelerated by the dark-energy antigravity, exist in these regions. The model of the nearby Universe at the scale of groups of galaxies (˜1 Mpc) can be extended to the scale of clusters (˜10 Mpc). The systems of galaxies with accelerated receding flows constitute a new and probably widespread class of metagalactic populations. Strong dynamic effects of local dark energy constitute the main characteristic feature of these systems.
Tsirelson's bound and supersymmetric entangled states
Borsten, L.; Brádler, K.; Duff, M. J.
2014-01-01
A superqubit, belonging to a (2|1)-dimensional super-Hilbert space, constitutes the minimal supersymmetric extension of the conventional qubit. In order to see whether superqubits are more non-local than ordinary qubits, we construct a class of two-superqubit entangled states as a non-local resource in the CHSH game. Since super-Hilbert-space amplitudes are Grassmann numbers, the result depends on how we extract real probabilities, and we examine three choices of map: (1) DeWitt, (2) Trigonometric, and (3) Modified Rogers. In cases (1) and (2), the winning probability reaches the Tsirelson bound p_win = cos²(π/8) ≈ 0.8536 of standard quantum mechanics. Case (3) crosses Tsirelson's bound with p_win ≈ 0.9265. Although all states used in the game involve probabilities lying between 0 and 1, case (3) permits other changes of basis inducing negative transition probabilities. PMID:25294964
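As a quick arithmetic check of the quoted values (elementary trigonometry, not specific to the paper): cos²(π/8) = (1 + cos(π/4))/2 = (2 + √2)/4 ≈ 0.8536, so the case (3) value p_win ≈ 0.9265 indeed exceeds the Tsirelson bound.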
Automation Activities that Support C2 Agility to Mitigate Type 7 Risks
2014-06-01
Briefing slides; the recoverable fragments list rare-event examples (e.g., a spacecraft running into space junk while its operator is away on a business trip), ask what the probabilities of such events are over a 45-year career time frame, and reference a state-space diagram representation that an information system understands and the Common Agility Space (CAS), a simple C2 organization representation.
Detection of image structures using the Fisher information and the Rao metric.
Maybank, Stephen J
2004-12-01
In many detection problems, the structures to be detected are parameterized by the points of a parameter space. If the conditional probability density function for the measurements is known, then detection can be achieved by sampling the parameter space at a finite number of points and checking each point to see if the corresponding structure is supported by the data. The number of samples and the distances between neighboring samples are calculated using the Rao metric on the parameter space. The Rao metric is obtained from the Fisher information which is, in turn, obtained from the conditional probability density function. An upper bound is obtained for the probability of a false detection. The calculations are simplified in the low noise case by making an asymptotic approximation to the Fisher information. An application to line detection is described. Expressions are obtained for the asymptotic approximation to the Fisher information, the volume of the parameter space, and the number of samples. The time complexity for line detection is estimated. An experimental comparison is made with a Hough transform-based method for detecting lines.
Xu, Jason; Minin, Vladimir N
2015-07-01
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring integration over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for blood cell formation and evolution of self-replicating transposable elements in bacterial genomes.
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2010-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station.
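The underlying computation, integrating the error-ellipse bivariate Gaussian over a disk centered at an arbitrary point of interest, can be sketched directly with polar-coordinate quadrature; the ellipse parameters and radius below are hypothetical.

    # Probability that the stroke lies within radius R of a point of interest,
    # by integrating the bivariate Gaussian over the disk in polar coordinates.
    import numpy as np
    from scipy.integrate import dblquad
    from scipy.stats import multivariate_normal

    mu = np.array([500.0, -200.0])                 # stroke solution relative to point (m)
    cov = np.array([[400.0**2, 0.3 * 400 * 250],
                    [0.3 * 400 * 250, 250.0**2]])  # error-ellipse covariance (m^2)
    pdf = multivariate_normal(mu, cov).pdf
    R = 1000.0                                     # radius of interest (m)

    prob, _ = dblquad(lambda r, t: pdf([r * np.cos(t), r * np.sin(t)]) * r,
                      0.0, 2.0 * np.pi,            # outer variable: angle t
                      0.0, R)                      # inner variable: radius r
    print(prob)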
Some observations on the use of discriminant analysis in ecology
Williams, B.K.
1983-01-01
The application of discriminant analysis in ecological investigations is discussed. The appropriate statistical assumptions for discriminant analysis are illustrated, and both classification and group separation approaches are outlined. Three assumptions that are crucial in ecological studies are discussed at length, and the consequences of their violation are developed. These assumptions are: equality of dispersions, identifiability of prior probabilities, and precise and accurate estimation of means and dispersions. The use of discriminant functions for purposes of interpreting ecological relationships is also discussed. It is suggested that the common practice of imputing ecological 'meaning' to the signs and magnitudes of coefficients be replaced by an assessment of 'structure coefficients.' Finally, the potential and limitations of representation of data in canonical space are considered, and some cautionary points are made concerning ecological interpretation of patterns in canonical space.
Complexity Induced Anisotropic Bimodal Intermittent Turbulence in Space Plasmas
NASA Technical Reports Server (NTRS)
Chang, Tom; Tam, Sunny W. Y.; Wu, Cheng-Chin
2004-01-01
The "physics of complexity" in space plasmas is the central theme of this exposition. It is demonstrated that the sporadic and localized interactions of magnetic coherent structures arising from the plasma resonances can be the source for the coexistence of nonpropagating spatiotemporal fluctuations and propagating modes. Non-Gaussian probability distribution functions of the intermittent fluctuations from direct numerical simulations are obtained and discussed. Power spectra and local intermittency measures using the wavelet analyses are presented to display the spottiness of the small-scale turbulent fluctuations and the non-uniformity of coarse-grained dissipation that can lead to magnetic topological reconfigurations. The technique of the dynamic renormalization group is applied to the study of the scaling properties of such type of multiscale fluctuations. Charged particle interactions with both the propagating and nonpropagating portions of the intermittent turbulence are also described.
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed, because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented about the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.; Merceret, Francis J. (Technical Monitor)
2002-01-01
This report describes the results of the Applied Meteorology Unit's (AMU) Short-Range Statistical Forecasting task for peak winds. The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A 7-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. In all climatologies, the average and peak wind speeds were highly variable in time. This indicated that the development of a peak wind forecasting tool would be difficult. Probability density functions (PDF) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. The climatologies and PDFs provide tools with which to make peak wind forecasts that are critical to safe operations.
Janssen, Stefan; Schudoma, Christian; Steger, Gerhard; Giegerich, Robert
2011-11-03
Many bioinformatics tools for RNA secondary structure analysis are based on a thermodynamic model of RNA folding. They predict a single, "optimal" structure by free energy minimization, they enumerate near-optimal structures, they compute base pair probabilities and dot plots, representative structures of different abstract shapes, or Boltzmann probabilities of structures and shapes. Although all programs refer to the same physical model, they implement it with considerable variation for different tasks, and little is known about the effects of heuristic assumptions and model simplifications used by the programs on the outcome of the analysis. We extract four different models of the thermodynamic folding space which underlie the programs RNAfold, RNAshapes, and RNAsubopt. Their differences lie within the details of the energy model and the granularity of the folding space. We implement probabilistic shape analysis for all models, and introduce the shape probability shift as a robust measure of model similarity. Using four data sets derived from experimentally solved structures, we provide a quantitative evaluation of the model differences. We find that search space granularity affects the computed shape probabilities less than the over- or underapproximation of free energy by a simplified energy model. Still, the approximations perform similarly enough to implementations of the full model to justify their continued use in settings where computational constraints call for simpler algorithms. On the side, we observe that the rarely used level 2 shapes, which predict the complete arrangement of helices, multiloops, internal loops and bulges, include the "true" shape in a rather small number of predicted high probability shapes. This calls for an investigation of new strategies to extract high probability members from the (very large) level 2 shape space of an RNA sequence. We provide implementations of all four models, written in a declarative style that makes them easy to modify. Based on our study, future work on thermodynamic RNA folding may make a choice of model based on our empirical data. It can take our implementations as a starting point for further program development.
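A toy sketch of the comparison: Boltzmann shape probabilities under two energy models, with an L1-style distance standing in for the shape probability shift (the paper's precise definition may differ, and the shapes and energies below are invented).

    # Boltzmann shape probabilities under two energy models and an L1-style
    # "shape probability shift" between them. Illustrative only.
    import math
    from collections import defaultdict

    RT = 0.616  # kcal/mol near 37 C

    def shape_probs(states):
        """states: iterable of (shape, energy) pairs for individual structures."""
        weights = defaultdict(float)
        for shape, e in states:
            weights[shape] += math.exp(-e / RT)   # Boltzmann weight per shape
        Z = sum(weights.values())
        return {s: w / Z for s, w in weights.items()}

    # Invented folding spaces of one sequence under two energy models
    model_a = [("[]", -8.0), ("[][]", -6.5), ("[[]]", -7.2)]
    model_b = [("[]", -7.6), ("[][]", -6.9), ("[[]]", -7.2)]

    pa, pb = shape_probs(model_a), shape_probs(model_b)
    shift = sum(abs(pa.get(s, 0.0) - pb.get(s, 0.0)) for s in set(pa) | set(pb))
    print(pa, pb, shift)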
Structural, microstructural and vibrational analyses of the monoclinic tungstate BiLuWO6
NASA Astrophysics Data System (ADS)
Ait Ahsaine, H.; Taoufyq, A.; Patout, L.; Ezahri, M.; Benlhachemi, A.; Bakiz, B.; Villain, S.; Guinneton, F.; Gavarri, J.-R.
2014-10-01
The bismuth lutetium tungstate phase BiLuWO6 has been prepared using a solid state route with stoichiometric mixtures of oxide precursors. The obtained polycrystalline phase has been characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), transmission electron microscopy (TEM), and Raman spectroscopy. In a first step, the crystal structure was refined using the Rietveld method: the crystal cell was resolved in the monoclinic system (parameters a, b, c, β) with space group A2/m. SEM images showed the presence of large crystallites with a constant local nominal composition (BiLuW). TEM analyses showed that the actual local structure could be better represented by a superlattice (a, 2b, c, β) associated with space groups P2 or P2/m. Raman spectroscopy showed the presence of vibrational bands similar to those observed in the compounds BiREWO6 with RE = Y, Gd, Nd. However, these vibrational bands were characterized by large full widths at half maximum, probably resulting from long-range Bi/Lu disorder and local WO6 octahedron distortions in the structure.
Mechanical failure probability of glasses in Earth orbit
NASA Technical Reports Server (NTRS)
Kinser, Donald L.; Wiedlocher, David E.
1992-01-01
Results of five years of earth-orbital exposure on mechanical properties of glasses indicate that radiation effects on mechanical properties, for the glasses examined, are less than the probable error of measurement. During the 5-year exposure, seven micrometeorite or space debris impacts occurred on the samples examined. These impacts occurred at locations that were not subjected to effective mechanical testing; hence, only limited information on their influence upon mechanical strength was obtained. Combining these results with the micrometeorite and space debris impact frequencies obtained by other experiments permits estimates of the failure probability of glasses exposed to mechanical loading under earth-orbit conditions. This probabilistic failure prediction is described and illustrated with examples.
Detecting space-time cancer clusters using residential histories
NASA Astrophysics Data System (ADS)
Jacquez, Geoffrey M.; Meliker, Jaymie R.
2007-04-01
Methods for analyzing geographic clusters of disease typically ignore the space-time variability inherent in epidemiologic datasets, do not adequately account for known risk factors (e.g., smoking and education) or covariates (e.g., age, gender, and race), and do not permit investigation of the latency window between exposure and disease. Our research group recently developed Q-statistics for evaluating space-time clustering in cancer case-control studies with residential histories. This technique relies on time-dependent nearest neighbor relationships to examine clustering at any moment in the life-course of the residential histories of cases relative to that of controls. In addition, in place of the widely used null hypothesis of spatial randomness, each individual's probability of being a case is instead based on his/her risk factors and covariates. Case-control clusters will be presented using residential histories of 220 bladder cancer cases and 440 controls in Michigan. In preliminary analyses of this dataset, smoking, age, gender, race and education were sufficient to explain the majority of the clustering of residential histories of the cases. Clusters of unexplained risk, however, were identified surrounding the business address histories of 10 industries that emit known or suspected bladder cancer carcinogens. The clustering of 5 of these industries began in the 1970's and persisted through the 1990's. This systematic approach for evaluating space-time clustering has the potential to generate novel hypotheses about environmental risk factors. These methods may be extended to detect differences in space-time patterns of any two groups of people, making them valuable for security intelligence and surveillance operations.
NASA Astrophysics Data System (ADS)
Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi
2015-11-01
We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).
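For small N the maximum entropy distribution at the center of this construction can be computed directly: the distribution matching M expectation values is an exponential family, and its dual parameters can be fit by gradient ascent, as in the sketch below (random ±1 observables on an exactly enumerated state space; illustrative only, not the replica computation).

    # Maximum entropy distribution over 2^N states matching M expectation
    # values, fit by gradient ascent on the dual (Lagrange) parameters.
    import numpy as np

    rng = np.random.default_rng(4)
    N, M = 8, 10
    S = 2**N
    O = rng.choice([-1.0, 1.0], size=(M, S))   # random +/-1 observables on states

    p_target = rng.dirichlet(np.ones(S))       # hidden target distribution
    mu = O @ p_target                          # expectation values to match

    lam = np.zeros(M)                          # dual parameters
    for _ in range(5000):
        logits = lam @ O
        logits -= logits.max()                 # numerical stabilization
        p = np.exp(logits); p /= p.sum()       # p proportional to exp(lam . O)
        lam += 0.1 * (mu - O @ p)              # ascent on the dual objective

    print(np.max(np.abs(O @ p - mu)))          # residual of moment matching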
A validation study of a stochastic model of human interaction
NASA Astrophysics Data System (ADS)
Burchfield, Mitchel Talmadge
The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, N ∫_{-∞}^{∞} φ(χ, τ) Ψ(τ) dτ, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as fermions behave in momentum space.
ERIC Educational Resources Information Center
Kennon, Tillman; Roberts, Ed; Fuller, Teresa
2008-01-01
Space travel, even low Earth orbit, is probably several years away for most of us; however, students and teachers can research the edge of space by participating in the BalloonSat program. BalloonSat is an offshoot of the Space Grant Consortium's very successful RocketSat program. The Arkansas BalloonSat program consists of teacher-initiated…
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
Quantum work in the Bohmian framework
NASA Astrophysics Data System (ADS)
Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.
2018-01-01
At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.
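The classical limit of this construction is easy to sketch: sample initial conditions from a thermal ensemble, integrate the injected power F(t)·v along each phase-space trajectory, and histogram the resulting work; the drive and parameters below are illustrative, not the paper's quantum treatment.

    # Classical-analogue sketch of a fluctuating-work distribution for a
    # driven harmonic oscillator, with work = integral of F(t) * v dt.
    import numpy as np

    rng = np.random.default_rng(8)
    m = w = kB_T = 1.0
    n, dt, steps = 20_000, 0.001, 2000

    x = rng.normal(0, np.sqrt(kB_T / (m * w**2)), n)   # thermal initial positions
    p = rng.normal(0, np.sqrt(m * kB_T), n)            # thermal initial momenta
    F = lambda t: 0.5 * np.sin(2.0 * t)                # hypothetical external drive

    work = np.zeros(n)
    for k in range(steps):                             # symplectic Euler integration
        t = k * dt
        p += (-m * w**2 * x + F(t)) * dt
        x += (p / m) * dt
        work += F(t) * (p / m) * dt                    # accumulate power F * v

    print(work.mean(), work.std())                     # moments of the work distribution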
Probability of detection evaluation results for railroad tank cars : final report.
DOT National Transportation Integrated Search
2016-08-01
The Transportation Technology Center, Inc. (TTCI) used the approach developed for the National Aeronautics and Space Administration to determine the probability of detection (POD) for various nondestructive test (NDT) methods used during inspection : ...
Characterising RNA secondary structure space using information entropy
2013-01-01
Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
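The quantity itself is the standard Shannon entropy of the structure distribution, H = -Σ p_s log₂ p_s; the sketch below simply evaluates the definition on explicit toy distributions (PPfold computes it efficiently without enumerating structures, which is the point of the paper).

    # Shannon entropy of a probability distribution over secondary structures,
    # as a scalar summary of how concentrated the folding space is.
    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0.0)

    print(entropy([0.97, 0.02, 0.01]))        # sharply peaked: low entropy
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # flat: high entropy, low reliability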
Mauro, John C; Loucks, Roger J; Balakrishnan, Jitendra; Raghavan, Srikanth
2007-05-21
The thermodynamics and kinetics of a many-body system can be described in terms of a potential energy landscape in multidimensional configuration space. The partition function of such a landscape can be written in terms of a density of states, which can be computed using a variety of Monte Carlo techniques. In this paper, a new self-consistent Monte Carlo method for computing density of states is described that uses importance sampling and a multiplicative update factor to achieve rapid convergence. The technique is then applied to compute the equilibrium quench probability of the various inherent structures (minima) in the landscape. The quench probability depends on both the potential energy of the inherent structure and the volume of its corresponding basin in configuration space. Finally, the methodology is extended to the isothermal-isobaric ensemble in order to compute inherent structure quench probabilities in an enthalpy landscape.
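The combination of importance sampling with a multiplicative update factor is reminiscent of Wang-Landau sampling; the sketch below shows that mechanism on a small 1D Ising ring and conveys only the flavor of the paper's self-consistent scheme, not its details.

    # Wang-Landau-flavored estimate of a (log) density of states with a
    # multiplicative update factor, on a 1D Ising ring. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(5)
    L = 12
    spins = rng.choice([-1, 1], size=L)

    def energy(s):
        return -np.sum(s * np.roll(s, 1))        # nearest-neighbour ring

    e_index = lambda e: int((e + L) // 2)        # ring energies are even, in [-L, L]
    logg = np.zeros(L + 1)                       # log density of states
    f = 1.0                                      # log of the multiplicative factor

    E = energy(spins)
    for stage in range(12):                      # tighten f each stage
        for _ in range(20_000):
            i = rng.integers(L)
            dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % L])
            delta = logg[e_index(E)] - logg[e_index(E + dE)]
            if delta >= 0 or rng.random() < np.exp(delta):   # flat-histogram rule
                spins[i] *= -1
                E += dE
            logg[e_index(E)] += f                # multiplicative g-update (in log)
        f /= 2.0

    print(logg - logg.max())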
NASA Technical Reports Server (NTRS)
Susko, M.
1984-01-01
A review of meteoroid flux measurements and models for low orbital altitudes of the Space Station has been made in order to provide information that may be useful in design studies and laboratory hypervelocity impact tests which simulate micrometeoroids in space for design of the main wall of the Space Station. This report deals with the meteoroid flux mass model, the defocusing and shielding factors that affect the model, and the probability of meteoroid penetration of the main wall of a Space Station. Whipple (1947) suggested a meteoroid bumper, a thin shield around the spacecraft at some distance from the wall, as an effective device for reducing penetration, which is discussed in this report. The equations for the probability of meteoroid penetration, the average annual cumulative total flux, and the thickness of the main wall and the bumper are presented in this report.
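The standard design relationship behind such penetration estimates treats penetrating impacts as a Poisson process, so the no-penetration probability over an exposed area A and duration T at cumulative flux F is P₀ = exp(-F·A·T); the numbers below are placeholders, not the report's values.

    # Poisson no-penetration probability for a wall of area A over time T,
    # given a cumulative flux F of particles above the ballistic limit.
    import math

    F = 1.0e-6      # hypothetical flux of penetrating particles, m^-2 yr^-1
    A = 400.0       # exposed wall area, m^2
    T = 10.0        # mission duration, years

    P0 = math.exp(-F * A * T)
    print(P0, 1.0 - P0)   # probability of zero penetrations / at least one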
Transition probabilities for non self-adjoint Hamiltonians in infinite dimensional Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bagarello, F., E-mail: fabio.bagarello@unipa.it
In a recent paper we have introduced several possible inequivalent descriptions of the dynamics and of the transition probabilities of a quantum system when its Hamiltonian is not self-adjoint. Our analysis was carried out in finite-dimensional Hilbert spaces. This is useful, but quite restrictive, since many physically relevant quantum systems live in infinite-dimensional Hilbert spaces. In this paper we consider this situation, and we discuss some applications to well-known models introduced in the literature in recent years: the extended harmonic oscillator, the Swanson model, and a generalized version of the Landau levels Hamiltonian. Not surprisingly, we find new interesting features not present in finite-dimensional Hilbert spaces, useful for a deeper comprehension of this kind of physical system.
Space shuttle solid rocket booster recovery system definition, volume 1
NASA Technical Reports Server (NTRS)
1973-01-01
The performance requirements, preliminary designs, and development program plans for an airborne recovery system for the space shuttle solid rocket booster are discussed. The analyses performed during the study phase of the program are presented. The basic considerations which established the system configuration are defined. A Monte Carlo statistical technique using random sampling of the probability distribution for the critical water impact parameters was used to determine the failure probability of each solid rocket booster component as functions of impact velocity and component strength capability.
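A hedged sketch of the kind of Monte Carlo technique described above: sample the critical water-impact velocity and the component's strength capability from assumed probability distributions, and count the fraction of trials in which the impact exceeds the capability. All numbers are invented for illustration, not taken from the study.

```python
import random

def failure_probability(n_trials=100_000, seed=1):
    """Monte Carlo failure estimate for a single booster component.

    Illustrative distributions only: water-impact velocity ~ Normal,
    component strength (expressed as the impact velocity it can survive)
    ~ Normal. A trial fails when impact velocity exceeds capability.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        v_impact = rng.gauss(23.0, 3.0)      # m/s, assumed distribution
        v_capability = rng.gauss(30.0, 2.5)  # m/s, assumed strength
        if v_impact > v_capability:
            failures += 1
    return failures / n_trials

print(f"P(failure) ~ {failure_probability():.4f}")
```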
Kennedy Space Center Coronary Heart Disease Risk Screening Program
NASA Technical Reports Server (NTRS)
Tipton, David A.; Scarpa, Philip J.
1999-01-01
Coronary heart disease (CHD) is the number one cause of death in the U.S. It is a likely cause of death and disability in the lives of employees at Kennedy Space Center (KSC) as well. The KSC Biomedical Office used a multifactorial formula developed by the Framingham Heart Study to calculate CHD risk probabilities for individuals in a segment of the KSC population who require medical evaluation for job certification. Those individuals assessed to have a high risk probability will be targeted for intervention.
Technology Development Risk Assessment for Space Transportation Systems
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Godsell, Aga M.; Go, Susie
2006-01-01
A new approach for assessing development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach: a single rocket engine development and a three-technology space transportation portfolio.
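A minimal sketch of the approach, with invented stage names and transition-time statistics: each Monte Carlo trial samples a duration for every discrete development stage, and development risk is read off as the probability that the total time exceeds a program deadline.

```python
import random

STAGES = [  # (stage, mean_years, sd_years) per discrete stage -- assumed
    ("proof of concept", 1.5, 0.6),
    ("component demo",   2.0, 0.8),
    ("system demo",      2.5, 1.0),
]

def development_risk(n=50_000, deadline=7.0, seed=7):
    """Monte Carlo over stage transitions; returns P(total time > deadline)."""
    rng = random.Random(seed)
    late = 0
    for _ in range(n):
        total = sum(max(0.0, rng.gauss(mu, sd)) for _, mu, sd in STAGES)
        if total > deadline:
            late += 1
    return late / n

print(f"development risk = {development_risk():.3f}")
```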
Piecewise Geometric Estimation of a Survival Function.
1985-04-01
Langberg (1982). One of the by-products of the estimation process is an estimate of the failure rate function; here, another issue is raised. It is evident... envisaged as the infinite product probability space that may be constructed in the usual way from the sequence of probability spaces corresponding to the... received 6-MP (a mercaptopurine used in the treatment of leukemia). The ordered remission times in weeks are: 6, 6, 6, 6+, 7, 9+, 10, 10+, 11+, 13, 16
Rank-k Maximal Statistics for Divergence and Probability of Misclassification
NASA Technical Reports Server (NTRS)
Decell, H. P., Jr.
1972-01-01
A technique is developed for selecting, from n-channel multispectral data, k combinations of the n channels upon which to base a given classification technique, so that a measure of the loss in ability to distinguish between classes when using the compressed k-dimensional data is minimized. Information loss in compressing the n-channel data to k channels is taken to be the difference between the average interclass divergences (or probabilities of misclassification) in n-space and in k-space.
Threshold-selecting strategy for best possible ground state detection with genetic algorithms
NASA Astrophysics Data System (ADS)
Lässig, Jörg; Hoffmann, Karl Heinz
2009-04-01
Genetic algorithms are a standard heuristic for finding states of low energy in complex state spaces, as given by physical systems such as spin glasses, but also in combinatorial optimization. The paper considers the problem of selecting individuals in the current population of a genetic algorithm for crossover. Many schemes have been considered in the literature as possible crossover selection strategies. We show, for a large class of quality measures, that the best possible probability distribution for selecting individuals in each generation of the algorithm is a rectangular distribution over the individuals sorted by their energy values. This means that uniform probabilities are assigned to a group of the lowest-energy individuals in the population, while individuals whose energy values lie above a fixed cutoff, corresponding to a certain rank in the energy-sorted population, receive probability zero. The considered strategy is dubbed threshold selecting. The proof applies basic arguments of Markov chains and linear optimization and makes only a few assumptions on the underlying principles, and hence applies to a large class of algorithms.
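A minimal sketch of threshold selecting as described above, assuming a toy population of bit-string individuals whose energy is simply the number of 1 bits: parents are drawn uniformly from the `cutoff_rank` lowest-energy individuals, and all others receive probability zero.

```python
import random

def threshold_select(population, energies, cutoff_rank):
    """Threshold selecting: pick a crossover parent uniformly from the
    `cutoff_rank` lowest-energy individuals; all others get probability 0."""
    ranked = sorted(range(len(population)), key=lambda i: energies[i])
    return population[random.choice(ranked[:cutoff_rank])]

# Toy usage: 8 bit-string individuals, energy = number of 1 bits.
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(8)]
E = [sum(ind) for ind in pop]
parent_a = threshold_select(pop, E, cutoff_rank=3)  # ready for crossover
parent_b = threshold_select(pop, E, cutoff_rank=3)
```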
Maximizing Information Diffusion in the Cyber-physical Integrated Network
Lu, Hongliang; Lv, Shaohe; Jiao, Xianlong; Wang, Xiaodong; Liu, Juan
2015-01-01
Nowadays, our living environment is embedded with smart objects, such as smart sensors, smart watches, and smart phones. Their abundant sensing, communication, and computation abilities integrate cyberspace and physical space, forming a cyber-physical integrated network. In order to maximize information diffusion in such a network, a group of objects is selected as forwarding points. To optimize the selection, a minimum connected dominating set (CDS) strategy is adopted. However, existing approaches focus on minimizing the size of the CDS, neglecting an important factor: the weight of links. In this paper, we propose a distributed maximizing the probability of information diffusion (DMPID) algorithm for the cyber-physical integrated network. Unlike previous approaches that only consider the size of the CDS, DMPID also considers the information spread probability, which depends on the weight of links. To weaken the effects of excessively weighted links, we also present an optimization strategy that properly balances the two factors. Extensive simulation results show that DMPID can nearly double the information diffusion probability while keeping a reasonable selection size with low overhead in different distributed networks. PMID:26569254
Global molecular identification from graphs. Neutral and ionized main-group diatomic molecules.
James, Bryan; Caviness, Ken; Geach, Jonathan; Walters, Chris; Hefferlin, Ray
2002-01-01
Diophantine equations and inequalities are presented for main-group closed-shell diatomic molecules. Specifying various bond types (covalent, dative, ionic, van der Waals) and multiplicities, it becomes possible to identify all possible molecules. While many of the identified species are probably unstable under normal conditions, they are interesting and present a challenge for computational or experimental analysis. Ionized molecules with net charges of -1, 1, and 2 are also identified. The analysis applies to molecules with atoms from periods 2 and 3 but can be generalized by substituting isovalent atoms. When closed-shell neutral diatomics are positioned in the chemical space (with axes enumerating the numbers of valence electrons of the free atoms), it is seen that they lie on a few parallel isoelectronic series.
Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z
2015-08-01
Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1995-01-01
A unique collection of computer codes, Space Debris Surfaces (SD_SURF), has been developed to assist in the design and analysis of space debris protection systems. SD_SURF calculates and summarizes a vehicle's vulnerability to space debris as a function of impact velocity and obliquity. An SD_SURF analysis will show which velocities and obliquities are the most probable to cause a penetration. This determination can help the analyst select a shield design that is best suited to the predominant penetration mechanism. The analysis also suggests the most suitable parameters for development or verification testing. The SD_SURF programs offer the option of either FORTRAN programs or Microsoft Excel spreadsheets and macros. The FORTRAN programs work with BUMPERII. The Excel spreadsheets and macros can be used independently or with selected output from the SD_SURF FORTRAN programs. Examples are presented of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration.
MRI Brain Tumor Segmentation and Necrosis Detection Using Adaptive Sobolev Snakes.
Nakhmani, Arie; Kikinis, Ron; Tannenbaum, Allen
2014-03-21
Brain tumor segmentation in brain MRI volumes is used in neurosurgical planning and illness staging. It is important to explore the tumor shape and necrosis regions at different points of time to evaluate the disease progression. We propose an algorithm for semi-automatic tumor segmentation and necrosis detection. Our algorithm consists of three parts: conversion of MRI volume to a probability space based on the on-line learned model, tumor probability density estimation, and adaptive segmentation in the probability space. We use manually selected acceptance and rejection classes on a single MRI slice to learn the background and foreground statistical models. Then, we propagate this model to all MRI slices to compute the most probable regions of the tumor. Anisotropic 3D diffusion is used to estimate the probability density. Finally, the estimated density is segmented by the Sobolev active contour (snake) algorithm to select smoothed regions of the maximum tumor probability. The segmentation approach is robust to noise and not very sensitive to the manual initialization in the volumes tested. Also, it is appropriate for low contrast imagery. The irregular necrosis regions are detected by using the outliers of the probability distribution inside the segmented region. The necrosis regions of small width are removed due to a high probability of noisy measurements. The MRI volume segmentation results obtained by our algorithm are very similar to expert manual segmentation.
Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Donovan, J.; Jordan, T. H.
2011-12-01
Though earthquake forecasting models have often represented seismic sources as space-time points (usually hypocenters), a more complete hazard analysis requires the consideration of finite-source effects, such as rupture extent, orientation, directivity, and stress drop. The most compact source representation that includes these effects is the finite moment tensor (FMT), which approximates the degree-two polynomial moments of the stress glut by its projection onto the seismic (degree-zero) moment tensor. This projection yields a scalar space-time source function whose degree-one moments define the centroid moment tensor (CMT) and whose degree-two moments define the FMT. We apply this finite-source parameterization to three forecasting problems. The first is the question of hypocenter bias: can we reject the null hypothesis that the conditional probability of hypocenter location is uniformly distributed over the rupture area? This hypothesis is currently used to specify rupture sets in the "extended" earthquake forecasts that drive simulation-based hazard models, such as CyberShake. Following McGuire et al. (2002), we test the hypothesis using the distribution of FMT directivity ratios calculated from a global data set of source slip inversions. The second is the question of source identification: given an observed FMT (and its errors), can we identify it with an FMT in the complete rupture set that represents an extended fault-based rupture forecast? Solving this problem will facilitate operational earthquake forecasting, which requires the rapid updating of earthquake triggering and clustering models. Our proposed method uses the second-order uncertainties as a norm on the FMT parameter space to identify the closest member of the hypothetical rupture set and to test whether this closest member is an adequate representation of the observed event. Finally, we address the aftershock excitation problem: given a mainshock, what is the spatial distribution of aftershock probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.
NASA Astrophysics Data System (ADS)
Massiot, Cécile; Townend, John; Nicol, Andrew; McNamara, David D.
2017-08-01
Acoustic borehole televiewer (BHTV) logs provide measurements of fracture attributes (orientations, thickness, and spacing) at depth. Orientation, censoring, and truncation sampling biases similar to those described for one-dimensional outcrop scanlines, and other logging or drilling artifacts specific to BHTV logs, can affect the interpretation of fracture attributes from BHTV logs. K-means, fuzzy K-means, and agglomerative clustering methods provide transparent means of separating fracture groups on the basis of their orientation. Fracture spacing is calculated for each of these fracture sets. Maximum likelihood estimation using truncated distributions permits the fitting of several probability distributions to the fracture attribute data sets within truncation limits, which can then be extrapolated over the entire range where they naturally occur. Akaike Information Criterion (AIC) and Schwartz Bayesian Criterion (SBC) statistical information criteria rank the distributions by how well they fit the data. We demonstrate these attribute analysis methods with a data set derived from three BHTV logs acquired from the high-temperature Rotokawa geothermal field, New Zealand. Varying BHTV log quality reduces the number of input data points, but careful selection of the quality levels where fractures are deemed fully sampled increases the reliability of the analysis. Spacing data analysis comprising up to 300 data points and spanning three orders of magnitude can be approximated similarly well (similar AIC rankings) with several distributions. Several clustering configurations and probability distributions can often characterize the data at similar levels of statistical criteria. Thus, several scenarios should be considered when using BHTV log data to constrain numerical fracture models.
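A sketch of the truncated maximum likelihood fitting and AIC ranking described above, under the usual correction: the log-density of each observation is normalized by the probability mass the candidate distribution places inside the truncation limits. The data here are synthetic spacings, not the Rotokawa measurements.

```python
import numpy as np
from scipy import stats, optimize

def truncated_aic(data, dist, theta0, a, b):
    """AIC of `dist` fitted by maximum likelihood to data truncated to [a, b]."""
    data = np.asarray(data)

    def nll(theta):
        mass = dist.cdf(b, *theta) - dist.cdf(a, *theta)
        if not np.isfinite(mass) or mass <= 0:
            return np.inf
        ll = np.sum(dist.logpdf(data, *theta)) - len(data) * np.log(mass)
        return -ll if np.isfinite(ll) else np.inf

    res = optimize.minimize(nll, theta0, method="Nelder-Mead")
    return 2 * len(theta0) + 2 * res.fun

# Toy fracture spacings (metres), observable only between 0.05 m and 5 m.
rng = np.random.default_rng(0)
spacing = rng.lognormal(mean=-1.0, sigma=1.0, size=300)
spacing = spacing[(spacing > 0.05) & (spacing < 5.0)]

# Rank candidate distributions by AIC (lower is better).
print("lognormal:", truncated_aic(spacing, stats.lognorm, [1.0, 0.0, 0.4], 0.05, 5.0))
print("expon:    ", truncated_aic(spacing, stats.expon,   [0.0, 0.5],      0.05, 5.0))
```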
Neutral aggregation in finite-length genotype space
NASA Astrophysics Data System (ADS)
Houchmandzadeh, Bahram
2017-01-01
The advent of modern genome sequencing techniques allows for a more stringent test of the neutrality hypothesis of Darwinian evolution, where all individuals have the same fitness. Using the individual-based model of Wright and Fisher, we compute the amplitude of neutral aggregation in the genome space, i.e., the probability of finding two individuals at genetic (Hamming) distance k as a function of the genome size L, population size N, and mutation probability per base ν. In well-mixed populations, we show that for Nν < 1/L, neutral aggregation is the dominant force and most individuals are found at short genetic distances from each other. For Nν > 1, on the contrary, individuals are randomly dispersed in genome space. The results are extended to a geographically dispersed population, where the controlling parameter is shown to be a combination of mutation and migration probability. The theory we develop can be used to test the neutrality hypothesis in various ecological and evolutionary systems.
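A minimal simulation sketch of the Wright-Fisher setting described above, using binary genomes for simplicity: neutral resampling of parents, per-base mutation with probability ν, and the empirical distribution of pairwise Hamming distances as the measure of aggregation. With Nν < 1/L the population should cluster at short distances.

```python
import numpy as np

def wright_fisher_hamming(N=100, L=50, nu=1e-4, generations=2000, seed=0):
    """Neutral Wright-Fisher model; returns all pairwise Hamming distances.

    Binary genomes for simplicity (the model in the paper is more general).
    Here N*nu = 0.01 < 1/L = 0.02, the neutral-aggregation regime.
    """
    rng = np.random.default_rng(seed)
    pop = np.zeros((N, L), dtype=np.uint8)
    for _ in range(generations):
        parents = rng.integers(0, N, size=N)     # neutral resampling
        pop = pop[parents]
        flips = rng.random((N, L)) < nu          # per-base mutation
        pop ^= flips.astype(np.uint8)
    # Hamming distance between all pairs of individuals
    d = (pop[:, None, :] != pop[None, :, :]).sum(axis=2)
    return d[np.triu_indices(N, k=1)]

dist = wright_fisher_hamming()
print("mean pairwise Hamming distance:", dist.mean())
```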
ERIC Educational Resources Information Center
Kostal, Heather
2011-01-01
Five- and six-year-olds know a lot about their own homes. Besides school, home is probably where they spend most of their time. But have they ever really thought about their space? Using students' knowledge of their current space will help them design new spaces and think about all the areas that surround them. In this project, students design…
Cohen, Helen S
2003-01-01
This paper is an overview of current research on development of rehabilitative countermeasures to ameliorate the effects of long-term exposure to microgravity on sensorimotor function during space flight. After many years of work we do not yet have operational countermeasures, probably for several reasons: 1) changes in the use of vestibular input are manifested in many ways, 2) due to multiple mechanisms for funding research, investigators doing related research may not coordinate their work, and 3) relatively few scientists work on this problem. The number of investigators and physicians who routinely deal with the functional problems of astronauts and the limitations of working in the space environment is tiny; the number of investigators who are therapists, and who therefore have experience and expertise in developing rehabilitation programs, is minuscule. That's the bad news. The good news is that as a group, we are little but mighty. Therefore, the entire group of investigators can plan to take a more coordinated, collaborative approach than investigators in larger fields. Also, serendipitously, individual research groups have begun approaching different rehabilitative aspects of this problem. If we make a greater effort toward a coordinated, multidimensional approach, guided by rehabilitation concepts, we will be able to provide operational sensorimotor countermeasures when they are needed.
Regulation of autonomic nervous system in space and magnetic storms.
Baevsky, R M; Petrov, V M; Chernikova, A G
1998-01-01
Variations in the earth's magnetic field and magnetic storms are known to be a risk factor for the development of cardiovascular disorders. The main "targets" of geomagnetic perturbations are the central nervous system and the neural regulation of vascular tone and heart rate variability. This paper presents data on the effect of geomagnetic fluctuations on the human body in space. Heart rate variability analysis was used as the research method; it allows evaluation of the state of the sympathetic and parasympathetic parts of the autonomic nervous system, the vasomotor center, and the activity of subcortical neural centers. Heart rate variability data were analyzed for 30 cosmonauts on the 2nd day of space flight on the Soyuz transport spacecraft (32nd orbit). Three groups of cosmonauts were formed: without a magnetic storm (n=9), on a day with a magnetic storm (n=12), and 1-2 days after a magnetic storm (n=9). The present study was the first to demonstrate a specific impact of geomagnetic perturbations on the system of autonomic circulatory control in cosmonauts during space flight. Increased activity of the highest nervous centers was shown for the groups with magnetic storms, and was more pronounced 1-2 days after the storm. Discriminant analysis classified the three groups with 88% precision. Canonical variables are suggested as criteria for evaluating the specific and nonspecific components of cardiovascular reactions to geomagnetic perturbations. The applied aspect of these findings should be emphasized: they show, in particular, the need to supplement the medical monitoring of cosmonauts with predictions of probable geomagnetic perturbations, in order to prevent unfavorable states that may arise if adverse reactions to geomagnetic perturbations add to the strain experienced by regulatory systems during various stress situations (such as work in open space).
Probability and Quantum Paradigms: the Interplay
NASA Astrophysics Data System (ADS)
Kracklauer, A. F.
2007-12-01
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.
A simulation model for probabilistic analysis of Space Shuttle abort modes
NASA Technical Reports Server (NTRS)
Hage, R. T.
1993-01-01
A simulation model which was developed to provide a probabilistic analysis tool for studying the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
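A hedged sketch of such an event-tree Monte Carlo, with invented failure rates, abort windows, and success probabilities (not official NASA estimates, echoing the abstract's own caveat): each trial samples whether and when a propulsion failure occurs during ascent and walks the event tree to an outcome.

```python
import random

# Illustrative numbers only.
P_ENGINE_FAIL = 0.02          # probability a main engine fails during ascent
ABORT_WINDOWS = [             # (abort mode, usable from t0 to t1, seconds)
    ("RTLS",   0, 240),
    ("TAL",  240, 400),
    ("ATO",  400, 520),
]

def ascent_trial(rng):
    """One pass through a simplified ascent event tree."""
    if rng.random() > P_ENGINE_FAIL:
        return "nominal"
    t_fail = rng.uniform(0, 520)             # failure time during ascent
    for mode, t0, t1 in ABORT_WINDOWS:
        if t0 <= t_fail < t1:
            # assumed 95% chance the abort mode completes successfully
            return mode if rng.random() < 0.95 else "loss of vehicle"
    return "loss of vehicle"

rng = random.Random(42)
outcomes = {}
for _ in range(200_000):
    o = ascent_trial(rng)
    outcomes[o] = outcomes.get(o, 0) + 1
print({k: v / 200_000 for k, v in outcomes.items()})
```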
Probabilistic structural analysis methods for space transportation propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Moore, N.; Anis, C.; Newell, J.; Nagpal, V.; Singhal, S.
1991-01-01
Information on probabilistic structural analysis methods for space propulsion systems is given in viewgraph form. Information is given on deterministic certification methods, probability of failure, component response analysis, stress responses for 2nd stage turbine blades, Space Shuttle Main Engine (SSME) structural durability, and program plans.
Gandomkar, Ziba; Brennan, Patrick C.; Mello-Thoms, Claudia
2017-01-01
Context: Previous studies showed that the agreement among pathologists in recognition of mitoses in breast slides is fairly modest. Aims: Determining the significantly different quantitative features among easily identifiable mitoses, challenging mitoses, and miscounted nonmitoses within breast slides and identifying which color spaces capture the difference among groups better than others. Materials and Methods: The dataset contained 453 mitoses and 265 miscounted objects in breast slides. The mitoses were grouped into three categories based on the confidence degree of three pathologists who annotated them. The mitoses annotated as “probably a mitosis” by the majority of pathologists were considered as the challenging category. The miscounted objects were recognized as a mitosis or probably a mitosis by only one of the pathologists. The mitoses were segmented using k-means clustering, followed by morphological operations. Morphological, intensity-based, and textural features were extracted from the segmented area and also the image patch of 63 × 63 pixels in different channels of eight color spaces. Holistic features describing the mitoses' surrounding cells of each image were also extracted. Statistical Analysis Used: The Kruskal–Wallis H-test followed by the Tukey-Kramer test was used to identify significantly different features. Results: The results indicated that challenging mitoses were smaller and rounder compared to other mitoses. Among different features, the Gabor textural features differed more than others between challenging mitoses and the easily identifiable ones. Sizes of the non-mitoses were similar to easily identifiable mitoses, but nonmitoses were rounder. The intensity-based features from chromatin channels were the most discriminative features between the easily identifiable mitoses and the miscounted objects. Conclusions: Quantitative features can be used to describe the characteristics of challenging mitoses and miscounted nonmitotic objects. PMID:28966834
Quantum centipedes with strong global constraint
NASA Astrophysics Data System (ADS)
Grange, Pascal
2017-06-01
A centipede made of N quantum walkers on a one-dimensional lattice is considered. The distance between two consecutive legs is either one or two lattice spacings, and a global constraint is imposed: the maximal distance between the first and last leg is N + 1. This is the strongest global constraint compatible with walking. For an initial value of the wave function corresponding to a localized configuration at the origin, the probability law of the first leg of the centipede can be expressed in closed form in terms of Bessel functions. The dispersion relation and the group velocities are worked out exactly. Their maximal group velocity goes to zero when N goes to infinity, which is in contrast with the behaviour of group velocities of quantum centipedes without global constraint, which were recently shown by Krapivsky, Luck and Mallick to give rise to ballistic spreading of extremal wave-front at non-zero velocity in the large-N limit. The corresponding Hamiltonians are implemented numerically, based on a block structure of the space of configurations corresponding to compositions of the integer N. The growth of the maximal group velocity when the strong constraint is gradually relaxed is explored, and observed to be linear in the density of gaps allowed in the configurations. Heuristic arguments are presented to infer that the large-N limit of the globally constrained model can yield finite group velocities provided the allowed number of gaps is a finite fraction of N.
NASA Technical Reports Server (NTRS)
Chase, Thomas D.; Splawn, Keith; Christiansen, Eric L.
2007-01-01
The NASA Extravehicular Mobility Unit (EMU) micrometeoroid and orbital debris protection ability has recently been assessed against an updated, higher-threat space environment model. The new environment was analyzed in conjunction with a revised EMU solid model using a NASA computer code. Results showed that the EMU exceeds the required mathematical Probability of having No Penetrations (PNP) of any suit pressure bladder over the remaining life of the program (2,700 projected hours of 2-person spacewalks). The success probability was calculated to be 0.94, versus a requirement of >0.91, for the current spacesuit's outer protective garment. In parallel to the probability assessment, potential improvements to the current spacesuit's outer protective garment were built and impact tested. A NASA light gas gun was used to launch projectiles at test items at speeds of approximately 7 km per second. Test results showed that substantial garment improvements could be made with mild material enhancements and moderate assembly development. The spacesuit's PNP would improve marginally with the tested enhancements, if they were available for immediate incorporation. This paper discusses the results of the model assessment process and test program. These findings add confidence to the continued use of the existing NASA EMU during International Space Station (ISS) assembly and Shuttle operations. They provide a viable avenue for improved hypervelocity impact protection for the EMU, or for future space suits.
Chin, Wei-Chien-Benny; Wen, Tzai-Hung; Sabel, Clive E; Wang, I-Hsiang
2017-10-03
A diffusion process can be considered as the movement of linked events through space and time. Therefore, the space-time locations of events are key to identifying any diffusion process. However, previous clustering analysis methods have focused only on space-time proximity characteristics, neglecting the temporal lag of the movement of events. We argue that the temporal lag between events is key to understanding the process of diffusion movement; using the temporal lag could help to clarify the types of close relationships. This study aims to develop a data exploration algorithm, namely the TrAcking Progression In Time And Space (TaPiTaS) algorithm, for understanding diffusion processes. Based on the spatial distance and temporal interval between cases, TaPiTaS detects sub-clusters (groups of events that have a high probability of sharing common sources), identifies progression links (the relationships between sub-clusters), and tracks progression chains (the connected components of sub-clusters). Dengue fever case data were used as an illustrative case study. The location and temporal range of sub-clusters are presented, along with the progression links. The TaPiTaS algorithm contributes a more detailed and in-depth understanding of the development of progression chains, namely the geographic diffusion process.
Physical processes of shallow mafic dike emplacement near the San Rafael Swell, Utah
Delaney, P.T.; Gartner, A.E.
1997-01-01
Some 200 shonkinite dikes, sills, and breccia bodies on the western Colorado Plateau of south-central Utah were intruded from approximately 3.7 to 4.6 Ma, contemporaneous with mafic volcanism along the nearby plateau margin. Thicknesses of dikes range to about 6 m; the log-normal mean thickness is 85 cm. Despite the excellent exposures of essentially all dikes in strata of the Jurassic San Rafael Group, their number is indeterminate from their outcrop and spacing because they are everywhere greatly segmented. By our grouping of almost 2000 dike segments, most dikes are less than 2 km in outcrop length; the longest is 9 km. Because the San Rafael magmas were primitive and probably ascended directly from the mantle, dike lengths in outcrop are much less than their heights. The present exposures probably lie along the irregular upper peripheries of dikes that lengthen and merge with depth. Orientations of steps on dike contacts record local directions of dike-fracture propagation; about half of the measurements plunge less than 30°, showing that lateral propagation at dike peripheries is as important as the vertical propagation ultimately responsible for ascent. The San Rafael dikes, now exposed after erosion of about 0.5-1.5 km, appear to thicken and shorten upward, probably because near-surface vesiculation enhanced magmatic driving pressures. Propagation likely ceased soon after the first dike segments began to feed nearby sills or vented to initiate small-volume eruptions. Most of the dikes are exposed in clastic strata of the Jurassic San Rafael Group. They probably acquired their strikes, however, while ascending along well-developed joints in massive sandstones of the underlying Glen Canyon Group. Rotation of far-field stresses during the emplacement interval cannot account for the disparate strikes of the dikes, which vary through 110°, most lying between north and N25°W. Rather, the two regional horizontal principal stresses were probably nearly equal, and so the dominant N75°E direction of dike opening was not strongly favored. Across the center of the swarm, about 10 to 15 dikes overlap and produce 15-20 m of dilation. Many are in sufficient proximity that later dikes should be thinner than earlier ones if neither the magma pressures nor regional stresses were changing during the emplacement interval. However, dike thicknesses vary systematically neither along the length of the swarm nor in proportion to the number of neighboring dikes. It appears that crustal extension during the magmatic interval relieved compressive stresses localized by intrusion.
Forecasting Lightning at Kennedy Space Center/Cape Canaveral Air Force Station, Florida
NASA Technical Reports Server (NTRS)
Lambert, Winfred; Wheeler, Mark; Roeder, William
2005-01-01
The Applied Meteorology Unit (AMU) developed a set of statistical forecast equations that provide a probability of lightning occurrence on Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) for the day during the warm season (May-September). The 45th Weather Squadron (45 WS) forecasters at CCAFS in Florida include a probability of lightning occurrence in their daily 24-hour and weekly planning forecasts, which are briefed at 1100 UTC (0700 EDT). This information is used for general scheduling of operations at CCAFS and KSC. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts for the KSC/CCAFS area during Shuttle flight operations. Much of the current lightning probability forecast at both groups is based on a subjective analysis of model and observational data. The objective tool currently available is the Neumann-Pfeffer Thunderstorm Index (NPTI, Neumann 1971), developed specifically for the KSC/CCAFS area over 30 years ago. However, recent studies have shown that 1-day persistence provides a better forecast than the NPTI, indicating that the NPTI needed to be upgraded or replaced. Because they require a tool that provides a reliable estimate of the daily thunderstorm probability, the 45 WS forecasters requested that the AMU develop a new lightning probability forecast tool using recent data and more sophisticated techniques, now possible through more computing power than that available over 30 years ago. The equation development incorporated results from two research projects that investigated causes of lightning occurrence near KSC/CCAFS and over the Florida peninsula. One proved that logistic regression outperformed the linear regression method used in the NPTI, even when the same predictors were used. The other study found relationships between large-scale flow regimes and spatial lightning distributions over Florida. Lightning probabilities based on these flow regimes were used as candidate predictors in the equation development. Fifteen years (1989-2003) of warm season data were used to develop the forecast equations. The data sources included a local network of cloud-to-ground lightning sensors called the Cloud-to-Ground Lightning Surveillance System (CGLSS), 1200 UTC Florida synoptic soundings, and the 1000 UTC CCAFS sounding. Data from CGLSS were used to determine lightning occurrence for each day. The 1200 UTC soundings were used to calculate the synoptic-scale flow regimes, and the 1000 UTC soundings were used to calculate local stability parameters, which were used as candidate predictors of lightning occurrence. Five logistic regression forecast equations were created through careful selection and elimination of the candidate predictors. The resulting equations contain five to six predictors each. Results from four performance tests indicated that the equations showed an increase in skill over several standard forecasting methods, good reliability, an ability to distinguish between non-lightning and lightning days, and good accuracy measures and skill scores. Given the overall good performance, the 45 WS requested that the equations be transitioned to operations and added to the current set of tools used to determine the daily lightning probability of occurrence.
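A minimal sketch of the logistic regression approach on synthetic data: the predictors stand in for the kinds of candidates described above (a flow-regime lightning climatology, a sounding-derived stability index, and 1-day persistence), and all coefficients and values are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for the candidate predictors; coefficients invented.
rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([
    rng.uniform(0.1, 0.7, n),            # flow-regime lightning climatology
    rng.normal(25.0, 8.0, n),            # sounding stability index
    rng.integers(0, 2, n).astype(float), # 1-day persistence flag
])
logit = -6.0 + 4.0 * X[:, 0] + 0.15 * X[:, 1] + 0.8 * X[:, 2]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))   # lightning day (yes/no)

model = LogisticRegression().fit(X, y)
today = np.array([[0.45, 31.0, 1.0]])               # today's predictor values
print(f"P(lightning today) = {model.predict_proba(today)[0, 1]:.2f}")
```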
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.
2011-01-01
A new technique has been developed to estimate the probability that a nearby cloud-to-ground lightning stroke was within a specified radius of any point of interest. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
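A sketch of the underlying computation under the stated model: integrate the bivariate Gaussian density of the location error ellipse over a disk of radius R centred on the point of interest, which need not coincide with the ellipse centre. All parameter values are illustrative.

```python
import numpy as np
from scipy import integrate

def prob_within_radius(cx, cy, R, mx, my, sx, sy, rho=0.0):
    """P(stroke within radius R of point (cx, cy)), where the stroke location
    is bivariate Gaussian with mean (mx, my), sigmas (sx, sy), and correlation
    rho, i.e. the lightning location error ellipse. Integrates the density in
    polar coordinates centred on the point of interest."""
    det = sx**2 * sy**2 * (1 - rho**2)
    norm = 1.0 / (2 * np.pi * np.sqrt(det))

    def pdf_polar(r, theta):
        x = cx + r * np.cos(theta) - mx
        y = cy + r * np.sin(theta) - my
        q = (x**2 / sx**2 - 2 * rho * x * y / (sx * sy)
             + y**2 / sy**2) / (1 - rho**2)
        return norm * np.exp(-0.5 * q) * r   # trailing r: polar Jacobian

    p, _ = integrate.dblquad(pdf_polar, 0, 2 * np.pi, 0, R)
    return p

# Stroke ellipse centred 0.3 km east of a point of interest; within 0.5 km?
print(f"{prob_within_radius(0, 0, 0.5, 0.3, 0.0, 0.25, 0.15):.3f}")
```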
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1990-01-01
The Weibull process, identified as the inhomogeneous Poisson process with the Weibull intensity function, is used to model the reliability growth assessment of the space shuttle main engine test and flight failure data. Additional tables of percentage-point probabilities for several different values of the confidence coefficient have been generated for setting (1-alpha)100-percent two-sided confidence interval estimates on the mean time between failures. The tabled data pertain to two cases: (1) time-terminated testing, and (2) failure-terminated testing. The critical values of the three test statistics, namely Cramer-von Mises, Kolmogorov-Smirnov, and chi-square, were calculated and tabled for use in the goodness-of-fit tests for the engine reliability data. Numerical results are presented for five different groupings of the engine data that reflect the actual response to the failures.
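The Weibull-intensity Poisson process above is commonly known as the Crow-AMSAA reliability-growth model; a minimal sketch of its time-truncated maximum likelihood point estimates follows (the paper's percentage-point tables for confidence intervals are not reproduced here). Failure times are invented.

```python
import math

def crow_amsaa_mle(times, T):
    """MLE for the Weibull-intensity (Crow-AMSAA) reliability-growth model,
    time-truncated at T. Intensity u(t) = lam * beta * t**(beta - 1).
    Returns (beta, lam, instantaneous MTBF at T)."""
    n = len(times)
    beta = n / sum(math.log(T / t) for t in times)
    lam = n / T**beta
    mtbf = 1.0 / (lam * beta * T ** (beta - 1))
    return beta, lam, mtbf

# Cumulative failure times (hours) of an engine under test -- toy data.
failures = [33, 76, 145, 347, 555, 811, 1212, 1499]
beta, lam, mtbf = crow_amsaa_mle(failures, T=1600.0)
print(f"beta = {beta:.2f} (beta < 1 indicates reliability growth)")
print(f"instantaneous MTBF at 1600 h = {mtbf:.0f} h")
```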
Miovský, Michal; Vonkova, Hana; Čablová, Lenka; Gabrhelík, Roman
2015-11-01
To study the effect of a universal prevention intervention targeting cannabis use in individual children with different risk profiles. A school-based randomized controlled prevention trial was conducted over a period of 33 months (n=1874 sixth-graders, baseline mean age 11.82). We used a two-level random intercept logistic model for panel data to predict the probabilities of cannabis use for each child. Specifically, we used eight risk/protective factors to characterize each child and then predicted two probabilities of cannabis use for each child if the child had the intervention or not. Using the two probabilities, we calculated the absolute and relative effect of the intervention for each child. According to the two probabilities, we also divided the sample into a low-risk group (the quarter of the children with the lowest probabilities), a moderate-risk group, and a high-risk group (the quarter of the children with the highest probabilities) and showed the average effect of the intervention on these groups. The differences between the intervention group and the control group were statistically significant in each risk group. The average predicted probabilities of cannabis use for a child from the low-risk group were 4.3% if the child had the intervention and 6.53% if no intervention was provided. The corresponding probabilities for a child from the moderate-risk group were 10.91% and 15.34% and for a child from the high-risk group 25.51% and 32.61%. School grades, thoughts of hurting oneself, and breaking the rules were the three most important factors distinguishing high-risk and low-risk children. We predicted the effect of the intervention on individual children, characterized by their risk/protective factors. The predicted absolute effect and relative effect of any intervention for any selected risk/protective profile of a given child may be utilized in both prevention practice and research. Copyright © 2015 Elsevier Ltd. All rights reserved.
1981-06-01
...for a detection probability of P_D and associated false alarm probability P_FA (in dB). ... In order to ... space for which to choose H_1: P_FA = ∫ P(w|H_0) dw = Q(·) (26). Similarly, the miss probability = 1 − detection probability is obtained by integrating ... (31). The input signal-to-noise ratio: S/N(input) = a² (32). The probability of false alarm: P_FA = Q[·] (33).
Near-Space Science: A Ballooning Project to Engage Students with Space beyond the Big Screen
ERIC Educational Resources Information Center
Hike, Nina; Beck-Winchatz, Bernhard
2015-01-01
Many students probably know something about space from playing computer games or watching movies and TV shows. Teachers can expose them to the real thing by launching their experiments into near space on a weather balloon. This article describes how to use high-altitude ballooning (HAB) as a culminating project to a chemistry unit on experimental…
NASA Technical Reports Server (NTRS)
Lambert, Winifred; Roeder, William
2008-01-01
This conference presentation describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.
NASA Technical Reports Server (NTRS)
Lambert, Winifred; Roeder, William
2013-01-01
This conference poster describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.
Robustness of chimera states in complex dynamical systems
Yao, Nan; Huang, Zi-Gang; Lai, Ying-Cheng; Zheng, Zhi-Gang
2013-01-01
The remarkable phenomenon of the chimera state in systems of non-locally coupled, identical oscillators has attracted a great deal of recent theoretical and experimental interest. In such a state, different groups of oscillators can exhibit characteristically distinct types of dynamical behavior, in spite of the identity of the oscillators. But how robust are chimera states against random perturbations to the structure of the underlying network? We address this fundamental issue by studying the effects of random removal of links on the probability of chimera states. Using direct numerical calculations and two independent theoretical approaches, we find that the likelihood of a chimera state decreases with the probability of random link removal. A striking finding is that, even when a large number of links are removed so that chimera states are deemed not possible, the state space generally still contains both coherent and incoherent regions. The chimera state regime is a particular case in which the oscillators in the coherent region happen to be synchronized or phase-locked. PMID:24343533
NASA Astrophysics Data System (ADS)
Malicet, Dominique
2017-12-01
In this paper, we study random walks {g_n=f_{n-1}\\ldots f_0} on the group Homeo ( S 1) of the homeomorphisms of the circle, where the homeomorphisms f k are chosen randomly, independently, with respect to a same probability measure {ν}. We prove that under the only condition that there is no probability measure invariant by {ν}-almost every homeomorphism, the random walk almost surely contracts small intervals. It generalizes what has been known on this subject until now, since various conditions on {ν} were imposed in order to get the phenomenon of contractions. Moreover, we obtain the surprising fact that the rate of contraction is exponential, even in the lack of assumptions of smoothness on the f k 's. We deduce various dynamical consequences on the random walk ( g n ): finiteness of ergodic stationary measures, distribution of the trajectories, asymptotic law of the evaluations, etc. The proof of the main result is based on a modification of the Ávila-Viana's invariance principle, working for continuous cocycles on a space fibred in circles.
SHU, Jian-Jun; WANG, Qi-Wen
2014-01-01
Parrondo's paradox is a counterintuitive phenomenon in which individually losing strategies can be combined to produce a winning expectation. In this paper, the issues surrounding Parrondo's paradox are investigated. The focus lies on testing whether the same paradoxical effect can be reproduced by using a simple capital-dependent game. The paradoxical effect generated by Parrondo's paradox can be explained by placing all the parameters in one probability space. Based on this framework, it is possible to generate other paradoxical effects by manipulating the parameters in the probability space. PMID:24577586
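A minimal simulation sketch of the standard capital-dependent Parrondo pair of games, with the commonly used parameters: game A is a fair coin biased down by a small ε, game B's coin depends on whether the capital is divisible by 3, and randomly mixing the two losing games yields a winning expectation.

```python
import random

def play(n_steps=100_000, strategy="random", eps=0.005, seed=11):
    """Simulate Parrondo's games. Game A: fair coin minus eps. Game B
    (capital-dependent): bad coin when capital % 3 == 0, good coin otherwise.
    Played alone, A and B both lose; randomly mixing them wins on average."""
    rng = random.Random(seed)
    capital = 0
    for _ in range(n_steps):
        game = rng.choice("AB") if strategy == "random" else strategy
        if game == "A":
            p = 0.5 - eps
        else:  # game B
            p = (0.10 - eps) if capital % 3 == 0 else (0.75 - eps)
        capital += 1 if rng.random() < p else -1
    return capital

for s in ("A", "B", "random"):
    print(s, play(strategy=s))   # A and B drift down; the mixture drifts up
```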
Planetary quarantine. Space research and technology
NASA Technical Reports Server (NTRS)
1973-01-01
Planetary quarantine strategies for advanced spacecraft consider effects of satellite encounter, Jupiter atmosphere entry, space radiation, and cleaning and decontamination techniques on microbiological growth probability. Analytical restructuring is developed for microbial burden prediction and planetary contamination.
NASA Astrophysics Data System (ADS)
Tomic, A. T.
2018-04-01
I do not believe that humans in their current form are able to reach interstellar distances. Genetics and cyber-biology will probably come up with solutions, but until then we can still try to grow plants in space.
Modeling the Offensive-Defensive Interaction and Resulting Outcomes in Basketball.
Lamas, Leonardo; Santana, Felipe; Heiner, Matthew; Ugrinowitsch, Carlos; Fellingham, Gilbert
2015-01-01
We analyzed the interaction between offensive (i.e., space creation dynamics, SCDs) and defensive (i.e., space protection dynamics, SPDs) actions in six play outcomes (free shot, contested shot, new SCD, reset, foul, and turnover) in Spanish professional basketball games. Data consisted of 1548 SCD-SPD-outcome triples obtained from six play-off games. We used Bayesian methods to compute marginal probabilities of the six outcomes following five different SCDs. We also computed probabilities of the six outcomes following the 16 most frequent SCD-SPD combinations. The pick action (e.g., pick and roll, pick and pop) was the most prevalent SCD (33%). However, this SCD did not produce the highest probability of a free shot (0.235). The highest probability of a free shot followed the SCD without ball (0.409). The pick was performed not only to attempt scoring but also to initiate offenses, as it produced the highest probability of leading to a new SCD (0.403). Additionally, the SPD performed influenced the outcome of the SCD. This reinforces the notion that the opposition (offensive-defensive interaction) should be considered. To the best of our knowledge, in team sports, this is the first study to successfully model the tactical features involved in offense-defense interactions. Our analyses revealed that the high frequency of occurrence of some SCDs may be justified not only by an associated high probability of free shots but also by the possibility of progressively creating more space in the defense (i.e., a new SCD as outcome). In the latter case, this evidences the offensive strategy of progressively disrupting the defensive system through the concatenation of subsequent offensive actions.
The relationship between species detection probability and local extinction probability
Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.
2004-01-01
In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
Assessment of Group Preferences and Group Uncertainty for Decision Making
1976-06-01
[OCR-garbled excerpt from the report; recoverable fragments only: group judgments should generally be preferred to individual judgments (the qualifying clause on cost is truncated); aggregation using conjugate distributions is described as another procedure for combining individual probability judgments into a group judgment. Indexing keywords: statisticized group, group decision making, group judgment, subjective probability, Delphi method, expected utility, nominal group.]
Gao, Xiaolei; Wei, Jianjian; Lei, Hao; Xu, Pengcheng; Cowling, Benjamin J; Li, Yuguo
2016-01-01
Emerging diseases may spread rapidly through dense and large urban contact networks, especially if they are transmitted by the airborne route, before new vaccines can be made available. Airborne diseases may spread rapidly as people visit different indoor environments and are in frequent contact with others. We constructed a simple indoor contact model for an ideal city with 7 million people and 3 million indoor spaces, and estimated the probability and duration of contact between any two individuals during one day. To do this, we used data from actual censuses, social behavior surveys, building surveys, and ventilation measurements in Hong Kong to define eight population groups and seven indoor location groups. Our indoor contact model was integrated with an existing epidemiological Susceptible, Exposed, Infectious, and Recovered (SEIR) model to estimate disease spread and with the Wells-Riley equation to calculate local infection risks, resulting in an integrated indoor transmission network model. This model was used to estimate the probability of an infected individual infecting others in the city and to study the disease transmission dynamics. We predicted the infection probability of each sub-population under different ventilation systems in each location type in the case of a hypothetical airborne disease outbreak, assumed to have the same natural history and infectiousness as smallpox. We compared the effectiveness of controlling ventilation in each location type with other intervention strategies. We conclude that increasing building ventilation rates using methods such as natural ventilation in classrooms, offices, and homes is a relatively effective strategy for airborne diseases in a large city.
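The Wells-Riley equation used above for local infection risk has a standard closed form, P = 1 - exp(-Iqpt/Q). A minimal sketch follows; all parameter values are illustrative assumptions, not the study's Hong Kong inputs.

```python
import math

def wells_riley(n_infectors, quanta_rate, breathing_rate, exposure_h, vent_rate):
    """Wells-Riley probability of infection in a shared indoor space.

    n_infectors    -- number of infectious occupants, I
    quanta_rate    -- quanta generated per infector per hour, q
    breathing_rate -- pulmonary ventilation of a susceptible (m^3/h), p
    exposure_h     -- exposure duration (hours), t
    vent_rate      -- room outdoor-air ventilation rate (m^3/h), Q
    """
    return 1.0 - math.exp(-n_infectors * quanta_rate * breathing_rate
                          * exposure_h / vent_rate)

# Illustrative values only: one infector in a classroom for two hours
print(wells_riley(1, 100.0, 0.48, 2.0, 500.0))
```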
Prescription Drug Misuse Among Club Drug-Using Young Adults
Kelly, Brian C.; Parsons, Jeffrey T.
2009-01-01
Nonmedical prescription (Rx) drug use has recently increased, particularly among young adults. Using time-space sampling to generate a probability-based sample of club-going young adults (18–29), 400 subjects provided data on Rx drug misuse. Club-going young adults misuse Rx drugs at high rates. An overwhelming majority of the sample indicated lifetime use of pain killers, sedatives, and stimulants. A majority indicated recent pain killer use. Variations by gender and sexuality exist in this population. Young lesbian/bisexual women emerged as the group most likely to abuse Rx drugs. Research into the contexts influencing these patterns is imperative. PMID:17994483
Crystallography, chemistry and structural disorder in the new high-Tc Bi-Ca-Sr-Cu-O superconductor
NASA Technical Reports Server (NTRS)
Veblen, D. R.; Heaney, P. J.; Angel, R. J.; Finger, L. W.; Hazen, R. M.
1988-01-01
Diffraction experiments are reported which indicate that the new Bi-Ca-Sr-Cu-O layer-structure superconductor possesses a primitive orthorhombic unit cell with probable space group Pnnn. The material exhibits severe structural disorder which is primarily related to stacking within the layers. The apparent orthorhombic structure is an average resulting from orthorhombic material mixed with monoclinic domains in two twinned orientations. Two distinct types of structural disorder that are common in materials synthesized to date are also described. This disorder complicates the crystallographic analysis and suggests that X-ray and neutron diffraction methods may yield only an average structure.
Mayer control problem with probabilistic uncertainty on initial positions
NASA Astrophysics Data System (ADS)
Marigonda, Antonio; Quincampoix, Marc
2018-03-01
In this paper we introduce and study an optimal control problem in Mayer form in the space of probability measures on R^n endowed with the Wasserstein distance. Our aim is to study optimality conditions when the knowledge of the initial state and velocity is subject to some uncertainty, modeled by a probability measure on R^d and by a vector-valued measure on R^d, respectively. We provide a characterization of the value function of such a problem as the unique solution of a Hamilton-Jacobi-Bellman equation in the space of measures in a suitable viscosity sense. An application to a pursuit-evasion game with uncertainty in the state space is also discussed, proving the existence of a value for the game.
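For orientation, a Mayer problem over measures is typically stated as the minimization of a terminal cost integrated against the evolved measure. The display below is one standard formulation under assumed notation (g the terminal cost, v_t the control velocity field); it is a sketch, not necessarily the paper's exact setting.

```latex
V(t_0,\mu_0) \;=\; \inf\Big\{ \int_{\mathbb{R}^n} g \,\mathrm{d}\mu_T \;:\;
  \partial_t \mu_t + \operatorname{div}(v_t\,\mu_t) = 0,\ \mu_{t_0} = \mu_0 \Big\},
```

where the infimum runs over admissible velocity fields v_t and the value function V is then characterized as a viscosity solution of a Hamilton-Jacobi-Bellman equation on the Wasserstein space (P_2(R^n), W_2).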
NASA Astrophysics Data System (ADS)
Kaur, Prabhmandeep; Jain, Virander Kumar; Kar, Subrat
2014-12-01
In this paper, we investigate the performance of a Free Space Optic (FSO) link considering the impairments caused by various weather conditions, such as very clear air, drizzle, haze and fog, and by turbulence in the atmosphere. An analytic expression for the outage probability is derived using the gamma-gamma distribution for turbulence and accounting for the effect of weather conditions using the Beer-Lambert law. The effect of receiver diversity schemes using aperture averaging and array receivers on the outage probability is studied and compared. As the aperture diameter is increased, the outage probability decreases irrespective of the turbulence strength (weak, moderate and strong) and weather conditions. Similar effects are observed when the number of direct detection receivers in the array is increased. However, it is seen that as the desired outage probability decreases (i.e. the performance requirement becomes more stringent), the array receiver becomes the preferred choice compared to the receiver with aperture averaging.
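A hedged Monte Carlo sketch of the two ingredients named above: gamma-gamma turbulence fading, sampled here as the product of two unit-mean gamma variates, combined with deterministic Beer-Lambert weather attenuation. The paper derives a closed-form expression instead, and every number below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def outage_probability(alpha, beta, atten_coeff, link_km, thresh, n=1_000_000):
    """Monte Carlo outage probability for a single FSO link.

    Turbulence: gamma-gamma irradiance as the product of two unit-mean
    gamma variates with shape parameters alpha and beta.
    Weather: Beer-Lambert attenuation exp(-atten_coeff * link_km).
    """
    x = rng.gamma(alpha, 1.0 / alpha, n)
    y = rng.gamma(beta, 1.0 / beta, n)
    irradiance = x * y * np.exp(-atten_coeff * link_km)
    # Outage: normalized received irradiance falls below the threshold
    return np.mean(irradiance < thresh)

# Moderate turbulence, hazy air, 1 km link, normalized threshold 0.1
print(outage_probability(alpha=4.0, beta=1.9, atten_coeff=0.9,
                         link_km=1.0, thresh=0.1))
```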
Local regularity for time-dependent tug-of-war games with varying probabilities
NASA Astrophysics Data System (ADS)
Parviainen, Mikko; Ruosteenoja, Eero
2016-07-01
We study local regularity properties of value functions of time-dependent tug-of-war games. For games with constant probabilities we get local Lipschitz continuity. For more general games with probabilities depending on space and time we obtain Hölder and Harnack estimates. The games have a connection to the normalized p(x,t)-parabolic equation u_t = Δu + (p(x,t) - 2) Δ_∞^N u.
Space agencies' scientific roadmaps need harmonisation and regular re-assessment
NASA Astrophysics Data System (ADS)
Worms, Jean-Claude; Culhane, J. Leonard; Walter, Nicolas; Swings, Jean-Pierre; Detsis, Emmanouil
The need to consider international collaboration in the exploration of space has been recognised since the dawn of the space age in 1957. Since then, international collaboration has been the main operational working mode amongst space scientists the world over, setting aside national pre-eminence and other political arguments. COSPAR itself was created as a tool for scientists to maintain the dialogue at the time of the cold war. Similarly, the inherent constraints of the field (cost, complexity, time span) have led space agencies to try and coordinate their efforts. As a result many - if not all - of the key space science missions since the 1960s have been collaborative by nature. Different collaboration models have existed with varying success, and the corresponding lessons learned have been assessed through various fora and reports. For various reasons whose scope has broadened since that time (use of space in other domains such as Earth observation, telecommunication and navigation; emergence of commercial space activities; increased public appeal and capacity to motivate the young generation to engage in related careers), the importance of international collaboration in space has never faltered and coordination among spacefaring nations has become the norm. However, programme harmonisation is often found to be lacking, and duplication of efforts sometimes happens due to different planning and decision procedures, programmatic timelines or budgetary constraints. Previous studies, in particular by the European ESSC-ESF with input from the US NAS-SSB, advocated the need to establish a coordinating body involving major space agencies to address these coordination issues in a systematic and harmonious way. Since then, and in line with this recommendation, the International Space Exploration Coordination Group (ISECG) of 14 space agencies was created in 2007 and published a first roadmap to advance a “Global Exploration Strategy”. ISECG is non-binding, though, and recent examples of lack of coordination in international planning probably indicate that this should be brought to a higher, more systematic level of coordination. More recently, discussions at the ISECG level have led this forum to envisage setting up a Science Working Group to inform ISECG on ways to better coordinate the “…interaction between the exploration community…” (i.e. agencies) and the “…scientific community”. Following the recommendations by ESSC-ESF, a rational and systematic harmonisation of agencies’ scientific roadmaps should be undertaken on a regular basis (ideally annually), through an inter-agency scientific collaboration working group, which would include agency executives but also scientific membership chosen after appropriate consultation. The ISECG Science Working Group could serve as an embryo of this inter-agency body. The presentation will offer prospects for the establishment of such a body and suggestions on its operating mode.
Space-time modelling of lightning-caused ignitions in the Blue Mountains, Oregon
Diaz-Avalos, Carlos; Peterson, D.L.; Alvarado, Ernesto; Ferguson, Sue A.; Besag, Julian E.
2001-01-01
Generalized linear mixed models (GLMM) were used to study the effect of vegetation cover, elevation, slope, and precipitation on the probability of ignition in the Blue Mountains, Oregon, and to estimate the probability of ignition occurrence at different locations in space and in time. Data on starting locations of lightning-caused ignitions in the Blue Mountains between April 1986 and September 1993 formed the basis for the analysis. The study area was divided into a pixel-time array. For each pixel-time location we associated a value of 1 if at least one ignition occurred and 0 otherwise. Covariate information for each pixel was obtained using a geographic information system. The GLMMs were fitted in a Bayesian framework. Higher ignition probabilities were associated with the following cover types: subalpine herbaceous, alpine tundra, lodgepole pine (Pinus contorta Dougl. ex Loud.), whitebark pine (Pinus albicaulis Engelm.), Engelmann spruce (Picea engelmannii Parry ex Engelm.), subalpine fir (Abies lasiocarpa (Hook.) Nutt.), and grand fir (Abies grandis (Dougl.) Lindl.). Within each vegetation type, higher ignition probabilities occurred at lower elevations. Additionally, ignition probabilities are lower in the northern and southern extremes of the Blue Mountains. The GLMM procedure used here is suitable for analysing ignition occurrence in other forested regions where probabilities of ignition are highly variable because of a spatially complex biophysical environment.
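To make the model family concrete: a Bernoulli GLMM of this kind links each pixel-time ignition indicator to covariates through a logit. The fragment below sketches only the fixed-effects part with made-up coefficients; the study itself fits random effects in a Bayesian framework, which this toy omits.

```python
import numpy as np

def expit(z):
    """Inverse logit link: maps a linear predictor to a probability."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted coefficients: intercept, elevation (km), slope (degrees)
beta = np.array([-4.0, -0.8, 0.05])

def ignition_probability(elevation_km, slope_deg):
    """Fixed-effects-only probability that a pixel-time cell has an ignition."""
    x = np.array([1.0, elevation_km, slope_deg])
    return expit(x @ beta)

print(ignition_probability(1.2, 10.0))
```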
Exploration Health Risks: Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley
2006-01-01
Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest-ranking human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.
[Yanomami children's nutritional status in the middle Rio Negro, Brazilian Amazônia].
Istria, Jacques; Gazin, Pierre
2002-01-01
The nutritional status of 290 Yanomami Amerindian children, from birth to about six years of age, living in the middle Rio Negro, Brazilian Amazonia, was studied in 1998 and 1999 using the weight-for-height index. All of them were of low stature. Twenty malnourished children (7%), defined as below two standard deviations of the NCHS reference data, were observed. Five of them showed severe malnutrition (≤ -3 SD). Differences appeared between the communities, however without evident connection with the practices of these groups or their contacts with the outside. These data indicate an absence of food scarcity in this population, which preserves a traditional way of life and has access to a large territory for gathering and hunting. The cases of malnutrition are probably a combined consequence of infectious episodes in the children and of a particularly poor status within their group.
Basis adaptation in homogeneous chaos spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tipireddy, Ramakrishna; Ghanem, Roger
2014-02-01
We present a new method for the characterization of subspaces associated with low-dimensional quantities of interest (QoI). The probability density function of these QoI is found to be concentrated around one-dimensional subspaces for which we develop projection operators. Our approach builds on the properties of Gaussian Hilbert spaces and associated tensor product spaces.
Going EVA Outside the Space Station on This Week @NASA – January 26, 2018
2018-01-26
The first space station spacewalk of the new year, launching GOLD to study Earth’s near-space environment, and – read all about it … there’s NASA tech you probably use every day … a few of the stories to tell you about – This Week at NASA!
2002 Commercial Space Transportation Lecture Series, volumes 1,2, and 3
DOT National Transportation Integrated Search
2003-04-01
This document includes three presentations which are part of the 2002 Commercial Space Transportation Lecture Series: The Early Years, AST - A Historical Perspective; Approval of Reentry Vehicles; and, Setting Insurance Requirements: Maximum Probable...
An adaptive threshold detector and channel parameter estimator for deep space optical communications
NASA Technical Reports Server (NTRS)
Arabshahi, P.; Mukai, R.; Yan, T. -Y.
2001-01-01
This paper presents a method for optimal adaptive setting of pulse-position-modulation pulse detection thresholds, which minimizes the total probability of error for the dynamically fading free-space optical channel.
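One plausible reading of adaptive threshold setting is a search for the decision level that minimizes the total error probability, i.e. the prior-weighted sum of false-alarm and miss probabilities. The sketch below does this by grid search under assumed Gaussian slot statistics; the receiver model and channel statistics in the paper may well differ.

```python
import numpy as np
from math import erf, sqrt

def gauss_cdf(x, mu, sigma):
    """Cumulative distribution of a normal variate at x."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def best_threshold(mu0, s0, mu1, s1, p1, n_grid=2001):
    """Grid-search the detection threshold minimizing total error probability.

    mu0/s0: noise-slot statistics; mu1/s1: pulse-slot statistics;
    p1: prior probability that a slot contains the pulse.
    """
    grid = np.linspace(mu0, mu1, n_grid)
    p0 = 1.0 - p1
    # False alarm: a noise slot exceeds t. Miss: the pulse slot falls below t.
    pe = [p0 * (1.0 - gauss_cdf(t, mu0, s0)) + p1 * gauss_cdf(t, mu1, s1)
          for t in grid]
    i = int(np.argmin(pe))
    return grid[i], pe[i]

print(best_threshold(mu0=1.0, s0=0.5, mu1=4.0, s1=1.0, p1=0.25))
```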
2010-01-01
As part of our effort to increase survival of drug candidates and to move our medicinal chemistry design to higher probability space for success in the Neuroscience therapeutic area, we embarked on a detailed study of the property space for a collection of central nervous system (CNS) molecules. We carried out a thorough analysis of properties for 119 marketed CNS drugs and a set of 108 Pfizer CNS candidates. In particular, we focused on understanding the relationships between physicochemical properties, in vitro ADME (absorption, distribution, metabolism, and elimination) attributes, primary pharmacology binding efficiencies, and in vitro safety data for these two sets of compounds. This scholarship provides guidance for the design of CNS molecules in a property space with increased probability of success and may lead to the identification of druglike candidates with favorable safety profiles that can successfully test hypotheses in the clinic. PMID:22778836
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Noterdaeme, J. M.; Max-Planck-Institut für Plasmaphysik, Garching D-85748
2014-11-15
Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.
Augmenting Phase Space Quantization to Introduce Additional Physical Effects
NASA Astrophysics Data System (ADS)
Robbins, Matthew P. G.
Quantum mechanics can be done using classical phase space functions and a star product. The state of the system is described by a quasi-probability distribution. A classical system can be quantized in phase space in different ways with different quasi-probability distributions and star products. A transition differential operator relates different phase space quantizations. The objective of this thesis is to introduce additional physical effects into the process of quantization by using the transition operator. As prototypical examples, we first look at the coarse-graining of the Wigner function and the damped simple harmonic oscillator. By generalizing the transition operator and star product to also be functions of the position and momentum, we show that additional physical features beyond damping and coarse-graining can be introduced into a quantum system, including the generalized uncertainty principle of quantum gravity phenomenology, driving forces, and decoherence.
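As a concrete anchor for the formalism sketched above, the prototypical star product is the Moyal product associated with the Wigner quasi-probability distribution, shown below in standard textbook notation (the thesis generalizes to position- and momentum-dependent products, which this display does not capture).

```latex
(f \star g)(x,p) \;=\; f(x,p)\,
  \exp\!\Big[\tfrac{i\hbar}{2}\big(\overleftarrow{\partial}_x\overrightarrow{\partial}_p
  - \overleftarrow{\partial}_p\overrightarrow{\partial}_x\big)\Big]\, g(x,p).
```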
Space debris detection in optical image sequences.
Xi, Jiangbo; Wen, Desheng; Ersoy, Okan K; Yi, Hongwei; Yao, Dalei; Song, Zongxi; Xi, Shaobo
2016-10-01
We present a high-accuracy, low false-alarm rate, and low computational-cost methodology for removing stars and noise and detecting space debris with low signal-to-noise ratio (SNR) in optical image sequences. First, time-index filtering and bright star intensity enhancement are implemented to remove stars and noise effectively. Then, a multistage quasi-hypothesis-testing method is proposed to detect the pieces of space debris with continuous and discontinuous trajectories. For this purpose, a time-index image is defined and generated. Experimental results show that the proposed method can detect space debris effectively without any false alarms. When the SNR is higher than or equal to 1.5, the detection probability can reach 100%, and when the SNR is as low as 1.3, 1.2, and 1, it can still achieve 99%, 97%, and 85% detection probabilities, respectively. Additionally, two large sets of image sequences are tested to show that the proposed method performs stably and effectively.
Patiño-Martinez, Juan; Marco, Adolfo; Quiñones, Liliana; Calabuig, Cecilia P
2010-09-01
Hatchling emergence to the beach surface from deep sand nests occurs without parental care. Social behaviour among siblings is crucial to overcoming this first challenge in a sea turtle's life. This study, carried out on the Caribbean coast of Colombia, describes the social behaviour of hatchlings emerging from eight nests and assesses the effects of nest translocation on temporal patterns of emergence. For the first time, we propose that the space released by dehydration of shelled albumen globes (SAGs) at the top of the clutch might be a reproductive advantage, as it facilitates the neonates grouping together in a very limited space and favours the synchrony of emergence. The mean time of group emergence was 3.3 days, varying between 1 and 6 days. We found that relocation of the nests did not significantly affect the temporal pattern of emergence, which was mainly nocturnal (77.7% of natural nests and 81.7% of translocated ones). The maximum number of emergences to the surface occurred at the lowest air temperatures (22:00h-06:00h). The selective advantage of this pattern is probably related to the greater rate of predation and mortality by hyperthermia observed during the day.
Classical-Quantum Correspondence by Means of Probability Densities
NASA Technical Reports Server (NTRS)
Vegas, Gabino Torres; Morales-Guzman, J. D.
1996-01-01
Within the framework of the recently introduced phase space representation of nonrelativistic quantum mechanics, we propose a Lagrangian from which the phase space Schrodinger equation can be derived. From that Lagrangian, the associated conservation equations, according to Noether's theorem, are obtained. This shows that one can analyze quantum systems completely in phase space, as is done in coordinate space, without additional complications.
Stargate GTM: Bridging Descriptor and Activity Spaces.
Gaspar, Héléna A; Baskin, Igor I; Marcou, Gilles; Horvath, Dragos; Varnek, Alexandre
2015-11-23
Predicting the activity profile of a molecule or discovering structures possessing a specific activity profile are two important goals in chemoinformatics, which could be achieved by bridging activity and molecular descriptor spaces. In this paper, we introduce the "Stargate" version of the Generative Topographic Mapping approach (S-GTM) in which two different multidimensional spaces (e.g., structural descriptor space and activity space) are linked through a common 2D latent space. In the S-GTM algorithm, the manifolds are trained simultaneously in two initial spaces using the probabilities in the 2D latent space calculated as a weighted geometric mean of probability distributions in both spaces. S-GTM has the following interesting features: (1) activities are involved during the training procedure; therefore, the method is supervised, unlike conventional GTM; (2) using molecular descriptors of a given compound as input, the model predicts a whole activity profile, and (3) using an activity profile as input, areas populated by relevant chemical structures can be detected. To assess the performance of S-GTM prediction models, a descriptor space (ISIDA descriptors) of a set of 1325 GPCR ligands was related to a B-dimensional (B = 1 or 8) activity space corresponding to pKi values for eight different targets. S-GTM outperforms conventional GTM for individual activities and performs similarly to the Lasso multitask learning algorithm, although it is still slightly less accurate than the Random Forest method.
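A minimal sketch of the combining rule described above, assuming the latent-space responsibilities from each space are available as row-normalized arrays; the weighting scheme and toy numbers are assumptions, not the published S-GTM implementation.

```python
import numpy as np

def combined_responsibilities(r_desc, r_act, w=0.5):
    """Combine per-node responsibilities from the descriptor and activity
    spaces by a weighted geometric mean, then renormalize over the 2D
    latent grid nodes.

    r_desc, r_act: arrays of shape (n_points, n_nodes) of probabilities.
    """
    r = (r_desc ** w) * (r_act ** (1.0 - w))
    return r / r.sum(axis=1, keepdims=True)

# Toy responsibilities for two compounds over three latent nodes
r1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
r2 = np.array([[0.5, 0.4, 0.1], [0.2, 0.2, 0.6]])
print(combined_responsibilities(r1, r2))
```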
Chen, Shyi-Ming; Chen, Shen-Wen
2015-03-01
In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
Faria Alves, Miguel; Ferreira, António Miguel; Cardoso, Gonçalo; Saraiva Lopes, Ricardo; Correia, Maria da Graça; Machado Gil, Victor
2013-03-01
The purpose of this study was to assess the change in theoretical probability of coronary artery disease (CAD) in patients with suspected CAD undergoing coronary CT angiography (CCTA) as a first-line test vs. patients who underwent CCTA after an exercise ECG. Pre- and post-test probabilities of CAD were assessed in 158 patients with suspected CAD undergoing dual-source CCTA as the first-line test (Group A) and in 134 in whom CCTA was performed after an exercise ECG (Group B). Pre-test probabilities were calculated based on age, gender and type of chest pain. Post-test probabilities were calculated according to Bayes' theorem. There were no significant differences between the groups regarding pre-test probability (median 23.5% [13.3-37.8] in group A vs. 20.5% [13.4-34.5] in group B; p=0.479). In group A, the percentage of patients with intermediate likelihood of disease (10-90%) was 90% before testing and 15% after CCTA (p<0.001), while in group B, it was 95% before testing, 87% after exercise ECG (p=NS), and 17% after CCTA (p<0.001). Unlike exercise testing, CCTA is able to reclassify the risk in the majority of patients with an intermediate probability of obstructive CAD. The use of CCTA as a first-line diagnostic test for CAD may be beneficial in this setting. Copyright © 2012 Sociedade Portuguesa de Cardiologia. Published by Elsevier España. All rights reserved.
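The Bayes' theorem step above is conveniently expressed with likelihood ratios: post-test odds equal pre-test odds times the likelihood ratio of the observed result. A minimal sketch, where the sensitivity and specificity values are illustrative assumptions rather than the study's figures:

```python
def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
    """Bayes' theorem via likelihood ratios for a binary diagnostic test."""
    odds = pre_test_p / (1.0 - pre_test_p)
    if positive:
        lr = sensitivity / (1.0 - specificity)       # positive likelihood ratio
    else:
        lr = (1.0 - sensitivity) / specificity       # negative likelihood ratio
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Illustrative: pre-test probability 23.5%, hypothetical test characteristics
print(post_test_probability(0.235, 0.95, 0.85, positive=False))
```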
NASA Astrophysics Data System (ADS)
Lee, K.; Moon, Y.; Lee, J.; Na, H.; Lee, K.
2013-12-01
We investigate the solar flare occurrence rate and daily flare probability in terms of the sunspot classification, supplemented with sunspot area and its changes. For this we use the NOAA active region data and GOES solar flare data for 15 years (from January 1996 to December 2010). We consider the most flare-productive 11 sunspot classes in the McIntosh sunspot group classification. Sunspot area and its changes can be a proxy of magnetic flux and its emergence/cancellation, respectively. We classify each sunspot group into two sub-groups by its area: 'Large' and 'Small'. In addition, for each group, we classify it into three sub-groups according to sunspot area changes: 'Decrease', 'Steady', and 'Increase'. As a result, in the case of compact groups, their flare occurrence rates and daily flare probabilities noticeably increase with sunspot group area. We also find that the flare occurrence rates and daily flare probabilities for the 'Increase' sub-groups are noticeably higher than those for the other sub-groups. In the case of the (M + X)-class flares in the 'Dkc' group, the flare occurrence rate of the 'Increase' sub-group is three times higher than that of the 'Steady' sub-group. The mean flare occurrence rates and flare probabilities for all sunspot groups increase in the order 'Decrease', 'Steady', 'Increase'. Our results statistically demonstrate that magnetic flux and its emergence enhance the occurrence of major solar flares.
On the forecasting the unfavorable periods in the technosphere by the space weather factors
NASA Astrophysics Data System (ADS)
Lyakhov, N. N.
2002-12-01
Considerable progress has been made in recent years in forecasting geomagnetic disturbances, with the necessary lead time, from solar activity phenomena. The possible relationship between violations of traffic safety terms (VTS) on the East Siberian Railway during 1986-1999 and space weather factors was investigated. The overall number of cases under consideration is 11575. Correlation and spectral analysis showed that the statistics of VTS are not random and that their character is probably shaped by space weather factors. A principal difference is noted between the rhythmics of VTS due to purely technical reasons (MECH; failures in mechanical systems) and of VTS caused by erroneous operator actions (MAN). An increase in the number of sudden storm commencements raises the probability of mistaken operator actions, whereas the probability of failures in mechanical systems increases with the number of quiet geomagnetic conditions. This, in turn, dictates different treatment of the ordered MECH and MAN data series when forecasting unfavourable periods, i.e. periods of increased risk of wrong decisions by technological process participants. Advances in geomagnetic forecasting techniques have made it possible to begin constructing systems for promptly informing interested organizations about unfavourable space weather factors.
Diverse Redundant Systems for Reliable Space Life Support
NASA Technical Reports Server (NTRS)
Jones, Harry W.
2015-01-01
Reliable life support systems are required for deep space missions. The probability of a fatal life support failure should be less than one in a thousand in a multi-year mission. It is far too expensive to develop a single system with such high reliability. Using three redundant units would require only that each have a failure probability of one in ten over the mission. Since the system development cost is inverse to the failure probability, this would cut cost by a factor of one hundred. Using replaceable subsystems instead of full systems would further cut cost. Using full sets of replaceable components improves reliability more than using complete systems as spares, since a set of components could repair many different failures instead of just one. Replaceable components would require more tools, space, and planning than full systems or replaceable subsystems. However, identical system redundancy cannot be relied on in practice. Common cause failures can disable all the identical redundant systems. Typical levels of common cause failures will defeat redundancy greater than two. Diverse redundant systems are required for reliable space life support. Three, four, or five diverse redundant systems could be needed for sufficient reliability. One system with lower level repair could be substituted for two diverse systems to save cost.
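The reliability arithmetic in the abstract can be checked in two lines: three independent units that each fail with probability one in ten give a joint failure probability of one in a thousand. The common-cause term added below is an assumed simplification, in the spirit of a beta-factor model, to show why identical redundancy saturates:

```python
# Independent identical redundancy: three units, each failing with p = 0.1
p_unit = 0.1
print(p_unit ** 3)                 # 0.001, i.e. one in a thousand over the mission

# A common-cause failure probability c puts a floor under the benefit:
# all three identical units can be disabled by the same cause at once.
c = 0.01
print(c + (1 - c) * p_unit ** 3)   # redundancy cannot push risk below ~c
```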
NASA Technical Reports Server (NTRS)
Lambert, WInifred; Roeder, William
2007-01-01
This conference presentation describes the development of a peak wind forecast tool to assist forecasters in determining the probability of violating launch commit criteria (LCC) at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) in east-central Florida. The peak winds are an important forecast element for both the Space Shuttle and Expendable Launch Vehicle (ELV) programs. The LCC define specific peak wind thresholds for each launch operation that cannot be exceeded in order to ensure the safety of the vehicle. The 45th Weather Squadron (45 WS) has found that peak winds are a challenging parameter to forecast, particularly in the cool season months of October through April. Based on the importance of forecasting peak winds, the 45 WS tasked the Applied Meteorology Unit (AMU) to develop a short-range peak-wind forecast tool to assist in forecasting LCC violations. The tool will include climatologies of the 5-minute mean and peak winds by month, hour, and direction, and probability distributions of the peak winds as a function of the 5-minute mean wind speeds.
A Peak Wind Probability Forecast Tool for Kennedy Space Center and Cape Canaveral Air Force Station
NASA Technical Reports Server (NTRS)
Crawford, Winifred; Roeder, William
2008-01-01
This conference abstract describes the development of a peak wind forecast tool to assist forecasters in determining the probability of violating launch commit criteria (LCC) at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) in east-central Florida. The peak winds are an important forecast element for both the Space Shuttle and Expendable Launch Vehicle (ELV) programs. The LCC define specific peak wind thresholds for each launch operation that cannot be exceeded in order to ensure the safety of the vehicle. The 45th Weather Squadron (45 WS) has found that peak winds are a challenging parameter to forecast, particularly in the cool season months of October through April. Based on the importance of forecasting peak winds, the 45 WS tasked the Applied Meteorology Unit (AMU) to develop a short-range peak-wind forecast tool to assist in forecasting LCC violations. The tool will include climatologies of the 5-minute mean and peak winds by month, hour, and direction, and probability distributions of the peak winds as a function of the 5-minute mean wind speeds.
Isotropic probability measures in infinite-dimensional spaces
NASA Technical Reports Server (NTRS)
Backus, George
1987-01-01
Let R be the real numbers, R^n the linear space of all real n-tuples, and R^∞ the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n : R^∞ → R^n be the projection operator with P_n(x) = (x_1, ..., x_n). Let p_∞ be a probability measure on the smallest sigma-ring of subsets of R^∞ which includes all of the cylinder sets P_n^{-1}(B_n), where B_n is an arbitrary Borel subset of R^n. Let p_n be the marginal distribution of p_∞ on R^n, so p_n(B_n) = p_∞(P_n^{-1}(B_n)) for each B_n. A measure on R^n is isotropic if it is invariant under all orthogonal transformations of R^n. All members of the set of all isotropic probability distributions on R^n are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.
NASA Astrophysics Data System (ADS)
Radha, J.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
A group-arrival feedback retrial queue with k optional stages of service and an orbital search policy is studied. If an arriving group of customers finds the server free, one customer from the group enters the first stage of service and the rest of the group join the orbit. After completion of the i-th stage of service, the customer under service may opt for the (i+1)-th stage with probability θ_i, may rejoin the orbit as a feedback customer with probability p_i, or may leave the system with probability q_i = 1 - p_i - θ_i for i = 1, 2, ..., k-1 and q_k = 1 - p_k. The busy server may break down due to the arrival of negative customers, and the service channel then fails for a short interval of time. At the completion of service or repair, the server searches for a customer in the orbit (if any) with probability α or remains idle with probability 1 - α. Using the supplementary variable method, the steady-state probability generating function for the system size and some system performance measures are discussed.
Probability techniques for reliability analysis of composite materials
NASA Technical Reports Server (NTRS)
Wetherhold, Robert C.; Ucci, Anthony M.
1994-01-01
Traditional design approaches for composite materials have employed deterministic criteria for failure analysis. New approaches are required to predict the reliability of composite structures since strengths and stresses may be random variables. This report will examine and compare methods used to evaluate the reliability of composite laminae. The two types of methods that will be evaluated are fast probability integration (FPI) methods and Monte Carlo methods. In these methods, reliability is formulated as the probability that an explicit function of random variables is less than a given constant. Using failure criteria developed for composite materials, a function of design variables can be generated which defines a 'failure surface' in probability space. A number of methods are available to evaluate the integration over the probability space bounded by this surface; this integration delivers the required reliability. The methods which will be evaluated are: the first order, second moment FPI methods; second order, second moment FPI methods; the simple Monte Carlo method; and an advanced Monte Carlo technique which utilizes importance sampling. The methods are compared for accuracy, efficiency, and for the conservatism of the reliability estimation. The methodology involved in determining the sensitivity of the reliability estimate to the design variables (strength distributions) and importance factors is also presented.
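A minimal sketch of the simple Monte Carlo variant named above: estimate the probability that an explicit limit-state function of random variables falls below zero. The strength and stress distributions here are illustrative assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_reliability(n=1_000_000):
    """Simple Monte Carlo estimate of reliability = 1 - P(g < 0),
    where the limit-state function g = strength - stress defines
    the failure surface. Distributions below are illustrative only."""
    strength = rng.normal(600.0, 40.0, n)   # MPa, assumed lamina strength
    stress = rng.normal(450.0, 60.0, n)     # MPa, assumed applied stress
    g = strength - stress
    p_fail = np.mean(g < 0.0)
    return 1.0 - p_fail, p_fail

print(mc_reliability())
```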
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Monterial, Mateusz; Clarke, Shaun
A Bayesian approach is proposed for pulse shape discrimination of photons and neutrons in liquid organic scintillators. Instead of drawing a decision boundary, each pulse is assigned a photon or neutron confidence probability. In addition, this allows for photon and neutron classification on an event-by-event basis. The sum of those confidence probabilities is used to estimate the number of photon and neutron instances in the data. An iterative scheme, similar to an expectation-maximization algorithm for Gaussian mixtures, is used to infer the ratio of photons-to-neutrons in each measurement. Therefore, the probability space adapts to data with varying photon-to-neutron ratios. A time-correlated measurement of Am–Be and separate measurements of 137Cs, 60Co and 232Th photon sources were used to construct libraries of neutrons and photons. These libraries were then used to produce synthetic data sets with varying ratios of photons-to-neutrons. The probability-weighted method we implemented was found to maintain a neutron acceptance rate of up to 90% up to a photon-to-neutron ratio of 2000, and performed 9% better than the decision boundary approach. Furthermore, the iterative approach appropriately changed the probability space with an increasing number of photons, which kept the neutron population estimate from unrealistically increasing.
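A minimal sketch of the iterative scheme described above, assuming per-pulse class likelihoods from the template libraries are already in hand: the standard EM update for a two-component mixture weight alternates per-pulse confidence probabilities with a new estimate of the photon fraction. The toy likelihood values are assumptions.

```python
import numpy as np

def em_class_fraction(lik_photon, lik_neutron, n_iter=100):
    """Infer the photon fraction pi from fixed per-pulse likelihoods,
    iterating the standard mixture-weight EM update."""
    pi = 0.5
    for _ in range(n_iter):
        # E-step: per-pulse photon confidence probabilities
        w = pi * lik_photon / (pi * lik_photon + (1 - pi) * lik_neutron)
        # M-step: the photon fraction becomes the mean confidence
        pi = w.mean()
    return pi, w

# Toy per-pulse likelihoods under the photon and neutron templates
lik_p = np.array([0.90, 0.80, 0.20, 0.05, 0.95])
lik_n = np.array([0.10, 0.30, 0.70, 0.90, 0.02])
print(em_class_fraction(lik_p, lik_n))
```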
Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A
2014-04-01
Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case for integrating other sources of space-use data.
Towards an Artificial Space Object Taxonomy
NASA Astrophysics Data System (ADS)
Wilkins, M.; Schumacher, P.; Jah, M.; Pfeffer, A.
2013-09-01
Object recognition is the first step in positively identifying a resident space object (RSO), i.e. assigning an RSO to a category such as GPS satellite or space debris. Object identification is the process of deciding that two RSOs are in fact one and the same. Provided we have appropriately defined a satellite taxonomy that allows us to place a given RSO into a particular class of object without any ambiguity, one can assess the probability of assignment to a particular class by determining how well the object satisfies the unique criteria of belonging to that class. Ultimately, tree-based taxonomies delineate unique signatures by defining the minimum amount of information required to positively identify a RSO. Therefore, taxonomic trees can be used to depict hypotheses in a Bayesian object recognition and identification process. This work describes a new RSO taxonomy along with specific reasoning behind the choice of groupings. An alternative taxonomy was recently presented at the Sixth Conference on Space Debris in Darmstadt, Germany. [1] The best example of a taxonomy that enjoys almost universal scientific acceptance is the classical Linnaean biological taxonomy. A strength of Linnaean taxonomy is that it can be used to organize the different kinds of living organisms, simply and practically. Every species can be given a unique name. This uniqueness and stability are a result of the acceptance by biologists specializing in taxonomy, not merely of the binomial names themselves. Fundamentally, the taxonomy is governed by rules for the use of these names, and these are laid down in formal Nomenclature Codes. We seek to provide a similar formal nomenclature system for RSOs through a defined tree-based taxonomy structure. Each categorization, beginning with the most general or inclusive, at any level is called a taxon. Taxon names are defined by a type, which can be a specimen or a taxon of lower rank, and a diagnosis, a statement intended to supply characters that differentiate the taxon from others with which it is likely to be confused. Each taxon will have a set of uniquely distinguishing features that will allow one to place a given object into a specific group without any ambiguity. When a new object does not fall into a specific taxon that is already defined, the entire tree structure will need to be evaluated to determine if a new taxon should be created. Ultimately, an online learning process to facilitate tree growth would be desirable. One can assess the probability of assignment to a particular taxon by determining how well the object satisfies the unique criteria of belonging to that taxon. Therefore, we can use taxonomic trees in a Bayesian process to assign prior probabilities to each of our object recognition and identification hypotheses. We will show that this taxonomy is robust by demonstrating specific stressing classification examples. We will also demonstrate how to implement this taxonomy in Figaro, an open source probabilistic programming language.
Probability Distributions of Minkowski Distances between Discrete Random Variables.
ERIC Educational Resources Information Center
Schroger, Erich; And Others
1993-01-01
Minkowski distances are used to indicate the similarity of two vectors in an N-dimensional space. The paper shows how to compute the probability function, the expectation, and the variance for Minkowski distances and for the special cases of the city-block distance and the Euclidean distance. Critical values for tests of significance are presented in tables. (SLD)
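For reference, the distance family itself, with the two special cases mentioned in the abstract:

```python
def minkowski(x, y, p):
    """Minkowski distance of order p between two equal-length vectors;
    p = 1 gives the city-block distance, p = 2 the Euclidean distance."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

print(minkowski([0, 0], [3, 4], 1))  # 7.0 (city-block)
print(minkowski([0, 0], [3, 4], 2))  # 5.0 (Euclidean)
```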
NASA Astrophysics Data System (ADS)
Berlanga, Juan M.; Harbaugh, John W.
The Tabasco region contains a number of major oilfields, including some of the emerging "giant" oil fields which have received extensive publicity. Fields in the Tabasco region are associated with large geologic structures which are detected readily by seismic surveys. The structures seem to be associated with deep-seated movement of salt, and they are complexly faulted. Some structures have as much as 1000 milliseconds of relief on seismic lines. The part of the Tabasco region that has been studied was surveyed with a close-spaced rectilinear network of seismic lines. A study interpreting the structure of the area initially used only a fraction of the total seismic data available. The purpose was to compare "predictions" of reflection time based on widely spaced seismic lines with "results" obtained along more closely spaced lines. This process of comparison simulates the sequence of events in which a reconnaissance network of seismic lines is used to guide a succession of progressively more closely spaced lines. A square gridwork was established with lines spaced at 10 km intervals and, using machine contour maps, the results were compared with those obtained with seismic grids employing spacings of 5 and 2.5 km, respectively. The comparisons of predictions based on widely spaced lines with observations along closely spaced lines provide information by which an error function can be established. The error at any point can be defined as the difference between the predicted value for that point and the subsequently observed value at that point. Residuals obtained by fitting third-degree polynomial trend surfaces were used for comparison. The root mean square of the error measurement (expressed in seconds or milliseconds of reflection time) was found to increase more or less linearly with distance from the nearest seismic point. Oil-occurrence probabilities were established on the basis of frequency distributions of trend-surface residuals obtained by fitting and subtracting polynomial trend surfaces from the machine-contoured reflection time maps. We found that there is a strong preferential relationship between the occurrence of petroleum (i.e. its presence versus absence) and particular ranges of trend-surface residual values. An estimate of the probability of oil occurring at any particular geographic point can be calculated on the basis of the estimated trend-surface residual value. This estimate, however, must be tempered by the probable error in the estimate of the residual value provided by the error function. The result, we believe, is a simple but effective procedure for estimating exploration outcome probabilities where seismic data provide the principal form of information in advance of drilling. Implicit in this approach is the comparison between a maturely explored area, for which both seismic and production data are available, and which serves as a statistical "training area", and the "target" area which is undergoing exploration and for which probability forecasts are to be calculated.
Incorporating Covariates into Stochastic Blockmodels
ERIC Educational Resources Information Center
Sweet, Tracy M.
2015-01-01
Social networks in education commonly involve some form of grouping, such as friendship cliques or teacher departments, and blockmodels are a type of statistical social network model that accommodates these groupings or blocks by assuming different within-group tie probabilities than between-group tie probabilities. We describe a class of models,…
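A minimal sketch of the two-rate assumption described above: sampling an undirected network whose tie probability depends only on whether two nodes share a block. Block sizes and rates below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_sbm(block_sizes, p_within, p_between):
    """Sample an undirected adjacency matrix from a two-rate stochastic
    blockmodel: ties are more likely within blocks than between them."""
    labels = np.repeat(np.arange(len(block_sizes)), block_sizes)
    n = labels.size
    probs = np.where(labels[:, None] == labels[None, :], p_within, p_between)
    upper = np.triu(rng.random((n, n)) < probs, k=1)  # sample upper triangle
    return (upper | upper.T).astype(int), labels      # symmetrize

adj, labels = sample_sbm([5, 5], p_within=0.6, p_between=0.1)
print(adj.sum() // 2, "ties among", labels.size, "nodes")
```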
Limitations of shallow nets approximation.
Lin, Shao-Bo
2017-10-01
In this paper, we aim at analyzing the approximation abilities of shallow networks in reproducing kernel Hilbert spaces (RKHSs). We prove that there is a probability measure such that the achievable lower bound for approximation by shallow nets is realized, with high probability, for all functions in balls of a reproducing kernel Hilbert space, which differs from the classical minimax approximation error estimates. This result, together with the existing approximation results for deep nets, shows the limitations of shallow nets and provides a theoretical explanation of why deep nets perform better than shallow nets. Copyright © 2017 Elsevier Ltd. All rights reserved.
Peppas, Kostas P; Lazarakis, Fotis; Alexandridis, Antonis; Dangakis, Kostas
2012-08-01
In this Letter we investigate the error performance of multiple-input multiple-output free-space optical communication systems employing intensity modulation/direct detection and operating over strong atmospheric turbulence channels. Atmospheric-induced strong turbulence fading is modeled using the negative exponential distribution. For the considered system, an approximate yet accurate analytical expression for the average bit error probability is derived and an efficient method for its numerical evaluation is proposed. Numerically evaluated and computer simulation results are further provided to demonstrate the validity of the proposed mathematical analysis.
Multiple Hypothesis Tracking (MHT) for Space Surveillance: Results and Simulation Studies
2013-09-01
[Extracted fragments:] 1. INTRODUCTION — The Joint Space Operations Center (JSpOC) currently tracks more than 22,000 satellites and pieces of space debris orbiting the Earth [1, 2]. With the anticipated installation of more accurate sensors and the increased probability of future collisions between space objects, ... [the remainder of the extracted text is garbled report front matter and documentation-page boilerplate]
Walking Clinic in ambulatory surgery--A patient based concept: A Portuguese pioneer project.
Vinagreiro, M; Valverde, J N; Alves, D; Costa, M; Gouveia, P; Guerreiro, E
2015-06-01
Walking Clinic is an innovative, efficient and easily reproducible concept adapted to ambulatory surgery. It consists of a preoperative single-day work-up with a surgeon, an anesthetist and a nurse. The aim of this study was to evaluate patient satisfaction and its determinants. A survey was administered to 171 patients (101 in the Walking Clinic group and 70 not engaged in this new concept). Patient satisfaction was assessed by evaluating five major questionnaire items: secretariat (quality of the information and support given), physical space (overall comfort and cleanliness), nurses and medical staff (willingness and expertise), and patients (waiting time until pre-operative consults and exams, waiting time until being scheduled for surgery, surgery-day waiting time and postoperative pain control). Furthermore, the overall assessment of the received treatment and the probability of the patient recommending or returning to our ambulatory unit were also analyzed. The Walking Clinic group had overall better results in the five major questionnaire items assessed, with statistical significance, except for the physical space. It also showed better results regarding the sub-items postoperative pain control, waiting time until being scheduled for surgery and surgery-day waiting time. The results confirm better patient satisfaction with this new concept. The Walking Clinic concept complements all the tenets of ambulatory surgery in a more efficient manner. Copyright © 2015 IJS Publishing Group Limited. Published by Elsevier Ltd. All rights reserved.
Koszul information geometry and Souriau Lie group thermodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbaresco, Frédéric, E-mail: frederic.barbaresco@thalesgroup.com
The François Massieu 1869 idea to derive some mechanical and thermal properties of physical systems from 'Characteristic Functions' was developed by Gibbs and Duhem in thermodynamics with the concept of potentials, and introduced by Poincaré in probability. This paper deals with the generalization of this Characteristic Function concept by Jean-Louis Koszul in Mathematics and by Jean-Marie Souriau in Statistical Physics. The Koszul-Vinberg Characteristic Function (KVCF) on convex cones will be presented as the cornerstone of 'Information Geometry' theory, defining Koszul Entropy as the Legendre transform of minus the logarithm of the KVCF, and Fisher Information Metrics as hessians of these dual functions, invariant by their automorphisms. In parallel, Souriau extended the Characteristic Function in Statistical Physics, looking for other kinds of invariances through the co-adjoint action of a group on its momentum space, defining physical observables like energy, heat and momentum as pure geometrical objects. In the covariant Souriau model, Gibbs equilibrium states are indexed by a geometric parameter, the Geometric (Planck) Temperature, with values in the Lie algebra of the dynamical Galileo/Poincaré groups, interpreted as a space-time vector, giving the metric tensor a null Lie derivative. The Fisher Information metric appears as the opposite of the derivative of the mean 'moment map' with respect to geometric temperature, equivalent to a Geometric Capacity or Specific Heat. These elements have been developed by the author in [10][11].
NASA Astrophysics Data System (ADS)
Strolger, Louis-Gregory; Porter, Sophia; Lagerstrom, Jill; Weissman, Sarah; Reid, I. Neill; Garcia, Michael
2017-04-01
The Proposal Auto-Categorizer and Manager (PACMan) tool was written to respond to concerns about subjective flaws and potential biases in some aspects of the proposal review process for time allocation for the Hubble Space Telescope (HST), and to partially alleviate some of the anticipated additional workload from the James Webb Space Telescope (JWST) proposal review. PACMan is essentially a mixed-method Naive Bayesian spam filtering routine, with multiple pools representing scientific categories, that utilizes the Robinson method for combining token (or word) probabilities. PACMan was trained to make similar programmatic decisions in science category sorting, panelist selection, and proposal-to-panelist assignments to those made by individuals and committees in the Science Policies Group (SPG) at the Space Telescope Science Institute. Based on training from the previous cycle's proposals, PACMan made the same science category assignments for Cycle 24 proposals as the SPG an average of 87% of the time. Tests of similar science categorizations, based on training using proposals from additional cycles, show that this accuracy can be further improved, to the >95% level. This tool will be used to augment or replace key functions in the Time Allocation Committee review processes in future HST and JWST cycles.
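The abstract names the Robinson method for combining token probabilities; a minimal sketch of that geometric-mean combination is below. The per-word probabilities and the decision rule are hypothetical illustrations, not PACMan's actual training data or code.

```python
import numpy as np

def robinson_combine(token_probs):
    """Combine per-token class probabilities with Robinson's
    geometric-mean method (as used in Bayesian spam filters)."""
    p = np.asarray(token_probs, dtype=float)
    n = len(p)
    P = 1.0 - np.prod(1.0 - p) ** (1.0 / n)   # combined evidence for the class
    Q = 1.0 - np.prod(p) ** (1.0 / n)         # combined evidence against it
    return (1.0 + (P - Q) / (P + Q)) / 2.0    # indicator in [0, 1]

# hypothetical per-word probabilities that a proposal belongs to one
# science category, as would be estimated from a labeled training cycle
print(robinson_combine([0.9, 0.8, 0.7, 0.4]))  # > 0.5 favors the category
```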
Probabilistic Risk Assessment for Astronaut Post Flight Bone Fracture
NASA Technical Reports Server (NTRS)
Lewandowski, Beth; Myers, Jerry; Licata, Angelo
2015-01-01
Introduction: Space flight potentially reduces the loading that bone can resist before fracture. This reduction in bone integrity may result from a combination of factors, the most commonly reported being a reduction in astronaut BMD. Although evaluating the condition of bones continues to be a critical aspect of understanding space flight fracture risk, defining the loading regime, whether on earth, in microgravity, or in reduced gravity on a planetary surface, remains a significant component of estimating the fracture risks to astronauts. This presentation summarizes the concepts, development, and application of NASA's Bone Fracture Risk Module (BFxRM) to understanding pre-, post-, and in-mission astronaut bone fracture risk. The overview includes an assessment of contributing factors utilized in the BFxRM and illustrates how new information, such as the biomechanics of space suit design or a better understanding of post-flight activities, may influence astronaut fracture risk. Opportunities for the bone mineral research community to contribute to future model development are also discussed. Methods: To investigate the conditions in which spaceflight-induced changes to bone play a critical role in post-flight fracture probability, we implement a modified version of the NASA Bone Fracture Risk Model (BFxRM). Modifications included incorporation of variations in physiological characteristics, post-flight recovery rate, and variations in lateral fall conditions within the probabilistic simulation parameter space. The modeled fracture probability estimates for different loading scenarios at preflight and at 0 and 365 days post-flight are compared. Results: For simple lateral side falls, mean post-flight fracture probability is elevated over mean preflight fracture probability due to spaceflight-induced BMD loss and is not fully recovered at 365 days post-flight. In the case of more energetic falls, such as from elevated heights or with the addition of lateral movement, the contribution of spaceflight-induced bone quality changes is much less clear, indicating that more granular assessments, such as finite element modeling, may be needed to further assess the risks in these scenarios.
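As a rough illustration of the kind of probabilistic simulation described (not the BFxRM itself, whose load and strength models are far more detailed), a Monte Carlo sketch comparing fracture probability under an assumed BMD loss and partial recovery might look like the following; all distributions and numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # Monte Carlo samples

def fracture_prob(bmd_factor):
    """Estimate P(fall load > bone strength), with strength scaled by a
    BMD factor. Distributions are illustrative placeholders only."""
    strength = bmd_factor * rng.normal(4000.0, 500.0, n)   # N, hypothetical femoral strength
    load = rng.lognormal(mean=7.8, sigma=0.4, size=n)      # N, lateral-fall load
    return np.mean(load > strength)

# assumed 6% BMD-driven strength loss at landing, partial recovery by R+365
for label, f in [("preflight", 1.00), ("R+0", 0.94), ("R+365", 0.98)]:
    print(f"{label}: fracture probability ≈ {fracture_prob(f):.4f}")
```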
Highly Competitive Reindeer Males Control Female Behavior during the Rut
Body, Guillaume; Weladji, Robert B.; Holand, Øystein; Nieminen, Mauri
2014-01-01
During the rut, female ungulates move among harems or territories, either to sample mates or to avoid harassment. Females may be herded by a male, may stay with a preferred male, or aggregate near a dominant male to avoid harassment from other males. In fission-fusion group dynamics, female movement is best described by the group’s fission probability, instead of inter-harem movement. In this study, we tested whether male herding ability, female mate choice or harassment avoidance influence fission probability. We recorded group dynamics in a herd of reindeer Rangifer tarandus equipped with GPS collars with activity sensors. We found no evidence that the harassment level in the group affected fission probability, or that females sought high rank (i.e. highly competitive and hence successful) males. However, the behavior of high ranked males decreased fission probability. Male herding activity was synchronous with the decrease of fission probability observed during the rut. We concluded that male herding behavior stabilized groups, thereby increasing average group size and consequently the opportunity for sexual selection. PMID:24759701
Ramos-Fernández, Gabriel; Getz, Wayne M.
2016-01-01
Ecological and social factors influence individual movement and group membership decisions, which ultimately determine how animal groups adjust their behavior in spatially and temporally heterogeneous environments. The mechanisms behind these behavioral adjustments can be better understood by studying the relationship between association and space use patterns of groups and how these change over time. We examined the socio-spatial patterns of adult individuals in a free-ranging group of spider monkeys (Ateles geoffroyi), a species with high fission-fusion dynamics. Data comprised 4916 subgroup scans collected during 325 days throughout a 20-month period and were used to evaluate changes from fruit-scarce to fruit-abundant periods in individual core-area size, subgroup size and two types of association measures: spatial (core-area overlap) and spatio-temporal (occurrence in the same subgroup) associations. We developed a 3-level analysis framework to distinguish passive associations, where individuals are mostly brought together by resources of common interest, from active associations, where individuals actively seek or avoid certain others. Results indicated a more concentrated use of space, increased individual gregariousness and higher spatio-temporal association rates in the fruit-abundant seasons, as is compatible with an increase in passive associations. Nevertheless, results also suggested active associations in all the periods analyzed, although associations differed across seasons. In particular, females seem to actively avoid males, perhaps prompted by an increased probability of random encounters among individuals, resulting from the contraction of individual core areas. Our framework proved useful in investigating the interplay between ecological and social constraints and how these constraints can influence individual ranging and grouping decisions in spider monkeys, and possibly other species with high fission-fusion dynamics. PMID:27280800
Regulation of erythropoiesis in rats during space flight
NASA Technical Reports Server (NTRS)
Lange, Robert D.
1989-01-01
Astronauts who have flown in microgravity have experienced a loss in red cell mass. The pathogenesis of the anemia of space flight has not been ascertained, but it is probably multifactorial. In 1978, the laboratory was selected to participate in life sciences studies to be carried out in the space shuttle in an attempt to study the pathogenesis of space anemia. In particular, the original studies were to be made in mice. This was later changed to study erythropoiesis in rats during space flight.
A Financial Market Model Incorporating Herd Behaviour.
Wray, Christopher M; Bishop, Steven R
2016-01-01
Herd behaviour in financial markets is a recurring phenomenon that exacerbates asset price volatility, and is considered a possible contributor to market fragility. While numerous studies investigate herd behaviour in financial markets, it is often considered without reference to the pricing of financial instruments or other market dynamics. Here, a trader interaction model based upon informational cascades in the presence of information thresholds is used to construct a new model of asset price returns that allows for both quiescent and herd-like regimes. Agent interaction is modelled using a stochastic pulse-coupled network, parametrised by information thresholds and a network coupling probability. Agents may possess either one or two information thresholds that, in each case, determine the number of distinct states an agent may occupy before trading takes place. In the case where agents possess two thresholds (labelled as the finite state-space model, corresponding to agents' accumulating information over a bounded state-space), and where coupling strength is maximal, an asymptotic expression for the cascade-size probability is derived and shown to follow a power law when a critical value of network coupling probability is attained. For a range of model parameters, a mixture of negative binomial distributions is used to approximate the cascade-size distribution. This approximation is subsequently used to express the volatility of model price returns in terms of the model parameter which controls the network coupling probability. In the case where agents possess a single pulse-coupling threshold (labelled as the semi-infinite state-space model corresponding to agents' accumulating information over an unbounded state-space), numerical evidence is presented that demonstrates volatility clustering and long-memory patterns in the volatility of asset returns. Finally, output from the model is compared to both the distribution of historical stock returns and the market price of an equity index option.
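A toy pulse-coupled cascade in the spirit of this model (not the authors' parametrisation) can illustrate how a network coupling probability controls cascade sizes; the threshold, coupling strength, and network size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def cascade_sizes(n_agents=1000, threshold=2, coupling=0.9, steps=20_000):
    """Each step a random agent receives an information pulse; an agent
    reaching its threshold trades, resets, and passes a pulse to each
    other agent with probability coupling / n_agents."""
    state = rng.integers(0, threshold, n_agents)
    sizes = []
    for _ in range(steps):
        queue = [rng.integers(n_agents)]   # exogenous information pulse
        fired = 0
        while queue:
            i = queue.pop()
            state[i] += 1
            if state[i] >= threshold:      # agent trades and resets
                state[i] = 0
                fired += 1
                hit = np.nonzero(rng.random(n_agents) < coupling / n_agents)[0]
                queue.extend(hit.tolist())
        if fired:
            sizes.append(fired)
    return np.array(sizes)

s = cascade_sizes()
print(f"mean cascade size {s.mean():.2f}, max {s.max()}")
```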
NASA Technical Reports Server (NTRS)
Backus, George E.
1999-01-01
The purpose of the grant was to study how prior information about the geomagnetic field can be used to interpret surface and satellite magnetic measurements, to generate quantitative descriptions of prior information that might be so used, and to use this prior information to obtain from satellite data a model of the core field with statistically justifiable error estimates. The need for prior information in geophysical inversion has long been recognized. Data sets are finite, and faithful descriptions of aspects of the earth almost always require infinite-dimensional model spaces. By themselves, the data can confine the correct earth model only to an infinite-dimensional subset of the model space. Earth properties other than direct functions of the observed data cannot be estimated from those data without prior information about the earth. Prior information is based on what the observer already knows before the data become available. Such information can be "hard" or "soft". Hard information is a belief that the real earth must lie in some known region of model space. For example, the total ohmic dissipation in the core is probably less than the total observed geothermal heat flow out of the earth's surface. (In principle, ohmic heat in the core can be recaptured to help drive the dynamo, but this effect is probably small.) "Soft" information is a probability distribution on the model space, a distribution that the observer accepts as a quantitative description of her/his beliefs about the earth. The probability distribution can be a subjective prior in the sense of Bayes or the objective result of a statistical study of previous data or relevant theories.
The Scent of the Future: Manned Space Travel and the Soviet Union.
1981-06-01
[Contents fragments: ... AND ECONOMIC APPLICATIONS; GREENHOUSES, BOOSTERS, AND SPACE PLANES: SOVIET SPACE-RELATED RESEARCH AND DEVELOPMENT; R.U.R. REVISITED: MANNED VERSUS ...] A greenhouse was part of a 12-square-meter closed environment; the successful conclusion of this test demonstrated the feasibility of a manned ... will probably be timed to coincide with the XXVI Party Congress, which convenes in February 1981.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biermann, W. J.
1978-01-01
All the available experimental evidence suggests that the optimum ''organic'' absorbent/refrigerant combination would be a methane derivative with a single hydrogen atom with chlorine and fluorine atoms in the other sites, as refrigerant. This would be hydrogen bonded to an absorbent molecule containing the group =NC⁻O, with the substituent groups being such that no steric hindrance took place. Cycle analyses showed that the ratio of internal heat transfer to cooling would be large, probably impractically so in view of the high coefficient of performance needed for solar driven cooling and the additional handicap of heat rejection to the atmosphere. A more promising approach would be to reduce the internal heat transfer per unit of space cooling by selecting a refrigerant with a high latent heat of vaporization and selecting an absorbent with suitable properties.
Unusual satellite data: A black hole? [International Ultraviolet Explorer observations]
NASA Technical Reports Server (NTRS)
1978-01-01
Data obtained by the NASA-launched European Space Agency's International Ultraviolet Explorer satellite suggests the possibility of a massive black hole at the center of some globular clusters (star groups) in our galaxy. Six of these clusters, three of them X-ray sources, were closely examined. Onboard short wavelength UV instrumentation penetrated the background denseness of the clusters 15,000 light years away where radiation, probably from a group of 10 to 20 bright blue stars orbiting the core, was observed. The stars may well be orbiting a massive black hole the size of 1,000 solar systems. The existence of the black hole is uncertain. The dynamics of the stars must be studied first to determine how they rotate in relation to the center of the million-star cluster. This may better indicate what provides the necessary gravitational pull that holds them in orbit.
NASA Astrophysics Data System (ADS)
Yan, H.; Zheng, M. J.; Zhu, D. Y.; Wang, H. T.; Chang, W. S.
2015-07-01
When using the clutter suppression interferometry (CSI) algorithm to perform signal processing in a three-channel wide-area surveillance radar system, the primary concern is to effectively suppress the ground clutter. However, a portion of the moving target's energy is also lost in the process of channel cancellation, which is often neglected in conventional applications. In this paper, we first investigate the two-dimensional (radial velocity dimension and squint angle dimension) residual amplitude of moving targets after channel cancellation with the CSI algorithm. Then, a new approach is proposed to increase the two-dimensional detection probability of moving targets by retaining the maximum value of the three channel cancellation results in a non-uniformly spaced channel system. In addition, a theoretical expression for the false alarm probability with the proposed approach is derived. Compared with the conventional approaches in a uniformly spaced channel system, simulation results validate the effectiveness of the proposed approach. To our knowledge, this is the first time that the two-dimensional detection probability of the CSI algorithm has been studied.
Advanced Health Management System for the Space Shuttle Main Engine
NASA Technical Reports Server (NTRS)
Davidson, Matt; Stephens, John
2004-01-01
Boeing-Canoga Park (BCP) and NASA-Marshall Space Flight Center (NASA-MSFC) are developing an Advanced Health Management System (AHMS) for use on the Space Shuttle Main Engine (SSME) that will improve Shuttle safety by reducing the probability of catastrophic engine failures during the powered ascent phase of a Shuttle mission. This is a phased approach that consists of an upgrade to the current Space Shuttle Main Engine Controller (SSMEC) to add turbomachinery synchronous vibration protection and the addition of a separate Health Management Computer (HMC) that will utilize advanced algorithms to detect and mitigate predefined engine anomalies. The purpose of the Shuttle AHMS is twofold: one is to increase the probability of successfully placing the Orbiter into the intended orbit, and the other is to increase the probability of being able to safely execute an abort of a Space Transportation System (STS) launch. Both objectives are achieved by increasing the useful work envelope of a Space Shuttle Main Engine after it has developed anomalous performance during launch and the ascent phase of the mission. This increase in work envelope will be the result of two new anomaly mitigation options, in addition to existing engine shutdown, that were previously unavailable. The added anomaly mitigation options include engine throttle-down and performance correction (adjustment of the engine oxidizer-to-fuel ratio), as well as enhanced sensor disqualification capability. The HMC is intended to provide the computing power necessary to diagnose selected anomalous engine behaviors and to make recommendations to the engine controller for anomaly mitigation. Independent auditors have assessed the reduction in Shuttle ascent risk to be on the order of 40% with the combined system, and a threefold improvement in mission success.
Safe days in space with acceptable uncertainty from space radiation exposure.
Cucinotta, Francis A; Alp, Murat; Rowedder, Blake; Kim, Myung-Hee Y
2015-04-01
The prediction of the risks of cancer and other late effects from space radiation exposure carries large uncertainties, mostly due to the lack of information on the risks from high charge and energy (HZE) particles and other high linear energy transfer (LET) radiation. In our recent work new methods were used to consider NASA's requirement to protect against the acceptable risk of no more than 3% probability of cancer fatality estimated at the 95% confidence level. Because it is not possible that a zero level of uncertainty could be achieved, we suggest that an acceptable uncertainty level should be defined in relationship to a probability distribution function (PDF) that only suffers from modest skewness, with higher uncertainty allowed for a normal PDF. In this paper, we evaluate PDFs and the number of "safe days" in space, which are defined as the mission length where risk limits are not exceeded, for several mission scenarios at different acceptable levels of uncertainty. In addition, we briefly discuss several important issues in risk assessment including non-cancer effects, the distinct tumor spectra and lethality found in animal experiments for HZE particles compared to tumors associated with background or low-LET radiation, and the possibility of non-targeted effects (NTE) modifying low dose responses and increasing relative biological effectiveness (RBE) factors for tumor induction. Each of these issues skews uncertainty distributions to higher fatality probabilities, with the potential to increase central values of risk estimates in the future. Therefore they will require significant research efforts to support space exploration within acceptable levels of risk and uncertainty. Copyright © 2015 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.
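As a hedged sketch of the "safe days" idea, one can draw a daily risk rate from an assumed uncertainty PDF and ask how many days keep an upper confidence bound on cumulative risk below the 3% limit; the lognormal parameters here are placeholders, not NASA's risk model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Placeholder uncertainty PDF for the daily risk of radiation-induced
# cancer fatality in deep space (parameters invented for illustration).
daily_risk = rng.lognormal(mean=np.log(2e-5), sigma=0.6, size=100_000)

def safe_days(limit=0.03, percentile=95):
    """Longest mission (days) whose cumulative risk stays below the
    limit at the stated upper confidence percentile of the risk PDF."""
    upper = np.percentile(daily_risk, percentile)
    return int(limit / upper)

print(f"safe days at the 95th percentile: {safe_days(percentile=95)}")
print(f"safe days at the median:          {safe_days(percentile=50)}")
```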
Using the human eye to image space radiation or the history and status of the light flash phenomena
NASA Astrophysics Data System (ADS)
Fuglesang, C.
2007-10-01
About 80% of people who travel in space experience sudden phosphenes, commonly called light flashes (LF). Although the detailed physiological process is still not known, the LFs are caused by particles in the cosmic radiation field. Indeed, by counting LFs one can even make a crude image of the radiation environment around the Earth. Studies on the space station Mir with the SilEye experiment correlated LFs with charged particles traversing the eye. It was found that a nucleus in the radiation environment has roughly a 1% probability of causing a light flash, whereas the proton's probability is almost three orders of magnitude less. As a function of linear energy transfer (LET), the probability increased with ionization above 10 keV/μm, reaching about 5% at 50 keV/μm. The investigations are continuing on the International Space Station (ISS) with the Alteino/SileEye-3 detector, which is also a precursor to the large Anomalous Long Term Effects on Astronauts (ALTEA) facility. These detectors are also measuring—imaging—the radiation environment inside the ISS, which will be compared to Geant4 simulations from the DESIRE project. To further the understanding of the LF phenomena, a survey among current NASA and ESA astronauts was recently conducted. The LFs are predominantly noticed before sleep and some respondents even thought it disturbed their sleep. The LFs appear white, have elongated shapes, and most interestingly, often come with a sense of motion. Comparing the shapes quoted from space observations with ground experiments done by researchers in the 1970s, it seems likely that some 5-10% of the LFs in space are due to Cherenkov light in the eye. However, the majority is most likely caused by some direct interaction in the retina.
Weyl calculus in QED I. The unitary group
NASA Astrophysics Data System (ADS)
Amour, L.; Lascar, R.; Nourrigat, J.
2017-01-01
In this work, we consider fixed 1/2 spin particles interacting with the quantized radiation field in the context of quantum electrodynamics. We investigate the time evolution operator in studying the reduced propagator (interaction picture). We first prove that this propagator belongs to the class of infinite dimensional Weyl pseudodifferential operators recently introduced in Amour et al. [J. Funct. Anal. 269(9), 2747-2812 (2015)] on Wiener spaces. We give a semiclassical expansion of the symbol of the reduced propagator up to any order with estimates on the remainder terms. Next, taking into account analyticity properties for the Weyl symbol of the reduced propagator, we derive estimates concerning transition probabilities between coherent states.
Ground radar detection of meteoroids in space
NASA Technical Reports Server (NTRS)
Kessler, D. J.; Landry, P. M.; Gabbard, J. R.; Moran, J. L. T.
1980-01-01
A special test to lower the detection threshold for satellite fragments potentially dangerous to spacecraft was carried out by NORAD for NASA, using modified radar software. The Perimeter Acquisition Radar Attack Characterization System, a large planar-face phased-array radar, operates at a nominal 430 MHz and produces 120 pulses per second, 45 of which were dedicated to search. In a time period of 8.4 hours of observations over three days, over 6000 objects were detected and tracked, of which 37 were determined to have velocities greater than escape velocity. Six of these were larger objects with radar cross sections greater than 0.1 sq m and were probably orbiting satellites. A table gives the flux of both observed groups.
Cosgrove, Casey M; Cohn, David E; Hampel, Heather; Frankel, Wendy L; Jones, Dan; McElroy, Joseph P; Suarez, Adrian A; Zhao, Weiqiang; Chen, Wei; Salani, Ritu; Copeland, Larry J; O'Malley, David M; Fowler, Jeffrey M; Yilmaz, Ahmet; Chassen, Alexis S; Pearlman, Rachel; Goodfellow, Paul J; Backes, Floor J
2017-09-01
To determine the relationship between mismatch repair (MMR) classification and clinicopathologic features, including tumor volume, and to explore outcomes by MMR class in a contemporary cohort. Single-institution cohort evaluating MMR classification for endometrial cancers (EC). MMR immunohistochemistry (IHC) ± microsatellite instability (MSI) testing and reflex MLH1 methylation testing was performed. Tumors with MMR abnormalities by IHC or MSI and MLH1 methylation were classified as epigenetic MMR deficiency, while those without MLH1 methylation were classified as probable MMR mutations. Clinicopathologic characteristics were analyzed. 466 endometrial cancers were classified: 75% as MMR proficient, 20% as epigenetic MMR defects, and 5% as probable MMR mutations. Epigenetic MMR defects were associated with advanced stage, higher grade, presence of lymphovascular space invasion, and older age. MMR class was significantly associated with tumor volume, an association not previously reported. The median volume of epigenetic MMR defect tumors was 10,220 mm³, compared to 3,321 mm³ and 2,846 mm³ for MMR proficient and probable MMR mutation tumors, respectively (P<0.0001). Higher tumor volume was associated with lymph node involvement. Endometrioid EC cases with epigenetic MMR defects had significantly reduced recurrence-free survival (RFS). Among advanced-stage (III/IV) endometrioid EC, the epigenetic MMR defect group was more likely to recur compared to the MMR proficient group (47.7% vs 3.4%) despite receiving similar adjuvant therapy. In contrast, there was no difference in the number of early-stage recurrences for the different MMR classes. MMR testing that includes MLH1 methylation analysis defines a subset of tumors that have worse prognostic features and reduced RFS. Copyright © 2017 Elsevier Inc. All rights reserved.
A Probabilistic, Facility-Centric Approach to Lightning Strike Location
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William p.; Merceret, Francis J.
2012-01-01
A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
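A sketch of the underlying computation, integrating a bivariate Gaussian error density over a disk centered on a facility that need not coincide with the ellipse center, assuming SciPy is available; the mean, covariance, and radius are made-up inputs.

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import dblquad

def strike_probability(mean, cov, center, radius):
    """Probability that a stroke location drawn from N(mean, cov) falls
    within `radius` of `center`; the disk need not be centered on the
    error ellipse. Integration is in polar coordinates about `center`."""
    rv = multivariate_normal(mean=mean, cov=cov)
    integrand = lambda r, th: r * rv.pdf(
        [center[0] + r * np.cos(th), center[1] + r * np.sin(th)])
    prob, _ = dblquad(integrand, 0.0, 2.0 * np.pi,
                      lambda th: 0.0, lambda th: radius)
    return prob

# made-up inputs: ellipse centered 1 km east, 0.5 km north of the facility
p = strike_probability(mean=[1.0, 0.5],
                       cov=[[0.25, 0.05], [0.05, 0.16]],   # km^2
                       center=[0.0, 0.0], radius=1.0)      # km
print(f"P(stroke within 1 km of facility) = {p:.3f}")
```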
NASA Technical Reports Server (NTRS)
Crawford, Winifred; Roeder, William
2010-01-01
The 45th Weather Squadron (45 WS) at Cape Canaveral Air Force Station (CCAFS) includes the probability of lightning occurrence in their 24-Hour and Weekly Planning Forecasts, briefed at 0700 EDT for daily operations planning on Kennedy Space Center (KSC) and CCAFS. This forecast is based on subjective analyses of model and observational data and output from an objective tool developed by the Applied Meteorology Unit (AMU). This tool was developed over two phases (Lambert and Wheeler 2005, Lambert 2007). It consists of five equations, one for each warm season month (May-Sep), that calculate the probability of lightning occurrence for the day and a graphical user interface (GUI) to display the output. The Phase I and II equations outperformed previous operational tools by a total of 56%. Based on this success, the 45 WS tasked the AMU with Phase III to improve the tool further.
Two Universality Properties Associated with the Monkey Model of Zipf's Law
NASA Astrophysics Data System (ADS)
Perline, Richard; Perline, Ron
2016-03-01
The distribution of word probabilities in the monkey model of Zipf's law is associated with two universality properties: (1) the power law exponent converges strongly to -1 as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on [0, 1]; and (2) on a logarithmic scale the version of the model with a finite word length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem for the logarithm of sample spacings from Shao and Hahn, and the second property follows from Anscombe's central limit theorem for a random number of i.i.d. random variables. The finite word length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.
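Property (1) can be checked numerically: draw letter probabilities as spacings of a random division of the unit interval, enumerate word probabilities up to a length cutoff, and fit the rank-frequency exponent. The alphabet size, space probability, and cutoff below are arbitrary choices for illustration.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
m, q_space, max_len = 10, 0.2, 5   # alphabet size, space prob, cutoff (assumed)

# letter probabilities as spacings of a random division of the unit interval
cuts = np.sort(rng.uniform(0, 1, m - 1))
letter_probs = (1 - q_space) * np.diff(np.concatenate(([0.0], cuts, [1.0])))

# a word is a string of letters terminated by the space character
probs = []
for L in range(1, max_len + 1):
    for w in product(range(m), repeat=L):
        probs.append(np.prod([letter_probs[i] for i in w]) * q_space)
probs = np.sort(probs)[::-1]

ranks = np.arange(1, len(probs) + 1)
slope, _ = np.polyfit(np.log(ranks), np.log(probs), 1)
print(f"fitted rank-frequency exponent ≈ {slope:.2f}")  # tends to -1 as m grows
```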
Shape Classification Using Wasserstein Distance for Brain Morphometry Analysis
Su, Zhengyu; Zeng, Wei; Wang, Yalin; Lu, Zhong-Lin; Gu, Xianfeng
2015-01-01
Brain morphometry study plays a fundamental role in medical imaging analysis and diagnosis. This work proposes a novel framework for brain cortical surface classification using Wasserstein distance, based on uniformization theory and Riemannian optimal mass transport theory. By Poincare uniformization theorem, all shapes can be conformally deformed to one of the three canonical spaces: the unit sphere, the Euclidean plane or the hyperbolic plane. The uniformization map will distort the surface area elements. The area-distortion factor gives a probability measure on the canonical uniformization space. All the probability measures on a Riemannian manifold form the Wasserstein space. Given any 2 probability measures, there is a unique optimal mass transport map between them, the transportation cost defines the Wasserstein distance between them. Wasserstein distance gives a Riemannian metric for the Wasserstein space. It intrinsically measures the dissimilarities between shapes and thus has the potential for shape classification. To the best of our knowledge, this is the first work to introduce the optimal mass transport map to general Riemannian manifolds. The method is based on geodesic power Voronoi diagram. Comparing to the conventional methods, our approach solely depends on Riemannian metrics and is invariant under rigid motions and scalings, thus it intrinsically measures shape distance. Experimental results on classifying brain cortical surfaces with different intelligence quotients demonstrated the efficiency and efficacy of our method. PMID:26221691
2003-01-16
KENNEDY SPACE CENTER, FLA. -- Approximately 33 seconds after T-0 and liftoff of Space Shuttle Columbia, several particles are observed falling away from the -Z portion of the LH solid rocket booster ETA ring. Particles were identified later as probably pieces of the instafoam closeout on the ETA ring.
Peculiarities of biological action of hadrons of space radiation.
Akoev, I G; Yurov, S S
1975-01-01
Biological investigations in space make it possible to assess the contribution of high-energy hadrons to the biological effects of space flight factors. Physical and molecular principles of the action of high-energy hadrons are analysed. Genetic and somatic hadron effects produced by the secondary radiation from 70 GeV protons have been studied experimentally. The high biological effectiveness of hadrons, the great variability in biological effects, and the specificity of their action are associated with the strong interactions of high-energy hadrons. These include the probability of nuclear interaction with any atomic nucleus, the generation of a great number of secondary particles (among them, probably, highly effective multicharged and heavy nuclei, antiprotons, and π⁻ mesons), and the spatial distribution of secondary particles as a narrow cone with extremely high density of particles in its first part. The secondary radiation generated by high- and superhigh-energy hadrons upon their interaction with the spaceship is likely to be the greatest radiation hazard to the crew during space flights.
NASA Technical Reports Server (NTRS)
Laurini, Kathleen C.; Hufenbach, Bernhard; Satoh, Maoki; Piedboeuf, Jean-Claude; Neumann, Benjamin
2010-01-01
Advancing critical and enhancing technologies is considered essential to enabling sustainable and affordable human space exploration. Critical technologies are those that enable a certain class of mission, such as technologies necessary for safe landing on the Martian surface, advanced propulsion, and closed loop life support. Others enhance the mission by leading to a greater satisfaction of mission objectives or increased probability of mission success. Advanced technologies are needed to reduce mass and cost. Many space agencies have studied exploration mission architectures and scenarios with the resulting lists of critical and enhancing technologies being very similar. With this in mind, and with the recognition that human space exploration will only be enabled by agencies working together to address these challenges, interested agencies participating in the International Space Exploration Coordination Group (ISECG) have agreed to perform a technology assessment as an important step in exploring cooperation opportunities for future exploration mission scenarios. "The Global Exploration Strategy: The Framework for Coordination" was developed by fourteen space agencies and released in May 2007. Since the fall of 2008, several International Space Exploration Coordination Group (ISECG) participating space agencies have been studying concepts for human exploration of the moon. They have identified technologies considered critical and enhancing of sustainable space exploration. Technologies such as in-situ resource utilization, advanced power generation/energy storage systems, reliable dust resistant mobility systems, and closed loop life support systems are important examples. Similarly, agencies such as NASA, ESA, and Russia have studied Mars exploration missions and identified critical technologies. They recognize that human and robotic precursor missions to destinations such as LEO, moon, and near earth objects provide opportunities to demonstrate the technologies needed for Mars mission. Agencies see the importance of assessing gaps and overlaps in their plans to advance technologies in order to leverage their investments and enable exciting missions as soon as practical. They see the importance of respecting the ability of any agency to invest in any technologies considered interesting or strategic. This paper will describe the importance of developing an appropriate international strategy for technology development and ideas for effective mechanisms for advancing an international strategy. This work will both inform and be informed by the development of an ISECG Global Exploration Roadmap and serve as a concrete step forward in advancing the Global Exploration Strategy.
Statistical Short-Range Guidance for Peak Wind Speed Forecasts at Edwards Air Force Base, CA
NASA Technical Reports Server (NTRS)
Dreher, Joseph G.; Crawford, Winifred; Lafosse, Richard; Hoeth, Brian; Burns, Kerry
2009-01-01
The peak winds near the surface are an important forecast element for space shuttle landings. As defined in the Flight Rules (FR), there are peak wind thresholds that cannot be exceeded in order to ensure the safety of the shuttle during landing operations. The National Weather Service Spaceflight Meteorology Group (SMG) is responsible for weather forecasts for all shuttle landings, and is required to issue surface average and 10-minute peak wind speed forecasts. They indicate peak winds are a challenging parameter to forecast. To alleviate the difficulty in making such wind forecasts, the Applied Meteorology Unit (AMU) developed a PC-based graphical user interface (GUI) for displaying peak wind climatology and probabilities of exceeding peak wind thresholds for the Shuttle Landing Facility (SLF) at Kennedy Space Center (KSC; Lambert 2003). However, the shuttle occasionally may land at Edwards Air Force Base (EAFB) in southern California when weather conditions at KSC in Florida are not acceptable, so SMG forecasters requested a similar tool be developed for EAFB.
Situational Lightning Climatologies for Central Florida, Phase 2, Part 3
NASA Technical Reports Server (NTRS)
Bauman, William H., III
2007-01-01
The threat of lightning is a daily concern during the warm season in Florida. The forecasters at the Spaceflight Meteorology Group (SMG) at Johnson Space Center in Houston, TX consider lightning in their landing forecasts for space shuttles at the Kennedy Space Center (KSC), FL Shuttle Landing Facility (SLF). The forecasters at the National Weather Service in Melbourne, FL (NWS MLB) do the same in their routine Terminal Aerodrome Forecasts (TAFs) for seven airports in the NWS MLB County Warning Area (CWA). The Applied Meteorology Unit created flow regime climatologies of lightning probability in the 5-, 10-, 20-, and 30-n mi circles surrounding the SLF and all airports in the NWS MLB county warning area in 1-, 3-, and 6-hour increments. The results were presented in tabular and graphical format and incorporated into a web-based graphical user interface (GUI) so forecasters could easily navigate through the data; the GUI is usable in any web browser on computers with different operating systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kavenoky, A.
1973-01-01
From the national topical meeting on mathematical models and computational techniques for analysis of nuclear systems; Ann Arbor, Michigan, USA (8 Apr 1973). APOLLO calculates the space- and energy-dependent flux for a one-dimensional medium, in the multigroup approximation of the transport equation. For a one-dimensional medium, refined collision probabilities have been developed for the resolution of the integral form of the transport equation; these collision probabilities increase accuracy and save computing time. The interaction between a few cells can also be treated by the multicell option of APOLLO. The diffusion coefficient and the material buckling can be computed in the various B and P approximations with a linearly anisotropic scattering law, even in the thermal range of the spectrum. Eventually this coefficient is corrected for streaming by use of Benoist's theory. The self-shielding of the heavy isotopes is treated by a new and accurate technique which preserves the reaction rates of the fundamental fine-structure flux. APOLLO can perform a depletion calculation for one cell, a group of cells or a complete reactor. The results of an APOLLO calculation are the space- and energy-dependent flux, the material buckling or any reaction rate; these results can also be macroscopic cross sections used as input data for a 2D or 3D depletion and diffusion code in reactor geometry. 10 references.
Metrics for More than Two Points at Once
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
The conventional definition of a topological metric over a space specifies properties that must be obeyed by any measure of "how separated" two points in that space are. Here it is shown how to extend that definition, and in particular the triangle inequality, to concern arbitrary numbers of points. Such a measure of how separated the points within a collection are can be bootstrapped to measure "how separated" two (or more) collections are from each other. The measure presented here also allows fractional membership of an element in a collection. This means it directly concerns measures of "how spread out" a probability distribution over a space is. When such a measure is bootstrapped to compare two collections, it allows us to measure how separated two probability distributions are or, more generally, how separated a distribution of distributions is.
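The abstract does not reproduce the paper's construction; as a loudly labeled stand-in, one familiar measure of multi-point separation (mean pairwise distance, plus a weighted version for fractional membership, i.e. the expected distance between two independent draws from a distribution) conveys the flavor of measuring "how separated" a whole collection is.

```python
import numpy as np
from itertools import combinations

def spread(points):
    """One simple measure of how separated a collection of points is:
    the mean pairwise Euclidean distance. An illustrative stand-in,
    not Wolpert's construction."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return 0.0
    return float(np.mean([np.linalg.norm(a - b) for a, b in combinations(pts, 2)]))

def spread_weighted(points, weights):
    """Spread of a probability distribution over the space: expected
    distance between two independent draws (fractional membership)."""
    pts = np.asarray(points, float)
    w = np.asarray(weights, float); w = w / w.sum()
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return float(w @ d @ w)

print(spread([[0, 0], [1, 0]]))                        # two points: ordinary distance
print(spread([[0, 0], [1, 0], [0, 1]]))                # three points at once
print(spread_weighted([[0, 0], [1, 0]], [0.9, 0.1]))   # fractional membership
```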
NASA Astrophysics Data System (ADS)
Cajiao Vélez, F.; Kamiński, J. Z.; Krajewska, K.
2018-04-01
High-energy photoionization driven by short, circularly polarized laser pulses is studied in the framework of the relativistic strong-field approximation. The saddle-point analysis of the integrals defining the probability amplitude is used to determine the general properties of the probability distributions. Additionally, an approximate solution to the saddle-point equation is derived. This leads to the concept of the three-dimensional spiral of life in momentum space, around which the ionization probability distribution is maximum. We demonstrate that such a spiral is also obtained from a classical treatment.
Racial/Ethnic and County-level Disparity in Inpatient Utilization among Hawai'i Medicaid Population.
Siriwardhana, Chathura; Lim, Eunjung; Aggarwal, Lovedhi; Davis, James; Hixon, Allen; Chen, John J
2018-05-01
We investigated racial/ethnic and county-level disparities in inpatient utilization for 15 clinical conditions among Hawaii's Medicaid population. The study was conducted using inpatient claims data from more than 200,000 Hawai'i Medicaid beneficiaries, reported in the year 2010. The analysis was performed by stratifying the Medicaid population into three age groups: a child and adolescent group (1-20 years), an adult group (21-64 years), and an elderly group (65 years and above). Among the differences found, Asians had a low probability of inpatient admissions compared to Whites for many disease categories, while Native Hawaiian/Pacific Islanders had higher probabilities than Whites across all age groups. The pediatric and adult groups from Hawai'i County (Big Island) had lower probabilities of inpatient admissions compared to Honolulu County (O'ahu) for most disease conditions, but higher probabilities were observed for several conditions in the elderly group. Notably, the elderly population residing in Kaua'i County (Kaua'i and Ni'ihau islands) had substantially increased odds of hospital admissions for several disease conditions, compared to Honolulu.
Multiple Streaming and the Probability Distribution of Density in Redshift Space
NASA Astrophysics Data System (ADS)
Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
2000-07-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S_3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated.
Rare event simulation in radiation transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kollman, Craig
1993-10-01
This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
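A minimal importance-sampling example of the kind described, estimating a tail probability by sampling from a shifted proposal and reweighting by the likelihood ratio (a textbook sketch, not the dissertation's transport setting):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Estimate p = P(X > 5) for X ~ N(0,1), a rare event (~2.9e-7),
# by sampling from the tilted proposal N(5,1) and reweighting.
y = rng.normal(5.0, 1.0, n)                              # proposal draws
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - 5.0)**2)    # likelihood ratio N(0,1)/N(5,1)
est = np.mean((y > 5.0) * w)                             # unbiased estimator

naive = np.mean(rng.normal(0.0, 1.0, n) > 5.0)           # almost surely 0 at this n
print(f"importance sampling: {est:.3e}, naive: {naive:.3e}")
```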
Application of Bayes' theorem for pulse shape discrimination
NASA Astrophysics Data System (ADS)
Monterial, Mateusz; Marleau, Peter; Clarke, Shaun; Pozzi, Sara
2015-09-01
A Bayesian approach is proposed for pulse shape discrimination of photons and neutrons in liquid organic scintillators. Instead of drawing a decision boundary, each pulse is assigned a photon or neutron confidence probability. This allows for photon and neutron classification on an event-by-event basis. The sum of those confidence probabilities is used to estimate the number of photon and neutron instances in the data. An iterative scheme, similar to an expectation-maximization algorithm for Gaussian mixtures, is used to infer the ratio of photons-to-neutrons in each measurement. Therefore, the probability space adapts to data with varying photon-to-neutron ratios. A time-correlated measurement of Am-Be and separate measurements of 137Cs, 60Co and 232Th photon sources were used to construct libraries of neutrons and photons. These libraries were then used to produce synthetic data sets with varying ratios of photons-to-neutrons. The probability-weighted method that we implemented was found to maintain a neutron acceptance rate of up to 90% up to a photon-to-neutron ratio of 2000, and performed 9% better than the decision boundary approach. Furthermore, the iterative approach appropriately changed the probability space with an increasing number of photons, which kept the neutron population estimate from unrealistically increasing.
Application of Bayes' theorem for pulse shape discrimination
Marleau, Peter; Monterial, Mateusz; Clarke, Shaun; ...
2015-06-14
A Bayesian approach is proposed for pulse shape discrimination of photons and neutrons in liquid organic scintillators. Instead of drawing a decision boundary, each pulse is assigned a photon or neutron confidence probability. In addition, this allows for photon and neutron classification on an event-by-event basis. The sum of those confidence probabilities is used to estimate the number of photon and neutron instances in the data. An iterative scheme, similar to an expectation-maximization algorithm for Gaussian mixtures, is used to infer the ratio of photons-to-neutrons in each measurement. Therefore, the probability space adapts to data with varying photon-to-neutron ratios. A time-correlated measurement of Am–Be and separate measurements of 137Cs, 60Co and 232Th photon sources were used to construct libraries of neutrons and photons. These libraries were then used to produce synthetic data sets with varying ratios of photons-to-neutrons. The probability-weighted method that we implemented was found to maintain a neutron acceptance rate of up to 90% up to a photon-to-neutron ratio of 2000, and performed 9% better than the decision boundary approach. Furthermore, the iterative approach appropriately changed the probability space with an increasing number of photons, which kept the neutron population estimate from unrealistically increasing.
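A sketch of the expectation-maximization scheme for Gaussian mixtures mentioned above, applied to a synthetic one-dimensional pulse-shape feature (e.g. a tail-to-total ratio); the feature values and component shapes are invented, whereas the real method works on measured pulse libraries.

```python
import numpy as np
from scipy.stats import norm

def em_psd(x, iters=50):
    """EM for a two-component Gaussian mixture on a pulse-shape feature;
    returns the per-pulse neutron confidence and the mixture weights."""
    mu = np.quantile(x, [0.50, 0.999])      # crude init: bulk vs far tail
    sd = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior (confidence) that each pulse is a neutron
        p0 = pi[0] * norm.pdf(x, mu[0], sd[0])
        p1 = pi[1] * norm.pdf(x, mu[1], sd[1])
        r = p1 / (p0 + p1)
        # M-step: update weights, means, and widths from the confidences
        pi = np.array([1 - r.mean(), r.mean()])
        mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
        sd = np.array([np.sqrt(np.average((x - mu[0])**2, weights=1 - r)),
                       np.sqrt(np.average((x - mu[1])**2, weights=r))])
    return r, pi

# synthetic feature: photons at low tail fraction, few neutrons higher
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.10, 0.02, 5000), rng.normal(0.25, 0.04, 50)])
conf, weights = em_psd(x)
print(f"estimated neutron fraction: {weights[1]:.4f}")  # expect ~0.01
```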
Multi stage unreliable retrial Queueing system with Bernoulli vacation
NASA Astrophysics Data System (ADS)
Radha, J.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
In this work we considered the Bernoulli vacation in group arrival retrial queues with an unreliable server. Here, the server provides service in k stages. If an arriving group of units finds the server free, one unit from the group enters the first stage of service and the rest join the orbit. After completion of the ith (i = 1, 2, ..., k) stage of service, the customer may proceed to the (i+1)th stage with probability θi, or leave the system with probability qi = 1 - θi (i = 1, 2, ..., k-1), with qi = 1 for i = k. After finishing a service, the server may take a vacation (whether the orbit is empty or not) with probability v, or continue serving with probability 1 - v. After finishing the vacation, the server searches for a customer in the orbit with probability θ or remains idle awaiting a new arrival with probability 1 - θ. We analyzed the system using the method of supplementary variables.
NASA Astrophysics Data System (ADS)
Cimermanová, K.
2009-01-01
In this paper we illustrate the influence of prior probabilities of diseases on diagnostic reasoning. For various prior probabilities of the classified groups (smokers and non-smokers), characterized by the volatile organic compounds of the breath profile, we constructed the ROC curve and the Youden index with related asymptotic pointwise confidence intervals.
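A minimal sketch of computing a ROC curve and the Youden index for two groups, assuming scikit-learn is available; the scores are simulated stand-ins for breath-profile classifier outputs, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

# simulated classifier scores; labels: 1 = smoker, 0 = non-smoker
rng = np.random.default_rng(3)
scores = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.2, 1.0, 200)])
labels = np.concatenate([np.zeros(200), np.ones(200)])

fpr, tpr, thresholds = roc_curve(labels, scores)
youden = tpr - fpr                     # J = sensitivity + specificity - 1
best = np.argmax(youden)
print(f"Youden index J = {youden[best]:.2f} at threshold {thresholds[best]:.2f}")
```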
Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vourdas, A.
2014-08-15
The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H_1, H_2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H_1), P(H_2) onto the subspaces H_1, H_2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities.
Earth reencounter probabilities for aborted space disposal of hazardous nuclear waste
NASA Technical Reports Server (NTRS)
Friedlander, A. L.; Feingold, H.
1977-01-01
A quantitative assessment is made of the long-term risk of earth reencounter and reentry associated with aborted disposal of hazardous material in the space environment. Numerical results are presented for 10 candidate disposal options covering a broad spectrum of disposal destinations and deployment propulsion systems. Based on representative models of system failure, the probability that a single payload will return and collide with earth within a period of 250,000 years is found to lie in the range 0.0002-0.006. Proportionately smaller risk attaches to shorter time intervals. Risk-critical factors related to trajectory geometry and system reliability are identified as possible mechanisms of hazard reduction.
A study into the loss of lock of the space telescope fine guidance sensor
NASA Technical Reports Server (NTRS)
Polites, M. E.
1983-01-01
The results of a study into the loss of lock phenomenon associated with the Space Telescope Fine Guidance Sensor (FGS) are documented. The primary cause of loss of lock has been found to be a combination of cosmic ray spikes and photon noise due to a 14.5 Mv star. The probability of maintaining lock versus time is estimated both for the baseline FGS design and with parameter changes in the FGS firmware which will improve the probability of maintaining lock. The parameters varied are changeable in-flight from the ground and hence do not impact the design of the FGS hardware.
NASA Technical Reports Server (NTRS)
Backus, George
1987-01-01
Let R be the real numbers, R^n the linear space of all real n-tuples, and R^∞ the linear space of all infinite real sequences x = (x_1, x_2, ...). Let P_n : R^∞ → R^n be the projection operator with P_n(x) = (x_1, ..., x_n). Let p_∞ be a probability measure on the smallest σ-ring of subsets of R^∞ which includes all of the cylinder sets P_n^(-1)(B_n), where B_n is an arbitrary Borel subset of R^n. Let p_n be the marginal distribution of p_∞ on R^n, so p_n(B_n) = p_∞(P_n^(-1)(B_n)) for each B_n. A measure on R^n is isotropic if it is invariant under all orthogonal transformations of R^n. All members of the set of all isotropic probability distributions on R^n are described. The result calls into question both stochastic inversion and Bayesian inference, as currently used in many geophysical inverse problems.
Analysis of Advanced Respiratory Support Onboard ISS and CCV
NASA Technical Reports Server (NTRS)
Shah, Ronak V.; Kertsman, Eric L.; Alexander, David J.; Duchesne, Ted; Law, Jennifer; Roden, Sean K.
2014-01-01
NASA is collaborating with private entities for the development of commercial space vehicles. The Space and Clinical Operations Division was tasked to review the oxygen and respiratory support system and recommend what capabilities, if any, the vehicle should have to support the return of an ill or injured crewmember. The Integrated Medical Model (IMM) was utilized as a data source for the development of these recommendations. The Integrated Medical Model (IMM) was used to simulate a six month, six crew, International Space Station (ISS) mission. Three medical system scenarios were considered based on the availability of (1) oxygen only, (2) oxygen and a ventilator, or (3) neither oxygen nor ventilator. The IMM analysis provided probability estimates of medical events that would require either oxygen or ventilator support. It also provided estimates of crew health, the probability of evacuation, and the probability of loss of crew life secondary to medical events for each of the three medical system scenarios. These IMM outputs were used as objective data to enable evidence-based decisions regarding oxygen and respiratory support system requirements for commercial crew vehicles. The IMM provides data that may be utilized to support informed decisions regarding the development of medical systems for commercial crew vehicles.
Martínez, Lina; Prada, Sergio; Estrada, Daniela
2018-06-01
Obesity and frequent mental and physical distress are often associated with major health problems. The characteristics of the urban environment, such as homicide rates and public goods provision, play an important role in influencing participation in physical activity and in overall mental health. This study aimed to determine whether there was a relationship between homicide rates and public goods provision and the health outcomes of the citizens of Cali, Colombia, a city known for its high urban violence rate and low municipal investment in public goods. We used a linear probability model to relate homicide rates and public goods provision (lighted parks, effective public space per inhabitant, and bus stations) at the district level to health outcomes (obesity and frequent mental and physical distress). Individual data were obtained from the 2014 CaliBRANDO survey, and urban context characteristics were obtained from official government statistics. After controlling for individual covariates, results showed that homicide rates were a risk factor for all examined outcomes. An increase of 1.0 m² of public space per inhabitant reduced the probability of an individual being obese or overweight by 0.2% (95% confidence interval (CI) = -0.004 to -0.001) and the probability of frequent physical distress by 0.1% (95% CI = -0.002 to -0.001). On average, the presence of one additional bus station increased the probability of being obese or overweight by 1.1%, the probability of frequent mental distress by 0.3% (95% CI = 0.001-0.004), and the probability of frequent physical distress by 0.02% (95% CI = 0.000-0.003). Living in districts with adequate public space and lighted parks lowers the probability of being obese, while high homicide rates are correlated with poor health outcomes in Cali, Colombia. Investments in public goods provision and urban safety to reduce obesity rates may contribute to a better quality of life for the population.
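A linear probability model of the kind used here is ordinary least squares on a binary outcome, so each coefficient reads directly as a change in probability. A minimal sketch on synthetic data, with covariate scales and effect sizes assumed only to mirror the reported magnitudes:

```python
# Linear probability model sketch: binary obesity indicator regressed by OLS
# on district covariates. All data and effect sizes below are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
homicide_rate = rng.gamma(2.0, 20.0, n)        # per 100,000, assumed scale
public_space = rng.uniform(0.5, 15.0, n)       # m^2 per inhabitant, assumed
latent = 0.35 + 0.002 * homicide_rate - 0.002 * public_space
obese = (rng.uniform(size=n) < latent).astype(float)

X = np.column_stack([np.ones(n), homicide_rate, public_space])
beta, *_ = np.linalg.lstsq(X, obese, rcond=None)
print("intercept, homicide, public space:", np.round(beta, 4))
# A coefficient of -0.002 on public space reads as: +1 m^2 per inhabitant
# lowers the obesity probability by 0.2 percentage points, as in the study.
```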
Are stress-induced cortisol changes during pregnancy associated with postpartum depressive symptoms?
Nierop, Ada; Bratsikas, Aliki; Zimmermann, Roland; Ehlert, Ulrike
2006-01-01
The purpose of this study was to examine the association between psychobiological stress reactivity during healthy pregnancy and depressive symptoms in the early puerperium. A sample of healthy nulliparous pregnant women (N = 57) between the ages of 21 and 35 years underwent a standardized psychosocial stress test during pregnancy. Within an average of 13 days after delivery, postpartum depressive symptoms were assessed using the German version of the Edinburgh postnatal depression scale (EPDS). The sample was divided into a group with probable cases (EPDS score >9, N = 16) and a group with probable noncases (EPDS score ≤9, N = 41). The probable case group showed significantly higher cortisol responses to the stress test compared with the probable noncase group, whereas baseline levels did not differ. Additionally, women in the probable case group showed significantly higher state anxiety and lower mood state throughout the experiment. Furthermore, the probable case group showed higher stress susceptibility, higher trait anxiety, and higher levels in the Symptom Checklist. No differences were found for prior episodes of psychiatric disorders, obstetrical complications, birth weight, or mode of delivery. Our data provide evidence that healthy pregnant women developing postpartum depressive symptoms might already be identified during pregnancy by means of their higher cortisol reactivity and their higher psychological reactivity in response to psychosocial stress. Further investigations are required to explore whether higher psychobiological stress responses not only precede depressive symptoms within 2 weeks after birth, but might also predict postpartum major depression.
Where are compact groups in the local Universe?
NASA Astrophysics Data System (ADS)
Díaz-Giménez, Eugenia; Zandivarez, Ariel
2015-06-01
Aims: The purpose of this work is to perform a statistical analysis of the location of compact groups in the Universe from observational and semi-analytical points of view. Methods: We used the velocity-filtered compact group sample extracted from the Two Micron All Sky Survey for our analysis. We also used a new sample of galaxy groups identified in the 2M++ galaxy redshift catalogue as tracers of the large-scale structure. We defined a procedure to search in redshift space for compact groups that can be considered embedded in other overdense systems and applied this criterion to several possible combinations of different compact and galaxy group subsamples. We also performed similar analyses for simulated compact and galaxy groups identified in a 2M++ mock galaxy catalogue constructed from the Millennium Run Simulation I plus a semi-analytical model of galaxy formation. Results: We observed that only ~27% of the compact groups can be considered to be embedded in larger overdense systems, that is, most of the compact groups are more likely to be isolated systems. The embedded compact groups show statistically smaller sizes and brighter surface brightnesses than non-embedded systems. No evidence was found that embedded compact groups are more likely to inhabit galaxy groups with a given virial mass or with a particular dynamical state. We found very similar results when the analysis was performed using mock compact and galaxy groups. Based on the semi-analytical studies, we predict that 70% of the embedded compact groups probably are 3D physically dense systems. Finally, real space information allowed us to reveal the bimodal behaviour of the distribution of 3D minimum distances between compact and galaxy groups. Conclusions: The location of compact groups should be carefully taken into account when comparing properties of galaxies in environments that are a priori different. Appendices are available in electronic form at http://www.aanda.org. Full Tables B.1 and B.2 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/578/A61
NASA Technical Reports Server (NTRS)
Lambert, Winifred C.
2003-01-01
This report describes the results from Phase II of the AMU's Short-Range Statistical Forecasting task for peak winds at the Shuttle Landing Facility (SLF). The peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A seven-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. Probability density functions (PDFs) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. A PC-based Graphical User Interface (GUI) tool was created to display the data quickly.
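The PDF-based forecasting idea can be sketched as an empirical conditional probability: given an observed average speed, estimate the chance that the peak meets or exceeds a threshold from historical pairs of (average, peak). The wind record below is synthetic and the gust-factor model is an assumption, not the AMU's.

```python
# Empirical conditional exceedance sketch on a synthetic wind-tower record.
import numpy as np

rng = np.random.default_rng(2)
avg = rng.gamma(4.0, 2.5, 50_000)                    # knots, assumed climatology
peak = avg * (1.0 + rng.gamma(2.0, 0.15, avg.size))  # assumed gust-factor model

def p_peak_exceeds(threshold, avg_obs, tol=1.0):
    sel = np.abs(avg - avg_obs) < tol                # condition on similar averages
    return (peak[sel] >= threshold).mean()

print("P(peak >= 25 kt | avg = 15 kt) =", round(p_peak_exceeds(25.0, 15.0), 3))
```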
Johnson, Matthew W; Johnson, Patrick S; Herrmann, Evan S; Sweeney, Mary M
2015-01-01
Individuals with cocaine use disorders are disproportionately affected by HIV/AIDS, partly due to higher rates of unprotected sex. Recent research suggests delay discounting of condom use is a factor in sexual HIV risk. Delay discounting is a behavioral economic concept describing how delaying an event reduces that event's value or impact on behavior. Probability discounting is a related concept describing how the uncertainty of an event decreases its impact on behavior. Individuals with cocaine use disorders (n = 23) and matched non-cocaine-using controls (n = 24) were compared in decision-making tasks involving hypothetical outcomes: delay discounting of condom-protected sex (Sexual Delay Discounting Task), delay discounting of money, the effect of sexually transmitted infection (STI) risk on likelihood of condom use (Sexual Probability Discounting Task), and probability discounting of money. The Cocaine group discounted delayed condom-protected sex (i.e., were more likely to have unprotected sex vs. wait for a condom) significantly more than controls in two of four Sexual Delay Discounting Task partner conditions. The Cocaine group also discounted delayed money (i.e., preferred smaller immediate amounts over larger delayed amounts) significantly more than controls. In the Sexual Probability Discounting Task, both groups showed sensitivity to STI risk; however, the groups did not differ. The Cocaine group did not consistently discount probabilistic money more or less than controls. Steeper discounting of delayed, but not probabilistic, sexual outcomes may contribute to greater rates of sexual HIV risk among individuals with cocaine use disorders. Probability discounting of sexual outcomes may contribute to risk of unprotected sex in both groups. Correlations showed sexual and monetary results were unrelated, for both delay and probability discounting. The results highlight the importance of studying specific behavioral processes (e.g., delay and probability discounting) with respect to specific outcomes (e.g., monetary and sexual) to understand decision making in problematic behavior.
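For reference, delay and probability discounting in studies like this one are commonly modeled with hyperbolic forms: V = A/(1 + kD) for delay D, and V = A/(1 + h*theta) with odds against theta = (1 - p)/p for probability p. The k and h values below are arbitrary illustrations, not fitted estimates from this sample.

```python
# Standard hyperbolic discounting functions (parameter values are assumptions).
def delay_discount(amount, delay, k):
    return amount / (1.0 + k * delay)            # V = A / (1 + kD)

def probability_discount(amount, p, h):
    odds_against = (1.0 - p) / p
    return amount / (1.0 + h * odds_against)     # V = A / (1 + h*theta)

# Steeper delay discounting (larger k), as reported for the cocaine group:
print(delay_discount(100.0, delay=30, k=0.05))   # shallow, control-like k
print(delay_discount(100.0, delay=30, k=0.25))   # steep k
print(probability_discount(100.0, p=0.5, h=1.0))
```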
DOT National Transportation Integrated Search
1997-07-22
The Commercial Space Launch Act requires that all commercial licensees demonstrate financial responsibility to compensate for the maximum probable loss (MPL) from claims by a third party for death, bodily injury, or property damage or loss resu...
Predicting the Consequences of MMOD Penetrations on the International Space Station
NASA Technical Reports Server (NTRS)
Hyde, James; Christiansen, E.; Lear, D.; Evans
2018-01-01
The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte Carlo-style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration, broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM, with particular emphasis on describing in broad terms how MSCSurv works, its capabilities, and its most significant models.
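The outcome-classification step can be caricatured in a few lines: sample a damage size for each simulated penetration, then classify it with thresholds standing in for MSCSurv's detailed damage, location, and crew-response models. Everything numeric below is an invented placeholder, and the LEV outcome is omitted for brevity; this is not the NASA code.

```python
# Toy Monte Carlo in the spirit of MSCSurv: classify sampled penetrations.
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
hole_mm = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # assumed hole-size model

outcome = np.full(n, "NEOM", dtype=object)
outcome[hole_mm > 4.0] = "Evac"    # assumed: large leak forces evacuation
outcome[hole_mm > 10.0] = "LOC"    # assumed: very large damage risks the crew
for name in ("LOC", "Evac", "NEOM"):
    print(f"P({name} | penetration) = {(outcome == name).mean():.4f}")
```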
Quantum gravity in timeless configuration space
NASA Astrophysics Data System (ADS)
Gomes, Henrique
2017-12-01
On the path towards quantum gravity we find friction between temporal relations in quantum mechanics (QM) (where they are fixed and field-independent), and in general relativity (where they are field-dependent and dynamic). This paper aims to attenuate that friction, by encoding gravity in the timeless configuration space of spatial fields with dynamics given by a path integral. The framework demands that boundary conditions for this path integral be uniquely given, but unlike other approaches where they are prescribed—such as the no-boundary and the tunneling proposals—here I postulate basic principles to identify boundary conditions in a large class of theories. Uniqueness arises only if a reduced configuration space can be defined and if it has a profoundly asymmetric fundamental structure. These requirements place strong restrictions on the field and symmetry content of theories encompassed here; shape dynamics is one such theory. When these constraints are met, any emerging theory will have a Born rule given merely by a particular volume element built from the path integral in (reduced) configuration space. Also as in other boundary proposals, Time, including space-time, emerges as an effective concept; valid for certain curves in configuration space but not assumed from the start. When some such notion of time becomes available, conservation of (positive) probability currents ensues. I show that, in the appropriate limits, a Schrödinger equation dictates the evolution of weakly coupled source fields on a classical gravitational background. Due to the asymmetry of reduced configuration space, these probabilities and currents avoid a known difficulty of standard WKB approximations for the Wheeler-DeWitt equation in minisuperspace: the selection of a unique Hamilton–Jacobi solution to serve as background. I illustrate these constructions with a simple example of a full quantum gravitational theory (i.e. not in minisuperspace) for which the formalism is applicable, and give a formula for calculating gravitational semi-classical relative probabilities in it.
Shilov, Ignat V; Seymour, Sean L; Patel, Alpesh A; Loboda, Alex; Tang, Wilfred H; Keating, Sean P; Hunter, Christie L; Nuwaysir, Lydia M; Schaeffer, Daniel A
2007-09-01
The Paragon Algorithm, a novel database search engine for the identification of peptides from tandem mass spectrometry data, is presented. Sequence Temperature Values are computed using a sequence tag algorithm, allowing the degree of implication by an MS/MS spectrum of each region of a database to be determined on a continuum. Counter to conventional approaches, features such as modifications, substitutions, and cleavage events are modeled with probabilities rather than by discrete user-controlled settings to consider or not consider a feature. The use of feature probabilities in conjunction with Sequence Temperature Values allows for a very large increase in the effective search space with only a very small increase in the actual number of hypotheses that must be scored. The algorithm has a new kind of user interface that removes the user expertise requirement, presenting control settings in the language of the laboratory that are translated to optimal algorithmic settings. To validate this new algorithm, a comparison with Mascot is presented for a series of analogous searches exploring the relative impact of increasing search space: with Mascot, by relaxing the tryptic digestion conformance requirements from trypsin to semitrypsin to no enzyme; and with the Paragon Algorithm, using its Rapid mode and its Thorough mode with and without tryptic specificity. Although they performed similarly for small search space, dramatic differences were observed in large search space. With the Paragon Algorithm, hundreds of biological and artifact modifications, all possible substitutions, and all levels of conformance to the expected digestion pattern can be searched in a single search step, yet the typical cost in search time is only 2-5 times that of a conventional small search space. Despite this large increase in effective search space, there is no drastic loss of discrimination that typically accompanies the exploration of large search space.
NASA Technical Reports Server (NTRS)
Kerstman, Eric; Saile, Lynn; Freire de Carvalho, Mary; Myers, Jerry; Walton, Marlei; Butler, Douglas; Lopez, Vilma
2011-01-01
Introduction: The Integrated Medical Model (IMM) is a decision support tool that is useful to space flight mission managers and medical system designers in assessing risks and optimizing medical systems. The IMM employs an evidence-based, probabilistic risk assessment (PRA) approach within the operational constraints of space flight. Methods: Stochastic computational methods are used to forecast probability distributions of medical events, crew health metrics, medical resource utilization, and probability estimates of medical evacuation and loss of crew life. The IMM can also optimize medical kits within the constraints of mass and volume for specified missions. The IMM was used to forecast medical evacuation and loss of crew life probabilities, as well as crew health metrics, for a near-Earth asteroid (NEA) mission. An optimized medical kit for this mission was proposed based on the IMM simulation. Discussion: The IMM can provide information to the space program regarding medical risks, including crew medical impairment, medical evacuation, and loss of crew life. This information is valuable to mission managers and the space medicine community in assessing risk and developing mitigation strategies. Exploration missions such as NEA missions will have significant mass and volume constraints applied to the medical system. Appropriate allocation of medical resources will be critical to mission success. The IMM capability of optimizing medical systems based on specific crew and mission profiles will be advantageous to medical system designers. Conclusion: The IMM is a decision support tool that can provide estimates of the impact of medical events on human space flight missions, such as crew impairment, evacuation, and loss of crew life. It can be used to support the development of mitigation strategies and to propose optimized medical systems for specified space flight missions. Learning Objectives: The audience will learn how an evidence-based decision support tool can be used to help assess risk, develop mitigation strategies, and optimize medical systems for exploration space flight missions.
Probability based models for estimation of wildfire risk
Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit
2004-01-01
We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
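The three probabilities chain by simple conditioning; a one-cell sketch with assumed values shows the arithmetic.

```python
# Chaining the three risk components for one 1 km^2 cell-day (values assumed).
p_ignition = 0.004                     # P(fire occurrence)
p_large_given_ignition = 0.05          # P(large fire | ignition)
p_large = p_ignition * p_large_given_ignition   # unconditional P(large fire)
print(f"P(large fire) = {p_large:.6f} per cell-day")
```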
Imaging in syndesmotic injury: a systematic literature review.
Krähenbühl, Nicola; Weinberg, Maxwell W; Davidson, Nathan P; Mills, Megan K; Hintermann, Beat; Saltzman, Charles L; Barg, Alexej
2018-05-01
To give a systematic overview of current diagnostic imaging options for assessment of the distal tibio-fibular syndesmosis. A systematic literature search across the following sources was performed: PubMed, ScienceDirect, Google Scholar, and SpringerLink. Forty-two articles were included and subdivided into three groups: group one consists of studies using conventional radiographs (22 articles), group two includes studies using computed tomography (CT) scans (15 articles), and group three comprises studies using magnetic resonance imaging (MRI; 9 articles). The following data were extracted: imaging modality, measurement method, number of participants and ankles included, average age of participants, sensitivity, specificity, and accuracy of the measurement technique. The Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) tool was used to assess the methodological quality. The three most common techniques used for assessment of the syndesmosis in conventional radiographs are the tibio-fibular clear space (TFCS), the tibio-fibular overlap (TFO), and the medial clear space (MCS). Regarding CT scans, the tibio-fibular width (axial images) was most commonly used. Most of the MRI studies used direct assessment of syndesmotic integrity. Overall, the included studies show low probability of bias and are applicable in daily practice. Conventional radiographs cannot predict syndesmotic injuries reliably. CT scans outperform plain radiographs in detecting syndesmotic mal-reduction. Additionally, the syndesmotic interval can be assessed in greater detail by CT. MRI measurements achieve a sensitivity and specificity of nearly 100%; however, correlating MRI findings with patients' complaints is difficult, and utility with subtle syndesmotic instability needs further investigation. Overall, the methodological quality of these studies was satisfactory.
The Hubble Space Telescope Medium Deep Survey Cluster Sample: Methodology and Data
NASA Astrophysics Data System (ADS)
Ostrander, E. J.; Nichol, R. C.; Ratnatunga, K. U.; Griffiths, R. E.
1998-12-01
We present a new, objectively selected, sample of galaxy overdensities detected in the Hubble Space Telescope Medium Deep Survey (MDS). These clusters/groups were found using an automated procedure that involved searching for statistically significant galaxy overdensities. The contrast of the clusters against the field galaxy population is increased when morphological data are used to search around bulge-dominated galaxies. In total, we present 92 overdensities above a probability threshold of 99.5%. We show, via extensive Monte Carlo simulations, that at least 60% of these overdensities are likely to be real clusters and groups and not random line-of-sight superpositions of galaxies. For each overdensity in the MDS cluster sample, we provide a richness and the average of the bulge-to-total ratio of galaxies within each system. This MDS cluster sample potentially contains some of the most distant clusters/groups ever detected, with about 25% of the overdensities having estimated redshifts z > ~0.9. We have made this sample publicly available to facilitate spectroscopic confirmation of these clusters and help more detailed studies of cluster and galaxy evolution. We also report the serendipitous discovery of a new cluster close on the sky to the rich optical cluster Cl 0016+16 at z = 0.546. This new overdensity, HST 001831+16208, may be coincident with both an X-ray source and a radio source. HST 001831+16208 is the third cluster/group discovered near Cl 0016+16 and appears to strengthen the claims of Connolly et al. of superclustering at high redshift.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murakami, Mário T.; Center for Applied Toxinology, CAT-CEPID, São Paulo, SP; Advanced Center for Genomics and Proteomics, UNESP-State University of São Paulo, São José do Rio Preto 15054-000
2007-07-01
Zhaoermiatoxin, an Arg49 phospholipase A₂ homologue from Zhaoermia mangshanensis (formerly Trimeresurus mangshanensis, Ermia mangshanensis) venom, is a novel member of the PLA₂-homologue family that possesses an arginine residue at position 49, probably arising from a secondary Lys49→Arg substitution that does not alter the catalytic inactivity towards phospholipids. Like other Lys49 PLA₂ homologues, zhaoermiatoxin induces oedema and strong myonecrosis without detectable PLA₂ catalytic activity. A single crystal with maximum dimensions of 0.2 × 0.2 × 0.5 mm was used for X-ray diffraction data collection to a resolution of 2.05 Å using synchrotron radiation; the diffraction pattern was indexed in the hexagonal space group P6₄, with unit-cell parameters a = 72.9, b = 72.9, c = 93.9 Å.
Space Debris Surfaces - Probability of no penetration versus impact velocity and obliquity
NASA Technical Reports Server (NTRS)
Elfer, N.; Meibaum, R.; Olsen, G.
1992-01-01
A collection of computer codes called Space Debris Surfaces (SD-SURF) has been developed to assist in the design and analysis of space debris protection systems. An SD-SURF analysis shows which obliquities and velocities are most likely to cause a penetration, helping the analyst select a shield design best suited to the predominant penetration mechanism. Examples of the interaction between space vehicle geometry, the space debris environment, and the penetration and critical damage ballistic limit surfaces of the shield under consideration are presented.
Malhis, Nawar; Butterfield, Yaron S N; Ester, Martin; Jones, Steven J M
2009-01-01
A plethora of alignment tools have been created that are designed to best fit different types of alignment conditions. While some of these are made for aligning Illumina Sequence Analyzer reads, none of them fully utilizes its probability (prb) output. In this article, we introduce a new alignment approach (Slider) that reduces the alignment problem space by utilizing each read base's probabilities given in the prb files. Compared with other aligners, Slider has higher alignment accuracy and efficiency. In addition, given that Slider matches bases with probabilities other than the most probable, it significantly reduces the percentage of base mismatches. The result is that its SNP predictions are more accurate than other SNP prediction approaches used today that start from the most probable sequence, including those using base quality.
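The core idea of scoring with per-base probabilities, rather than only the most probable called base, can be sketched as follows. The (read_length, 4) prb matrix layout and the toy values are assumptions for illustration, not Slider's actual file format or scoring function.

```python
# Score a candidate alignment position by multiplying, per base, the
# sequencer's probability of the reference base at that position.
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def position_score(prb, reference, start):
    # prb: (read_length, 4) matrix of per-base probabilities (assumed layout)
    ref_idx = [BASES[b] for b in reference[start:start + len(prb)]]
    return float(np.prod(prb[np.arange(len(prb)), ref_idx]))

prb = np.array([[0.90, 0.05, 0.03, 0.02],
                [0.10, 0.80, 0.05, 0.05],
                [0.25, 0.25, 0.40, 0.10]])   # toy 3-base read
print(position_score(prb, "ACGTT", 0))       # scores the read against ref[0:3]
```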
A discussion on the origin of quantum probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel
We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). Highlights: • Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. • We apply Cox's method to the lattice of subspaces of the Hilbert space. • We obtain a derivation of quantum probabilities which includes mixed states. • The method presented in this work is susceptible to generalization. • It includes quantum mechanics and classical mechanics as particular cases.
NASA Technical Reports Server (NTRS)
Morgenthaler, George W.
1989-01-01
The ability to launch on time and to send payloads into space has progressed dramatically since the days of the earliest missile and space programs. Causes for delay during launch, i.e., unplanned 'holds', are attributable to several sources: weather, range activities, vehicle conditions, human performance, etc. Recent developments in space programs, particularly the need for highly reliable logistic support of space construction and the subsequent planned operation of space stations, large unmanned space structures, lunar and Mars bases, and the necessity of providing 'guaranteed' commercial launches, have placed increased emphasis on understanding and mastering every aspect of launch vehicle operations. The Center for Space Construction has acquired historical launch vehicle data and is applying these data to the analysis of space launch vehicle logistic support of space construction. This analysis includes development of a better understanding of launch-on-time capability and simulation of the support systems for vehicle assembly and launch that are necessary to meet national space program construction schedules. In this paper, the author presents actual launch data on unscheduled 'hold' distributions of various launch vehicles, supplied by industrial associate companies of the Center for Space Construction. The paper seeks to determine suitable probability models that describe these historical data and that can serve several purposes, such as providing inputs to broader simulations of launch vehicle logistic space construction support processes and determining which launch operations sources cause the majority of the unscheduled 'holds', and hence suggesting changes which might improve launch-on-time performance. In particular, the paper investigates the ability of a compound distribution probability model to fit actual data, versus alternative models, and recommends the most productive avenues for future statistical work.
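One way to test a compound distribution against such hold data is the negative binomial (a Poisson whose rate varies across launches with a gamma distribution), fit here by the method of moments to a synthetic stand-in for the launch record; the true parameters are invented.

```python
# Method-of-moments fit of a negative binomial (gamma-Poisson compound) to
# synthetic counts of unscheduled holds per launch.
import numpy as np

rng = np.random.default_rng(7)
holds = rng.negative_binomial(n=2, p=0.4, size=300)   # stand-in launch record

m, v = holds.mean(), holds.var()
# NB moments: m = n(1-p)/p, v = n(1-p)/p^2  ->  p = m/v, n = m*p/(1-p)
p_hat = m / v
n_hat = m * p_hat / (1.0 - p_hat)
print(f"fitted NB: n = {n_hat:.2f}, p = {p_hat:.2f} (true: n=2, p=0.4)")
```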
Surface to 90 km winds for Kennedy Space Center, Florida, and Vandenberg AFB, California
NASA Technical Reports Server (NTRS)
Johnson, D. L.; Brown, S. C.
1979-01-01
Bivariate normal wind statistics for a 90 degree flight azimuth, from 0 through 90 km altitude, for Kennedy Space Center, Florida, and Vandenberg AFB, California are presented. Wind probability distributions and statistics for any rotation of axes can be computed from the five given parameters.
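Given the five parameters (two component means, two standard deviations, and the correlation), statistics for any rotated axis follow from a standard change of basis; a sketch with invented wind values:

```python
# Rotate bivariate normal wind statistics to a new flight azimuth.
import numpy as np

def rotate_wind_stats(mu_u, mu_v, sd_u, sd_v, rho, theta_deg):
    th = np.radians(theta_deg)
    c, s = np.cos(th), np.sin(th)
    R = np.array([[c, s], [-s, c]])            # passive rotation of axes
    mu = R @ np.array([mu_u, mu_v])
    cov = np.array([[sd_u**2, rho * sd_u * sd_v],
                    [rho * sd_u * sd_v, sd_v**2]])
    cov_r = R @ cov @ R.T
    sd_r = np.sqrt(np.diag(cov_r))
    return mu, sd_r, cov_r[0, 1] / (sd_r[0] * sd_r[1])

mu, sd, rho = rotate_wind_stats(5.0, -2.0, 6.0, 4.0, 0.3, theta_deg=90.0)
print(mu, sd, rho)   # component statistics along the rotated flight azimuth
```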
Deep Space Detectives: Searching for Planets Suitable for Life
ERIC Educational Resources Information Center
Pallant, Amy; Damelin, Daniel; Pryputniewicz, Sarah
2013-01-01
This article describes the High-Adventure Science curriculum unit "Is There Life in Space?" This free online investigation, developed by The Concord Consortium, helps students see how scientists use modern tools to locate planets around distant stars and explore the probability of finding extraterrestrial life. This innovative curriculum…
Incorporating detection probability into northern Great Plains pronghorn population estimates
Jacques, Christopher N.; Jenks, Jonathan A.; Grovenburg, Troy W.; Klaver, Robert W.; DePerno, Christopher S.
2014-01-01
Pronghorn (Antilocapra americana) abundances commonly are estimated using fixed-wing surveys, but these estimates are likely to be negatively biased because of violations of key assumptions underpinning line-transect methodology. Reducing bias and improving precision of abundance estimates through use of detection probability and mark-resight models may allow for more responsive pronghorn management actions. Given their potential application in population estimation, we evaluated detection probability and mark-resight models for use in estimating pronghorn population abundance. We used logistic regression to quantify how the probability of detecting pronghorn is influenced by group size, animal activity, percent vegetation, cover type, and topography. We estimated pronghorn population size by study area and year using mixed logit-normal mark-resight (MLNM) models. Pronghorn detection probability increased with group size, animal activity, and percent vegetation; overall detection probability was 0.639 (95% CI = 0.612–0.667) with 396 of 620 pronghorn groups detected. Despite model selection uncertainty, the best detection probability models produced estimates 44% (range = 8–79%) and 180% (range = 139–217%) greater than traditional pronghorn population estimates. Similarly, the best MLNM models produced estimates 28% (range = 3–58%) and 147% (range = 124–180%) greater than traditional population estimates. Detection probability of pronghorn was not constant but depended on both intrinsic and extrinsic factors. When pronghorn detection probability is a function of animal group size, animal activity, landscape complexity, and percent vegetation, traditional aerial survey techniques will result in biased pronghorn abundance estimates. Standardizing survey conditions, increasing resighting occasions, or accounting for variation in individual heterogeneity in mark-resight models will increase the accuracy and precision of pronghorn population estimates.
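The detection-probability step is ordinary logistic regression; below is a dependency-free sketch on synthetic survey records, with assumed effect sizes, fit by plain gradient ascent on the log-likelihood (not the authors' software or data).

```python
# Logistic regression of detection (0/1) on group size and activity.
import numpy as np

rng = np.random.default_rng(4)
n = 620
group_size = rng.poisson(5.0, n) + 1
active = rng.integers(0, 2, n)
true_logit = -0.5 + 0.15 * group_size + 0.8 * active   # assumed true effects
detected = (rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), group_size, active])
beta = np.zeros(3)
for _ in range(20_000):              # plain gradient ascent on the log-likelihood
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.05 * X.T @ (detected - p) / n
print("fitted coefficients:", np.round(beta, 2))
print("mean detection probability:", round(p.mean(), 3))
```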
Postfledging survival of European starlings
Krementz, D.G.; Nichols, J.D.; Hines, J.E.
1989-01-01
We tested the hypotheses that mass at fledging and fledge date within the breeding season affect postfledging survival in European Starlings (Sturnus vulgaris). Nestlings were weighed on day 18 after hatch and tagged with individually identifiable patagial tags. Fledge date was recorded. Marked fledglings were resighted during weekly two-day intensive observation periods for 9 weeks postfledging. Post-fledging survival and sighting probabilities were estimated for each of four groups (early or late fledging by heavy or light fledging mass). Body mass was related to post-fledging survival for birds that fledged early. Results were not clear-cut for relative fledge date, although there was weak evidence that this also influenced survival. Highest survival probability estimates occurred in the EARLY-HEAVY group, while the lowest survival estimate occurred in the LATE-LIGHT group. Sighting probabilities differed significantly among groups, emphasizing the need to estimate and compare survival using models which explicitly incorporate sighting probabilities.
NASA Technical Reports Server (NTRS)
1978-01-01
Benefits accruing to the United States from the investment of public and private resources in space industrialization are projected. The future was examined to characterize resource pressures, requirements, and supply (population, energy, materials, food). The backdrop of probable events, attitudes, and trends against which space industrialization will evolve was postulated. The opportunities for space industry that would benefit Earth were compiled and screened against terrestrial alternatives. A cursory market survey was conducted for the selected services and products provided by these initiatives.
NASA Technical Reports Server (NTRS)
Garcia-Ovejero, D.; Trejo, J. L.; Ciriza, I.; Walton, K. D.; Garcia-Segura, L. M.
2001-01-01
Effects of microgravity on postural control and volume of extracellular fluids as well as stress associated with space flight may affect the function of hypothalamic neurosecretory neurons. Since environmental modifications in young animals may result in permanent alterations in neuroendocrine function, the present study was designed to determine the effect of a space flight on oxytocinergic and vasopressinergic magnocellular hypothalamic neurons of prepuberal rats. Fifteen-day-old Sprague-Dawley female rats were flown aboard the Space Shuttle Columbia (STS-90, Neurolab mission, experiment 150) for 16 days. Age-matched litters remained on the ground in cages similar to those of the flight animals. Six animals from each group were killed on the day of landing and eight animals from each group were maintained under standard vivarium conditions and killed 18 weeks after landing. Several signs of enhanced transcriptional and biosynthetic activity were observed in magnocellular supraoptic neurons of flight animals on the day of landing compared to control animals. These include increased c-Fos expression, larger nucleoli and cytoplasm, and a higher volume occupied in the neuronal perikaryon by mitochondria, endoplasmic reticulum, Golgi apparatus, lysosomes and cytoplasmic inclusions known as nematosomes. In contrast, the volume occupied by neurosecretory vesicles in the supraoptic neuronal perikarya was significantly decreased in flight rats. This decrease was associated with a significant decrease in oxytocin and vasopressin immunoreactive levels, suggestive of an increased hormonal release. Vasopressin levels, cytoplasmic volume and c-Fos expression returned to control levels by 18 weeks after landing. These reversible effects were probably associated with osmotic stimuli resulting from modifications in the volume and distribution of extracellular fluids and plasma during flight and landing. However, oxytocin levels were still reduced at 18 weeks after landing in flight animals compared to controls. This indicates that space flight during prepuberal age may induce irreversible modifications in the regulation of oxytocinergic neurons, which in turn may result in permanent endocrine and behavioral impairments.
Intrinsic Bayesian Active Contours for Extraction of Object Boundaries in Images
Srivastava, Anuj
2010-01-01
We present a framework for incorporating prior information about high-probability shapes in the process of contour extraction and object recognition in images. Here one studies shapes as elements of an infinite-dimensional, non-linear quotient space, and statistics of shapes are defined and computed intrinsically using differential geometry of this shape space. Prior models on shapes are constructed using probability distributions on tangent bundles of shape spaces. Similar to the past work on active contours, where curves are driven by vector fields based on image gradients and roughness penalties, we incorporate the prior shape knowledge in the form of vector fields on curves. Through experimental results, we demonstrate the use of prior shape models in the estimation of object boundaries, and their success in handling partial obscuration and missing data. Furthermore, we describe the use of this framework in shape-based object recognition or classification. PMID:21076692
NASA Technical Reports Server (NTRS)
James, John T.
2011-01-01
Safe breathing air for spacefaring crews is essential whether they are inside an Extravehicular Mobility Suit (EMU), a small capsule such as Soyuz, or the expansive International Space Station (ISS). Sources of air pollution can include entry of propellants, excess offgassing from polymeric materials, leakage of systems compounds, escape of payload compounds, over-use of utility compounds, microbial metabolism, and human metabolism. The toxicological risk posed by a compound comprises the probability of its escaping to cause air pollution and the magnitude of adverse effects on human health if escape occurs. The risk from highly toxic compounds is controlled by requiring multiple levels of containment to greatly reduce the probability of escape, whereas compounds that are virtually non-toxic may require little or no containment. The potential for toxicity is determined by the inherent toxicity of the compound and the amount that could potentially escape into the breathing air.
[Risk, uncertainty and ignorance in medicine].
Rørtveit, G; Strand, R
2001-04-30
Exploration of healthy patients' risk factors for disease has become a major medical activity. The rationale behind primary prevention through exploration and therapeutic risk reduction is not separated from the theoretical assumption that every form of uncertainty can be expressed as risk. Distinguishing "risk" (as quantitative probabilities in a known sample space), "strict uncertainty" (when the sample space is known, but probabilities of events cannot be quantified) and "ignorance" (when the sample space is not fully known), a typical clinical situation (primary risk of coronary disease) is analysed. It is shown how strict uncertainty and sometimes ignorance can be present, in which case the orthodox decision theoretical rationale for treatment breaks down. For use in such cases, a different ideal model of rationality is proposed, focusing on the patient's considered reasons. This model has profound implications for the current understanding of medical professionalism as well as for the design of clinical guidelines.
Risk of Skin Cancer from Space Radiation. Chapter 11
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; George, Kerry A.; Wu, Hong-Lu
2003-01-01
We review the methods for estimating the probability of increased incidence of skin cancers from space radiation exposure, and describe some of the individual factors that may contribute to risk projection models, including skin pigment and synergistic effects of combined ionizing and UV exposure. The steep dose gradients from trapped electron, proton, and heavy-ion radiation during EVA, and limitations in EVA dosimetry, are important factors for projecting the skin cancer risk of astronauts. We estimate that the probability of increased skin cancer risk varies more than 10-fold among individual astronauts and that the risk of skin cancer could exceed 1% for future lunar base operations for astronauts with light skin color and hair. Limitations of physical dosimetry in estimating the distribution of dose at the skin suggest that new biodosimetry methods be developed for responding to accidental overexposure of the skin during future space missions.
The global impact distribution of Near-Earth objects
NASA Astrophysics Data System (ADS)
Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.
2016-02-01
Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distribution of 69 potentially threatening NEOs from these lists that produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases with objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking into account the impact probabilities introduced significant variation into the results and the impact probability distribution, consequently, deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and, thus, represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.
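The uniformity comparison rests on the fact that, for impacts uniform over a sphere, the latitude density is proportional to cos(latitude). A quick numerical check of that baseline (a toy illustration, not the ARMOR computation):

```python
# Sample impact points uniformly on a sphere and compare the latitude
# histogram with the cos-latitude density expected under uniformity.
import numpy as np

rng = np.random.default_rng(8)
n = 100_000
lat = np.degrees(np.arcsin(rng.uniform(-1, 1, n)))   # uniform on the sphere
hist, edges = np.histogram(lat, bins=18, range=(-90, 90), density=True)
centers = np.radians(0.5 * (edges[:-1] + edges[1:]))
expected = np.cos(centers) * np.pi / 360             # per-degree density
print(np.round(hist / expected, 2))                  # ratios close to 1 everywhere
```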
NASA Astrophysics Data System (ADS)
Schiavone, Clinton Cleveland
We begin by defining the concept of `open' Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain `boundary' states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow of probability vanishes between all pairs of states. External couplings which fix the probabilities of boundary states can maintain such systems in non-equilibrium steady states in which non-zero probability currents flow. We show that these non-equilibrium steady states minimize a quadratic form which we call 'dissipation.' This is closely related to Prigogine's principle of minimum entropy production. We bound the rate of change of the entropy of a driven non-equilibrium steady state relative to the underlying equilibrium state in terms of the flow of probability through the boundary of the process. We then consider open Markov processes as morphisms in a symmetric monoidal category by splitting up their boundary states into certain sets of `inputs' and `outputs.' Composition corresponds to gluing the outputs of one such open Markov process onto the inputs of another so that the probability flowing out of the first process is equal to the probability flowing into the second. Tensoring in this category corresponds to placing two such systems side by side. We construct a `black-box' functor characterizing the behavior of an open Markov process in terms of the space of possible steady state probabilities and probability currents along the boundary. The fact that this is a functor means that the behavior of a composite open Markov process can be computed by composing the behaviors of the open Markov processes from which it is composed. We prove a similar black-boxing theorem for reaction networks whose dynamics are given by the non-linear rate equation. Along the way we describe a more general category of open dynamical systems where composition corresponds to gluing together open dynamical systems.
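The clamped-boundary steady state described here reduces, for a finite chain, to a linear solve: fix the boundary probabilities and require zero net inflow at each interior state. A minimal sketch with an invented 3-state rate matrix:

```python
# Driven steady state of an open Markov process with clamped boundary states.
import numpy as np

# Rate matrix Q for states [b0, i, b1]; Q[j, k] = rate from k to j, columns sum to 0.
Q = np.array([[-1.0,  0.5,  0.0],
              [ 1.0, -1.5,  2.0],
              [ 0.0,  1.0, -2.0]])
boundary, interior = [0, 2], [1]
p = np.zeros(3)
p[boundary] = [0.7, 0.3]                 # externally clamped boundary probabilities

# Steady state: (Q p)[interior] = 0  ->  solve for the interior entries.
A = Q[np.ix_(interior, interior)]
b = -Q[np.ix_(interior, boundary)] @ p[boundary]
p[interior] = np.linalg.solve(A, b)
print("steady state:", p)
print("boundary currents:", (Q @ p)[boundary])   # nonzero, balanced in and out
```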
Design for Reliability and Safety Approach for the NASA New Launch Vehicle
NASA Technical Reports Server (NTRS)
Safie, Fayssal, M.; Weldon, Danny M.
2007-01-01
The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program, called Constellation, intended to send crew and cargo to the International Space Station (ISS), to the Moon, and beyond. As part of the Constellation program, NASA is developing new launch vehicles aimed at significantly increasing safety and reliability, reducing the cost of accessing space, and providing a growth path for manned space exploration. Achieving these goals requires a rigorous process that addresses reliability, safety, and cost upfront and throughout all the phases of the life cycle of the program. This paper discusses the "Design for Reliability and Safety" approach for the new NASA crew launch vehicle called ARES I. The ARES I is being developed by NASA Marshall Space Flight Center (MSFC) in support of the Constellation program. The ARES I consists of three major elements: a solid First Stage (FS), an Upper Stage (US), and a liquid Upper Stage Engine (USE). Stacked on top of the ARES I is the Crew Exploration Vehicle (CEV). The CEV consists of a Launch Abort System (LAS), Crew Module (CM), Service Module (SM), and a Spacecraft Adapter (SA). The CEV development is being led by NASA Johnson Space Center (JSC). Designing for high reliability and safety requires a good integrated working environment and a sound technical design approach. The "Design for Reliability and Safety" approach addressed in this paper covers both the environment and the technical process put in place to support the ARES I design. To address the integrated working environment, the ARES I project office has established a risk-based design group called the "Operability Design and Analysis" (OD&A) group. This integrated group is intended to bring the engineering, design, and safety organizations together to optimize the system design for safety, reliability, and cost. On the technical side, the ARES I project has, through the OD&A environment, implemented a probabilistic approach to analyze and evaluate design uncertainties and understand their impact on safety, reliability, and cost. This paper focuses on the various probabilistic approaches that have been pursued by the ARES I project. Specifically, the paper discusses an integrated functional probabilistic analysis approach that addresses upfront some key areas to support the ARES I Design Analysis Cycle (DAC) pre-Preliminary Design (PD) phase. This functional approach is a probabilistic, physics-based approach that combines failure probabilities with system dynamics and engineering failure impact models to identify key system risk drivers and potential system design requirements. The paper also discusses other probabilistic risk assessment approaches planned by the ARES I project to support the PD phase and beyond.
Psychophysiology of Spaceflight and Aviation
NASA Technical Reports Server (NTRS)
Cowings, Patricia; Toscano, William
2013-01-01
In space, the absence of gravity alone causes unique physiological stress. Significant biomedical changes, across multiple organ systems, such as body fluid redistribution, diminished musculoskeletal strength, changes in cardiac function and sensorimotor control have been reported. The time course of development of these disorders and severity of symptoms experienced by individuals varies widely. Space motion sickness (SMS) is an example of maladaptation to microgravity, which occurs early in the mission and can have profound effects on physical health and crew performance. Disturbances in sleep quality, perception, emotional equilibrium and mood have also been reported, with impact to health and performance varying widely across individuals. And lastly, post-flight orthostatic intolerance, low blood pressure experienced after returning to Earth, is also of serious concern. Both the Russian and American space programs have a varied list of human errors and mistakes, which adversely impacted mission goals. Continued probability of human exposure to microgravity for extended time periods provides a rationale for the study of the effects of stress. The primary focus of this research group is directed toward examining individual differences in: (a) prediction of susceptibility to these disorders, (b) assessment of symptom severity, (c) evaluation of the effectiveness of countermeasures, and (d) developing and testing a physiological training method, Autogenic-Feedback Training Exercise (AFTE) as a countermeasure with multiple applications. The present paper reports on the results of a series of human flight experiments with AFTE aboard the Space Shuttle and Mir Space Station, and during emergency flight scenarios on Earth.
Local random configuration-tree theory for string repetition and facilitated dynamics of glass
NASA Astrophysics Data System (ADS)
Lam, Chi-Hang
2018-02-01
We derive a microscopic theory of glassy dynamics based on the transport of voids by micro-string motions, each of which involves particles arranged in a line hopping simultaneously displacing one another. Disorder is modeled by a random energy landscape quenched in the configuration space of distinguishable particles, but transient in the physical space as expected for glassy fluids. We study the evolution of local regions with m coupled voids. At a low temperature, energetically accessible local particle configurations can be organized into a random tree with nodes and edges denoting configurations and micro-string propagations respectively. Such trees defined in the configuration space naturally describe systems defined in two- or three-dimensional physical space. A micro-string propagation initiated by a void can facilitate similar motions by other voids via perturbing the random energy landscape, realizing path interactions between voids or equivalently string interactions. We obtain explicit expressions of the particle diffusion coefficient and a particle return probability. Under our approximation, as temperature decreases, random trees of energetically accessible configurations exhibit a sequence of percolation transitions in the configuration space, with local regions containing fewer coupled voids entering the non-percolating immobile phase first. Dynamics is dominated by coupled voids of an optimal group size, which increases as temperature decreases. Comparison with a distinguishable-particle lattice model (DPLM) of glass shows very good quantitative agreements using only two adjustable parameters related to typical energy fluctuations and the interaction range of the micro-strings.
Atmospheric constraint statistics for the Space Shuttle mission planning
NASA Technical Reports Server (NTRS)
Smith, O. E.; Batts, G. W.; Willett, J. A.
1982-01-01
The procedures used to establish statistics of atmospheric constraints of interest to Space Shuttle mission planning are presented. The statistics considered are the frequency of occurrence, runs, and time-conditional probabilities of several atmospheric constraints for each of the Space Shuttle mission phases. The mission phases considered are (1) prelaunch, (2) launch, (3) return to launch site, (4) abort-once-around landing, and (5) end-of-mission landing.
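Run statistics and time-conditional probabilities of this sort can be computed directly from a boolean exceedance series; a sketch on synthetic hourly data (the exceedance rate is an invented placeholder):

```python
# Run lengths and time-conditional probabilities for a constraint series.
import numpy as np

def run_lengths(violated):
    # Lengths of consecutive True runs
    padded = np.concatenate([[0], violated.astype(int), [0]])
    edges = np.flatnonzero(np.diff(padded))
    return edges[1::2] - edges[0::2]

def p_still_violated(violated, lag):
    # P(violated at t + lag | violated at t): the time-conditional probability
    now, later = violated[:-lag], violated[lag:]
    return (now & later).sum() / now.sum()

rng = np.random.default_rng(6)
violated = rng.uniform(size=10_000) < 0.2          # synthetic hourly exceedances
print("mean run length:", run_lengths(violated).mean())
print("P(violated in 3 h | violated now):", round(p_still_violated(violated, 3), 3))
```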
Lessons Learned From The EMU Fire and How It Impacts CxP Suit Element Development and Testing
NASA Technical Reports Server (NTRS)
Metts, Jonathan; Hill, Terry
2008-01-01
During testing, a Space Shuttle Extravehicular Mobility Unit (EMU) pressure garment and life-support backpack were destroyed in a flash fire in the Johnson Space Center's Crew Systems Laboratory. This slide presentation reviews the accident, its probable causes, and the lessons learned, and how these affect the testing, and the environment for testing, of the space suit for the Constellation Program.
Design knowledge capture for the space station
NASA Technical Reports Server (NTRS)
Crouse, K. R.; Wechsler, D. B.
1987-01-01
The benefits of design knowledge availability are identifiable and pervasive. The implementation of design knowledge capture and storage using current technology increases the probability of success, while providing for a degree of access compatibility with future applications. The space station design definition should be expanded to include design knowledge, and that knowledge should be captured. A critical timing relationship exists between the space station development program and the implementation of this project.
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.
2012-01-01
Cancer risk is an important concern for International Space Station (ISS) missions and future exploration missions. An important question concerns the likelihood of a causal association between a crew member's radiation exposure and the occurrence of cancer. The probability of causation (PC), also denoted as attributable risk, is used to make such an estimate. This report summarizes the NASA model of space radiation cancer risks and uncertainties, including improvements to represent uncertainties in tissue-specific cancer incidence models for never-smokers and the U.S. average population. We report on tissue-specific cancer incidence estimates and PC for different post-mission times for ISS and exploration missions. An important conclusion from our analysis is that the NASA policy to limit the risk of exposure-induced death to 3% at the 95% confidence level largely ensures that estimates of the PC for most cancer types would not reach a level of significance. Reducing uncertainties through radiobiological research remains the most efficient method to extend mission length and establish effective mitigators for cancer risks. Efforts to establish biomarkers of space radiation-induced tumors and to estimate PC for rarer tumor types are briefly discussed.
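The probability of causation is conventionally computed from the excess relative risk as PC = ERR/(1 + ERR); a sketch with assumed rates that are not NASA values:

```python
# Probability of causation from baseline and radiation-attributable rates.
def probability_of_causation(baseline_rate, radiation_excess_rate):
    err = radiation_excess_rate / baseline_rate     # excess relative risk
    return err / (1.0 + err)

# Assumed tissue-specific rates (per 100,000 person-years), for illustration:
print(f"PC = {probability_of_causation(50.0, 5.0):.2%}")   # about 9% attributable
```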
Physically detached 'compact groups'
NASA Technical Reports Server (NTRS)
Hernquist, Lars; Katz, Neal; Weinberg, David H.
1995-01-01
A small fraction of galaxies appear to reside in dense compact groups, whose inferred crossing times are much shorter than a Hubble time. These short crossing times have led to considerable disagreement among researchers attempting to deduce the dynamical state of these systems. In this paper, we suggest that many of the observed groups are not physically bound but are chance projections of galaxies well separated along the line of sight. Unlike earlier similar proposals, ours does not require that the galaxies in the compact group be members of a more diffuse, but physically bound entity. The probability of physically separated galaxies projecting into an apparent compact group is nonnegligible if most galaxies are distributed in thin filaments. We illustrate this general point with a specific example: a simulation of a cold dark matter universe, in which hydrodynamic effects are included to identify galaxies. The simulated galaxy distribution is filamentary and end-on views of these filaments produce apparent galaxy associations that have sizes and velocity dispersions similar to those of observed compact groups. The frequency of such projections is sufficient, in principle, to explain the observed space density of groups in the Hickson catalog. We discuss the implications of our proposal for the formation and evolution of groups and elliptical galaxies. The proposal can be tested by using redshift-independent distance estimators to measure the line-of-sight spatial extent of nearby compact groups.
A synaptic organizing principle for cortical neuronal groups
Perin, Rodrigo; Berger, Thomas K.; Markram, Henry
2011-01-01
Neuronal circuitry is often considered a clean slate that can be dynamically and arbitrarily molded by experience. However, when we investigated synaptic connectivity in groups of pyramidal neurons in the neocortex, we found that both connectivity and synaptic weights were surprisingly predictable. Synaptic weights follow very closely the number of connections in a group of neurons, saturating after only 20% of possible connections are formed between neurons in a group. When we examined the network topology of connectivity between neurons, we found that the neurons cluster into small world networks that are not scale-free, with less than 2 degrees of separation. We found a simple clustering rule where connectivity is directly proportional to the number of common neighbors, which accounts for these small world networks and accurately predicts the connection probability between any two neurons. This pyramidal neuron network clusters into multiple groups of a few dozen neurons each. The neurons composing each group are surprisingly distributed, typically more than 100 μm apart, allowing for multiple groups to be interlaced in the same space. In summary, we discovered a synaptic organizing principle that groups neurons in a manner that is common across animals and hence, independent of individual experiences. We speculate that these elementary neuronal groups are prescribed Lego-like building blocks of perception and that acquired memory relies more on combining these elementary assemblies into higher-order constructs. PMID:21383177
Multiple Streaming and the Probability Distribution of Density in Redshift Space
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hui, Lam; Kofman, Lev; Shandarin, Sergei F.
2000-07-01
We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S_3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated. (c) 2000 The American Astronomical Society.
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
2010-12-01
In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
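As a concrete illustration of evolving the flow statistics directly rather than sampling them, the sketch below solves a one-dimensional Fokker-Planck equation for a toy Ornstein-Uhlenbeck process and checks its mean and variance against a Monte Carlo ensemble. It is a minimal stand-in, not the paper's Saint-Venant solver: the process, grid, and coefficients are all assumptions chosen for brevity.

```python
import numpy as np

# Toy stand-in for the paper's idea: evolve the probability density of an
# uncertain state with a 1-D Fokker-Planck equation (here for an
# Ornstein-Uhlenbeck process, dX = -theta*X dt + sigma dW) and compare
# the resulting moments with a Monte Carlo ensemble.
theta, sigma = 1.0, 0.5
x = np.linspace(-3.0, 3.0, 601)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / sigma**2            # stability margin for explicit scheme
p = np.exp(-0.5 * (x - 1.5)**2 / 0.1**2)
p /= np.trapz(p, x)                    # initial density concentrated at 1.5

t, t_end = 0.0, 0.5
while t < t_end:
    # FPE for OU: dp/dt = d/dx(theta*x*p) + (sigma^2/2) d^2p/dx^2
    p = p + dt * (np.gradient(theta * x * p, dx)
                  + 0.5 * sigma**2 * np.gradient(np.gradient(p, dx), dx))
    p = np.clip(p, 0.0, None)
    p /= np.trapz(p, x)                # guard against mass drift
    t += dt

mean_fpe = np.trapz(x * p, x)
var_fpe = np.trapz((x - mean_fpe)**2 * p, x)

# Monte Carlo reference ensemble for the same process.
rng = np.random.default_rng(0)
X, h = np.full(100_000, 1.5), 1e-3
for _ in range(int(t_end / h)):
    X += -theta * X * h + sigma * np.sqrt(h) * rng.standard_normal(X.size)

print(f"FPE mean={mean_fpe:.3f} var={var_fpe:.3f}")
print(f"MC  mean={X.mean():.3f} var={X.var():.3f}")
```

Even in this toy the appeal noted in the abstract is visible: one deterministic PDE solve replaces the whole ensemble of stochastic simulations.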
Combined loading criterial influence on structural performance
NASA Technical Reports Server (NTRS)
Kuchta, B. J.; Sealey, D. M.; Howell, L. J.
1972-01-01
An investigation was conducted to determine the influence of combined loading criteria on the space shuttle structural performance. The study consisted of four primary phases: Phase (1) The determination of the sensitivity of structural weight to various loading parameters associated with the space shuttle. Phase (2) The determination of the sensitivity of structural weight to various levels of loading parameter variability and probability. Phase (3) The determination of shuttle mission loading parameters variability and probability as a function of design evolution and the identification of those loading parameters where inadequate data exists. Phase (4) The determination of rational methods of combining both deterministic time varying and probabilistic loading parameters to provide realistic design criteria. The study results are presented.
Wang, Zhiping; Cao, Dewei; Yu, Benli
2016-05-01
We present a new scheme for three-dimensional (3D) atom localization in a three-level atomic system via measuring the absorption of a weak probe field. Owing to the space-dependent atom-field interaction, the position probability distribution of the atom can be directly determined by measuring the probe absorption. It is found that, by properly varying the parameters of the system, the probability of finding the atom in 3D space can approach 100%. Our scheme opens a promising route to high-precision, high-efficiency 3D atom localization, with potential applications in laser cooling and atom nanolithography.
NASA Technical Reports Server (NTRS)
Han, D.; Kim, Y. S.; Noz, Marilyn E.
1989-01-01
It is possible to calculate expectation values and transition probabilities from the Wigner phase-space distribution function. Based on the canonical transformation properties of the Wigner function, an algorithm is developed for calculating these quantities in quantum optics for coherent and squeezed states. It is shown that the expectation value of a dynamical variable can be written in terms of the vacuum expectation value of the canonically transformed variable. Parallel-axis theorems are established for the photon number and its variance. It is also shown that the transition probability between two squeezed states can be reduced to that of the transition from one squeezed state to the vacuum.
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.
An account is given of the method used to quantify the risks accruing to the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.
Probability of the moiré effect in barrier and lenticular autostereoscopic 3D displays.
Saveljev, Vladimir; Kim, Sung-Kyu
2015-10-05
The probability of the moiré effect in LCD displays is estimated as a function of angle based on experimental data; a theoretical function (node spacing) is proposed based on the distance between nodes. The two functions are close to each other. A connection between the probability of the moiré effect and Thomae's function is also found. The function proposed in this paper can be used to minimize the moiré effect in visual displays, especially in autostereoscopic 3D displays.
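For readers unfamiliar with Thomae's function, which the abstract connects to the moiré probability, the following minimal sketch evaluates it at rationals; the interpretation in the comment (that large values correspond to "simple" angle ratios where moiré is most visible) is our gloss, not a claim from the paper.

```python
from fractions import Fraction

def thomae(p: int, q: int) -> Fraction:
    """Thomae's function at the rational p/q: equals 1/q once p/q is in
    lowest terms (at irrationals the function is 0, but only rationals
    are representable here)."""
    return Fraction(1, Fraction(p, q).denominator)

# Larger values occur at 'simple' rationals; read against the abstract,
# this mirrors the intuition that moire visibility peaks when the
# relevant angle ratio is close to a small-denominator rational.
for p, q in [(1, 2), (2, 4), (3, 7), (13, 50)]:
    print(f"thomae({p}/{q}) = {thomae(p, q)}")
```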
Gap probability - Measurements and models of a pecan orchard
NASA Technical Reports Server (NTRS)
Strahler, Alan H.; Li, Xiaowen; Moody, Aaron; Liu, YI
1992-01-01
Measurements and models are compared for gap probability in a pecan orchard. Measurements are based on panoramic photographs with a 50° by 135° view angle, made under the canopy looking upward at regular positions along transects between orchard trees. The gap probability model is driven by geometric parameters at two levels, crown and leaf. Crown-level parameters include the shape of the crown envelope and the spacing of crowns; leaf-level parameters include leaf size and shape, leaf area index, and leaf angle, all as functions of canopy position.
Electrodynamic Dust Shield for Space Applications
NASA Technical Reports Server (NTRS)
Mackey, P. J.; Johansen, M. R.; Olsen, R. C.; Raines, M. G.; Phillips, J. R., III; Pollard, J. R. S.; Calle, C. I.
2016-01-01
The International Space Exploration Coordination Group (ISECG) has chosen dust mitigation technology as a Global Exploration Roadmap (GER) critical technology need in order to reduce life cycle cost and risk, and increase the probability of mission success. NASA has also included Particulate Contamination Prevention and Mitigation as a cross-cutting technology to be developed for contamination prevention, cleaning and protection. This technology has been highlighted due to the detrimental effect of dust on both human and robotic missions. During manned Apollo missions, dust caused issues with both equipment and crew. Contamination of equipment caused many issues including incorrect instrument readings and increased temperatures due to masking of thermal radiators. The astronauts were directly affected by dust that covered space suits, obscured face shields and later propagated to the cabin and into the crew's eyes and lungs. Robotic missions on Mars were affected when solar panels were obscured by dust, thereby reducing the effectiveness of the solar panels. The Electrostatics and Surface Physics Lab in Swamp Works at the Kennedy Space Center has been developing an Electrodynamic Dust Shield (EDS) to remove dust from multiple surfaces, including glass shields and thermal radiators. This technology has been tested in lab environments and has evolved over several years. Tests of the technology include reduced-gravity flights (one-sixth g) in which Apollo lunar dust samples were successfully removed from glass shields while under vacuum (10^-6 kPa).
Decision analysis with approximate probabilities
NASA Technical Reports Server (NTRS)
Whalen, Thomas
1992-01-01
This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied because some key issues do not arise in two-state problems, while probability spaces with more than three states are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one, the three probabilities are separately rounded to the nearest tenth, which can lead to sets of rounded probabilities that sum to 0.9, 1.0, or 1.1. In the other, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decision making using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight levels of information about probability. The Extended Laplace criterion, which uses a second-order maximum entropy principle, performed best overall.
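A hedged sketch of the constrained-rounding setup is given below: it enumerates every probability triple rounded to tenths that sums exactly to 1.0 and compares two of the four criteria (Maximin and Standard Laplace) on a hypothetical payoff table. The Midpoint and Extended Laplace criteria require details not stated in the abstract, so they are omitted; the payoff values are illustrative assumptions.

```python
import itertools
import numpy as np

# Hypothetical payoff table: rows = actions, columns = three states.
payoff = np.array([[10.0, 2.0, -4.0],
                   [ 6.0, 5.0,  1.0],
                   [ 3.0, 3.0,  3.0]])

# The constrained information level: all triples in tenths summing to 1.0.
triples = [np.array(t) / 10.0
           for t in itertools.product(range(11), repeat=3) if sum(t) == 10]

def maximin_action(payoff):
    # Ignores the probabilities: best worst-case payoff.
    return int(np.argmax(payoff.min(axis=1)))

def laplace_action(payoff):
    # Standard Laplace: all states treated as equally likely.
    return int(np.argmax(payoff.mean(axis=1)))

def best_action(payoff, p):
    # Oracle choice if p were the true state probabilities.
    return int(np.argmax(payoff @ p))

agree = sum(best_action(payoff, p) == laplace_action(payoff) for p in triples)
print(f"{len(triples)} constrained triples; Laplace matches the oracle on "
      f"{agree}; maximin always picks action {maximin_action(payoff)}")
```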
Ethnic Group Bias in Intelligence Test Items.
ERIC Educational Resources Information Center
Scheuneman, Janice
In previous studies of ethnic group bias in intelligence test items, the question of bias has been confounded with ability differences between the ethnic group samples compared. The present study is based on a conditional probability model in which an unbiased item is defined as one where the probability of a correct response to an item is the…
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.; Roeder, William; Merceret, Francis J.
2010-01-01
A technique has been developed to calculate the probability that any nearby lightning stroke is within any radius of any point of interest. In practice, this provides the probability that a nearby lightning stroke was within a key distance of a facility, rather than the error ellipses centered on the stroke. This process takes the current bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to get the probability that the stroke is inside any specified radius. This new facility-centric technique will be much more useful to the space launch customers and may supersede the lightning error ellipse approach discussed in [5], [6].
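The computation described, integrating the stroke location's bivariate Gaussian error distribution over a disk around a point of interest, is easy to sketch by Monte Carlo; the ellipse parameters, facility location, and key distance below are hypothetical.

```python
import numpy as np

def prob_stroke_within(mu, cov, facility, radius, n=1_000_000, seed=1):
    """Monte Carlo estimate that a stroke whose location uncertainty is
    bivariate Gaussian (mean mu, covariance cov, i.e. the error ellipse)
    fell within `radius` of `facility`."""
    rng = np.random.default_rng(seed)
    pts = rng.multivariate_normal(mu, cov, size=n)
    d = np.hypot(pts[:, 0] - facility[0], pts[:, 1] - facility[1])
    return float(np.mean(d <= radius))

# Hypothetical case: stroke located 400 m east of a pad, error-ellipse
# semi-axes of roughly 250 m and 150 m, key distance 0.5 km.
p = prob_stroke_within(mu=[400.0, 0.0],
                       cov=[[250.0**2, 0.0], [0.0, 150.0**2]],
                       facility=(0.0, 0.0), radius=500.0)
print(f"P(stroke within 0.5 km of facility) ~ {p:.3f}")
```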
Poisson statistics of PageRank probabilities of Twitter and Wikipedia networks
NASA Astrophysics Data System (ADS)
Frahm, Klaus M.; Shepelyansky, Dima L.
2014-04-01
We use the methods of quantum chaos and Random Matrix Theory to analyze statistical fluctuations of PageRank probabilities in directed networks. In this approach the effective energy levels are given by the logarithm of the PageRank probability at a given node. After the standard energy-level unfolding procedure, we establish that the nearest-spacing distribution of PageRank probabilities is described by the Poisson law typical of integrable quantum systems. Our studies are done for the Twitter network and three networks of Wikipedia editions in English, French, and German. We argue that, due to the absence of level repulsion, the PageRank order of nearby nodes can be easily interchanged. The obtained Poisson law implies that nearby PageRank probabilities fluctuate as random independent variables.
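A minimal sketch of this pipeline, on a synthetic random directed graph rather than the Twitter or Wikipedia data, might look as follows: compute PageRank by power iteration, treat -log p as energy levels, unfold with a smooth fit to the cumulative level count, and compare nearest-neighbour spacings with the Poisson law P(s) = exp(-s). Graph size, sparsity, and the unfolding polynomial degree are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
A = (rng.random((n, n)) < 0.005).astype(float)   # synthetic directed graph
np.fill_diagonal(A, 0.0)

# Column-stochastic Google matrix with damping alpha; dangling columns
# are replaced by the uniform vector.
out_deg = A.sum(axis=1)
S = np.where(out_deg[None, :] > 0,
             A.T / np.maximum(out_deg, 1.0)[None, :], 1.0 / n)
alpha, p = 0.85, np.full(n, 1.0 / n)
for _ in range(200):                              # power iteration
    p = alpha * S @ p + (1.0 - alpha) / n
p /= p.sum()

# 'Energy levels' E = -log(PageRank), unfolded with a smooth polynomial
# fit to the cumulative level count so the mean spacing becomes 1.
E = np.sort(-np.log(p))
xi = np.polyval(np.polyfit(E, np.arange(1, n + 1), deg=7), E)
s = np.diff(xi)
s = s[s > 0]

# Compare the spacing histogram with the Poisson law P(s) = exp(-s).
hist, edges = np.histogram(s, bins=30, range=(0.0, 4.0), density=True)
mid = 0.5 * (edges[1:] + edges[:-1])
for m, h in list(zip(mid, hist))[:8]:
    print(f"s={m:.2f}  empirical={h:.2f}  poisson={np.exp(-m):.2f}")
```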
Relative commutativity degree of some dihedral groups
NASA Astrophysics Data System (ADS)
Abdul Hamid, Muhanizah; Mohd Ali, Nor Muhainiah; Sarmin, Nor Haniza; Abd Manaf, Fadila Normahia
2013-04-01
The commutativity degree of a finite group G was introduced by Erdős and Turán, who studied it for symmetric groups, finite groups, and finite rings in 1968. The commutativity degree, P(G), is defined as the probability that a random pair of elements in a group commute. The relative commutativity degree of a group G is defined as the probability that an element of a subgroup H and an element of G commute with one another, denoted by P(H,G). In this research, the relative commutativity degree of some dihedral groups is determined.
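The definition lends itself to a direct brute-force check. The sketch below multiplies elements of the dihedral group D_n (order 2n, presented by r^n = s^2 = e and s r = r^{-1} s), counts commuting pairs, and compares with the standard closed forms P(D_n) = (n+3)/(4n) for odd n and (n+6)/(4n) for even n.

```python
from itertools import product

def compose(a, b, n):
    """Multiply s^f r^i by s^g r^j in D_n, using s r = r^{-1} s."""
    (f, i), (g, j) = a, b
    return (f, (i + j) % n) if g == 0 else ((f + 1) % 2, (j - i) % n)

def commutativity_degree(n):
    G = [(f, i) for f in (0, 1) for i in range(n)]
    pairs = sum(compose(a, b, n) == compose(b, a, n)
                for a, b in product(G, repeat=2))
    return pairs / len(G) ** 2

for n in (3, 4, 5, 6):
    closed = (n + 3) / (4 * n) if n % 2 else (n + 6) / (4 * n)
    print(f"D_{n}: P(G) = {commutativity_degree(n):.4f} "
          f"(closed form {closed:.4f})")
```

The relative degree P(H,G) is the same count restricted to pairs with one element drawn from the subgroup H.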
Mathematical Methods for Physics and Engineering Third Edition Paperback Set
NASA Astrophysics Data System (ADS)
Riley, Ken F.; Hobson, Mike P.; Bence, Stephen J.
2006-06-01
Prefaces; 1. Preliminary algebra; 2. Preliminary calculus; 3. Complex numbers and hyperbolic functions; 4. Series and limits; 5. Partial differentiation; 6. Multiple integrals; 7. Vector algebra; 8. Matrices and vector spaces; 9. Normal modes; 10. Vector calculus; 11. Line, surface and volume integrals; 12. Fourier series; 13. Integral transforms; 14. First-order ordinary differential equations; 15. Higher-order ordinary differential equations; 16. Series solutions of ordinary differential equations; 17. Eigenfunction methods for differential equations; 18. Special functions; 19. Quantum operators; 20. Partial differential equations: general and particular; 21. Partial differential equations: separation of variables; 22. Calculus of variations; 23. Integral equations; 24. Complex variables; 25. Application of complex variables; 26. Tensors; 27. Numerical methods; 28. Group theory; 29. Representation theory; 30. Probability; 31. Statistics; Index.
NASA Astrophysics Data System (ADS)
Hur, Gwang-Ok
The δ-kicked rotor is a paradigm of quantum chaos. Its realisation with clouds of cold atoms in pulsed optical lattices demonstrated the well-known quantum chaos phenomenon of 'dynamical localisation'. In those experiments by several groups world-wide, the δ-kicks were applied at equal time intervals. However, recent theoretical and experimental work by the cold atom group at UCL (Monteiro et al 2002, Jonckheere et al 2003, Jones et al 2004) showed that novel quantum and classical dynamics arises if the atomic cloud is pulsed with repeating sequences of unequally spaced kicks. In Monteiro et al 2002 it was found that the energy absorption rates depend on the momentum of the atoms relative to the optical lattice, hence a type of chaotic ratchet was proposed. In Jonckheere et al and Jones et al, a possible mechanism for selecting atoms according to their momenta (velocity filter) was investigated. The aim of this thesis was to study the properties of the underlying eigenvalues and eigenstates. Despite the unequally spaced kicks, these systems are still time-periodic, so we in fact investigated the Floquet states, which are eigenstates of U(T), the one-period time evolution operator. The Floquet states and corresponding eigenvalues were obtained by diagonalising a matrix representation of the operator U(T). It was found that the form of the eigenstates enables us to analyse qualitatively the atomic momentum probability distributions, N(p), measured experimentally. In particular, the momentum width of the individual eigenstates varies strongly with <p>, as expected from the theoretical and experimental results obtained previously. In addition, at specific <p> close to values which in the experiment yield directed motion (ratchet transport), the probability distribution of the individual Floquet states is asymmetric, mirroring the asymmetric N(p) measured in clouds of cesium atoms. In the penultimate chapter, the spectral fluctuations (eigenvalue statistics) are investigated for one particular system, the double-delta kicked rotor. We computed nearest-neighbour spacing (NNS) distributions as well as the number variances (Σ₂ statistics). We find that even in regimes where the corresponding classical dynamics are fully chaotic, the statistics are, unexpectedly, intermediate between fully chaotic (GOE) and fully regular (Poisson). It is argued that they are analogous to the critical statistics seen at the Anderson metal-insulator transition.
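For the standard equal-interval kicked rotor, the diagonalisation described above can be sketched compactly in a truncated momentum basis; the unequally spaced variants studied in the thesis would simply chain more free-evolution factors into U(T). The kick strength and period below are illustrative assumptions.

```python
import numpy as np
from scipy.special import jv
from scipy.linalg import eig

# Equal-interval delta-kicked rotor in a truncated momentum basis
# n = -N..N: U(T) = exp(-i*tau*n^2/2) @ exp(-i*K*cos(theta)). By the
# Jacobi-Anger expansion, <n|exp(-i K cos)|m> = (-i)^(n-m) J_{n-m}(K).
N, K, tau = 128, 5.0, 1.0                 # illustrative values
n = np.arange(-N, N + 1)
dn = n[:, None] - n[None, :]
kick = (-1j) ** dn * jv(dn, K)
U = np.diag(np.exp(-1j * tau * n**2 / 2)) @ kick

# Floquet states are eigenvectors of U; eigenvalue phases give the
# quasi-energies (unit modulus up to basis-truncation error).
evals, evecs = eig(U)
quasi_energies = np.angle(evals)

# Momentum width of each Floquet state, the quantity the thesis relates
# to the measured momentum distributions N(p).
prob = np.abs(evecs) ** 2
mean_n = prob.T @ n
width = np.sqrt(np.maximum(prob.T @ n**2 - mean_n**2, 0.0))
print(f"median momentum width: {np.median(width):.2f}")
```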
Electron number probability distributions for correlated wave functions.
Francisco, E; Martín Pendás, A; Blanco, M A
2007-03-07
Efficient formulas for computing the probability of finding exactly an integer number of electrons in an arbitrarily chosen volume are only known for single-determinant wave functions [E. Cances et al., Theor. Chem. Acc. 111, 373 (2004)]. In this article, an algebraic method is presented that extends these formulas to the case of multideterminant wave functions and any number of disjoint volumes. The derived expressions are applied to compute the probabilities within the atomic domains derived from the space partitioning based on the quantum theory of atoms in molecules. Results for a series of test molecules are presented, paying particular attention to the effects of electron correlation and of some numerical approximations on the computed probabilities.
Research and technology report, 1981
NASA Technical Reports Server (NTRS)
1981-01-01
The Marshall Space Flight Center programs of research and technology for 1981 in various areas of aerospace science are reviewed. Each activity reviewed has a high probability of application to current or future programs or is an application of the results of current programs. Projects in atmospheric and magnetospheric science, solar physics, astronomy, and space technology are included.
Recovering a Probabilistic Knowledge Structure by Constraining Its Parameter Space
ERIC Educational Resources Information Center
Stefanutti, Luca; Robusto, Egidio
2009-01-01
In the Basic Local Independence Model (BLIM) of Doignon and Falmagne ("Knowledge Spaces," Springer, Berlin, 1999), the probabilistic relationship between the latent knowledge states and the observable response patterns is established by the introduction of a pair of parameters for each of the problems: a lucky guess probability and a careless…
Quintinite-1M from the Mariinsky Deposit, Ural Emerald Mines, Central Urals, Russia
NASA Astrophysics Data System (ADS)
Zhitova, E. S.; Popov, M. P.; Krivovichev, S. V.; Zaitsev, A. N.; Vlasenko, N. S.
2017-12-01
The paper describes the first finding of quintinite, [Mg4Al2(OH)12][(CO3)(H2O)3], at the Mariinsky deposit in the Central Urals, Russia. The mineral occurs as white tabular crystals in cavities within altered gabbro, in association with prehnite, calcite, and a chlorite-group mineral. Quintinite is the probable result of late hydrothermal alteration of the primary mafic and ultramafic rocks hosting emerald-bearing glimmerite. According to electron microprobe data, the Mg:Al ratio is 2:1. IR spectroscopy has revealed hydroxyl and carbonate groups and H2O molecules in the mineral. According to single-crystal XRD data, quintinite is monoclinic, space group C2/m, a = 5.233(1), b = 9.051(2), c = 7.711(2) Å, β = 103.09(3)°, V = 355.7(2) Å³. Based on the structure refinement, the polytype should be denoted 1M. This is the third approved occurrence of quintinite-1M in the world, after the Kovdor complex and the Bazhenovsky chrysotile-asbestos deposit.
NASA Technical Reports Server (NTRS)
Majewski, Steven R.; Munn, Jeffrey A.; Hawley, Suzanne L.
1994-01-01
Radial velocities have been obtained for six of nine stars identified on the basis of similar distances and common, extreme transverse velocities in the proper motion survey of Majewski (1992) as a candidate halo moving group at the north Galactic pole. These radial velocities correspond to velocities perpendicular to the Galactic plane which span the range -48 +/- 21 to -128 +/- 9 km/sec (but a smaller range, -48 +/- 21 to -86 +/- 19 km/sec, when only our own measurements are considered), significantly different than the expected distribution, with mean 0 km/sec, for a random sample of either halo or thick disk stars. The probability of picking such a set of radial velocities at random is less than 1%. Thus the radial velocity data support the hypothesis that these stars constitute part of a halo moving group or star stream at a distance of approximately 4-5 kpc above the Galactic plane. If real, this moving group is evidence for halo phase space substructure which may be the fossil remains of a destroyed globular cluster, Galactic satellite, or Searle & Zinn (1978) 'fragment.'
NASA Astrophysics Data System (ADS)
Zhang, Yumin; Wang, Qing-Guo; Lum, Kai-Yew
2009-03-01
In this paper, an H-infinity fault detection and diagnosis (FDD) scheme for a class of discrete nonlinear systems using output probability density estimation is presented. Unlike classical FDD problems, the measured output of the system is viewed as a stochastic process and its square root probability density function (PDF) is modeled with B-spline functions, which leads to a deterministic space-time dynamic model including nonlinearities and uncertainties. A weighted mean value is given as an integral function of the square root PDF along the space direction, which leads to a function of time only that can be used to construct the residual signal. Thus, the classical nonlinear filter approach can be used to detect and diagnose faults in the system. A feasible detection criterion is obtained first, and a new H-infinity adaptive fault diagnosis algorithm is further investigated to estimate the fault. A simulation example is given to demonstrate the effectiveness of the proposed approaches.
Probing the statistics of transport in the Hénon Map
NASA Astrophysics Data System (ADS)
Alus, O.; Fishman, S.; Meiss, J. D.
2016-09-01
The phase space of an area-preserving map typically contains infinitely many elliptic islands embedded in a chaotic sea. Orbits near the boundary of a chaotic region have been observed to stick for long times, strongly influencing their transport properties. The boundary is composed of invariant "boundary circles." We briefly report recent results on the distribution of rotation numbers of boundary circles for the Hénon quadratic map and show that the probability of occurrence of small integer entries in their continued fraction expansions is larger than would be expected for a number chosen at random. Large integer entries, however, occur with probabilities distributed proportionally to the random case. The probability distributions of ratios of fluxes through island chains are reported as well; these island chains are neighbours in the sense of the Meiss-Ott Markov-tree model. Two distinct universality families are found. The distributions of the ratio between flux and orbital period are also presented. All of these results have implications for models of transport in mixed phase space.
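The continued-fraction statistics at issue can be made concrete with a short sketch: compute entries for a sample of uniformly random numbers and compare entry frequencies with the Gauss-Kuzmin law P(a = k) = -log2(1 - 1/(k+1)^2), the "random case" against which the boundary-circle rotation numbers are contrasted.

```python
import numpy as np

def cf_entries(x, n_terms=12):
    """Continued-fraction entries of x in (0,1): x = 1/(a1 + 1/(a2 + ...)).
    Only a few terms are kept, since double precision degrades the tail."""
    entries = []
    for _ in range(n_terms):
        if x < 1e-12:
            break
        x = 1.0 / x
        a = int(x)
        entries.append(a)
        x -= a
    return entries

rng = np.random.default_rng(7)
sample = np.array([a for _ in range(5000) for a in cf_entries(rng.random())])
for k in (1, 2, 3, 4):
    gk = -np.log2(1.0 - 1.0 / (k + 1) ** 2)     # Gauss-Kuzmin probability
    print(f"k={k}: empirical {np.mean(sample == k):.4f}  "
          f"Gauss-Kuzmin {gk:.4f}")
```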
Multi-beam transmitter geometries for free-space optical communications
NASA Astrophysics Data System (ADS)
Tellez, Jason A.; Schmidt, Jason D.
2010-02-01
Free-space optical communications systems provide the opportunity to take advantage of higher data transfer rates and lower probability of intercept compared to radio-frequency communications. However, propagation through atmospheric turbulence, such as for airborne laser communication over long paths, results in intensity variations at the receiver and a corresponding degradation in bit error rate (BER) performance. Previous literature has shown that two transmitters, when separated sufficiently, can effectively average out the intensity varying effects of the atmospheric turbulence at the receiver. This research explores the impacts of adding more transmitters and the marginal reduction in the probability of signal fades while minimizing the overall transmitter footprint, an important design factor when considering an airborne communications system. Analytical results for the cumulative distribution function are obtained for tilt-only results, while wave-optics simulations are used to simulate the effects of scintillation. These models show that the probability of signal fade is reduced as the number of transmitters is increased.
Probability Distributions for Random Quantum Operations
NASA Astrophysics Data System (ADS)
Schultz, Kevin
Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.
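One standard ingredient behind such constructions is sampling unitaries from the Haar measure, e.g. by QR decomposition of a complex Ginibre matrix with a phase correction (Mezzadri's recipe). The sketch below shows that step only; the closing comment about dilations is our paraphrase of the usual route to random quantum operations, not the paper's specific construction.

```python
import numpy as np

def haar_unitary(n, rng):
    """Haar-random n x n unitary via QR of a complex Ginibre matrix,
    with column phases fixed so the distribution is exactly Haar."""
    z = (rng.standard_normal((n, n))
         + 1j * rng.standard_normal((n, n))) / np.sqrt(2.0)
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))          # rescale each column's phase

rng = np.random.default_rng(0)
U = haar_unitary(4, rng)
print("unitary:", np.allclose(U.conj().T @ U, np.eye(4)))
# Acting with such a unitary on system + environment and tracing out the
# environment yields a random quantum operation (standard dilation route).
```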
NASA Astrophysics Data System (ADS)
Korepanov, Alexey
2017-12-01
Let {T : M \\to M} be a nonuniformly expanding dynamical system, such as logistic or intermittent map. Let {v : M \\to R^d} be an observable and {v_n = \\sum_{k=0}^{n-1} v circ T^k} denote the Birkhoff sums. Given a probability measure {μ} on M, we consider v n as a discrete time random process on the probability space {(M, μ)} . In smooth ergodic theory there are various natural choices of {μ} , such as the Lebesgue measure, or the absolutely continuous T-invariant measure. They give rise to different random processes. We investigate relation between such processes. We show that in a large class of measures, it is possible to couple (redefine on a new probability space) every two processes so that they are almost surely close to each other, with explicit estimates of "closeness". The purpose of this work is to close a gap in the proof of the almost sure invariance principle for nonuniformly hyperbolic transformations by Melbourne and Nicol.
Space Situational Awareness of Large Numbers of Payloads From a Single Deployment
NASA Astrophysics Data System (ADS)
Segerman, A.; Byers, J.; Emmert, J.; Nicholas, A.
2014-09-01
The nearly simultaneous deployment of a large number of payloads from a single vehicle presents a new challenge for space object catalog maintenance and space situational awareness (SSA). Following two cubesat deployments last November, it took five weeks to catalog the resulting 64 orbits. The upcoming Kicksat mission will present an even greater SSA challenge, with its deployment of 128 chip-sized picosats. Although all of these deployments are in short-lived orbits, future deployments will inevitably occur at higher altitudes, with a longer term threat of collision with active spacecraft. With such deployments, individual scientific payload operators require rapid precise knowledge of their satellites' locations. Following the first November launch, the cataloguing did not initially associate a payload with each orbit, leaving this to the satellite operators. For short duration missions, the time required to identify an experiment's specific orbit may easily be a large fraction of the spacecraft's lifetime. For a Kicksat-type deployment, present tracking cannot collect enough observations to catalog each small object. The current approach is to treat the chip cloud as a single catalog object. However, the cloud dissipates into multiple subclouds and, ultimately, tiny groups of untrackable chips. One response to this challenge may be to mandate installation of a transponder on each spacecraft. Directional transponder transmission detections could be used as angle observations for orbit cataloguing. Of course, such an approach would only be employable with cooperative spacecraft. In other cases, a probabilistic association approach may be useful, with the goal being to establish the probability of an element being at a given point in space. This would permit more reliable assessment of the probability of collision of active spacecraft with any cloud element. This paper surveys the cataloguing challenges presented by large scale deployments of small spacecraft, examining current methods. Potential new approaches are discussed, including simulations to evaluate their utility. Acknowledgement: This work was supported by the Office of the Assistant Secretary of Defense for R&E, via the Data-to-Decisions program.
A novel method for correcting scanline-observational bias of discontinuity orientation
Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong
2016-01-01
Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (also for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, the observed sample size is suggested through effectiveness tests for different distribution types, dispersions and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249
Generalized probabilistic scale space for image restoration.
Wong, Alexander; Mishra, Akshaya K
2010-10-01
A novel generalized sampling-based probabilistic scale space theory is proposed for image restoration. We explore extending the definition of scale space to better account for both noise and observation models, which is important for producing accurately restored images. A new class of scale-space realizations based on sampling and probability theory is introduced to realize this extended definition in the context of image restoration. Experimental results using 2-D images show that generalized sampling-based probabilistic scale-space theory can be used to produce more accurate restored images when compared with state-of-the-art scale-space formulations, particularly under situations characterized by low signal-to-noise ratios and image degradation.
Teaching Probabilities and Statistics to Preschool Children
ERIC Educational Resources Information Center
Pange, Jenny
2003-01-01
This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…
NASA Astrophysics Data System (ADS)
Smith, L. A.
2007-12-01
We question the relevance of climate-model-based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically, we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless, the space and time scales on which they provide decision-support-relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False", making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science-based decision support, as our models improve, requires a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?
NASA Astrophysics Data System (ADS)
Gurevich, Boris M.; Tempel'man, Arcady A.
2010-05-01
For a dynamical system \\tau with 'time' \\mathbb Z^d and compact phase space X, we introduce three subsets of the space \\mathbb R^m related to a continuous function f\\colon X\\to\\mathbb R^m: the set of time means of f and two sets of space means of f, namely those corresponding to all \\tau-invariant probability measures and those corresponding to some equilibrium measures on X. The main results concern topological properties of these sets of means and their mutual position. Bibliography: 18 titles.
Bangalore, Sripal; Gopinath, Devi; Yao, Siu-Sun; Chaudhry, Farooq A
2007-03-01
We sought to evaluate the risk stratification ability and incremental prognostic value of stress echocardiography over historic, clinical, and stress electrocardiographic (ECG) variables across a wide spectrum of Bayesian pretest probabilities of coronary artery disease (CAD). Stress echocardiography is an established technique for the diagnosis of CAD; however, data on its incremental prognostic value over historic, clinical, and stress ECG variables in patients with known or suspected CAD are limited. We evaluated 3259 patients (60 +/- 13 years, 48% men) undergoing stress echocardiography. Patients were grouped into low (<15%), intermediate (15-85%), and high (>85%) pretest CAD likelihood subgroups using standard software. The historic, clinical, stress ECG, and stress echocardiographic variables were recorded for the entire cohort. Follow-up (2.7 +/- 1.1 years) for confirmed myocardial infarction (n = 66) and cardiac death (n = 105) was obtained. For the entire cohort, an ischemic stress echocardiography study confers a 5.0-fold higher cardiac event rate than a normal study (4.0% vs 0.8%/y, P < .0001). Furthermore, a Cox proportional hazards regression model showed incremental prognostic value of stress echocardiography variables over historic, clinical, and stress ECG variables across all pretest probability subgroups (global χ² increased from 5.1 to 8.5 to 20.1 in the low pretest group, P = .44 and P = .01; from 20.9 to 28.2 to 116 in the intermediate pretest group, P = .47 and P < .0001; and from 17.5 to 36.6 to 61.4 in the high pretest group, P < .0001 for both). A normal stress echocardiogram portends a benign prognosis (<1% event rate/y) in all pretest probability subgroups, even in patients with high pretest probability, and stress echocardiography yields incremental prognostic value over historic, clinical, and stress ECG variables across all pretest probability subgroups. The greatest incremental value is, however, in the intermediate pretest probability subgroup.
Assessing flight safety differences between the United States regional and major airlines
NASA Astrophysics Data System (ADS)
Sharp, Broderick H.
During 2008, U.S. domestic airline departures exceeded 28,000 flights per day. Thirty-nine, or less than 0.2 of 1%, of these flights resulted in operational incidents or accidents. However, even a low percentage of airline accidents and incidents continues to cause human suffering and property loss. The purpose of this study was to compare the safety histories of U.S. major and regional airlines. The study spans safety events from January 1982 through December 2008. In this quantitative analysis, domestic major and regional airlines were statistically tested for their flight safety differences. Four major airlines and thirty-seven regional airlines qualified for the safety study, which compared the airline groups' fatal accidents, incidents, non-fatal accidents, pilot errors, and the remaining six safety-event probable-cause types. The six other probable-cause types are mechanical failure, weather, air traffic control, maintenance, other, and unknown causes. The National Transportation Safety Board investigated each airline safety event and assigned a probable cause to each event. A sample of 500 events was randomly selected from the population of 1,391 airline accidents and incidents. The airline groups' safety event probabilities were estimated using least squares linear regression. A significance level of 5% was chosen to decide each research question's hypothesis. The fatal accident and incident significance levels were 1.2% and 0.05%, respectively. These two research questions did not reach the 5% significance threshold; therefore, the fatal accident and non-destructive incident probabilities favored the hypothesis that the airline groups' safety differs. The linear regression estimates for the remaining three research questions were 71.5% for non-fatal accidents, 21.8% for pilot errors, and a 7.4% significance level for the six probable causes. These significance levels are greater than 5%; consequently, these three research questions favored the hypothesis that the airline groups' safety is similar. The study indicates the U.S. domestic major airlines were safer than the regional airlines. Potential avenues for airline safety progress include examining pilot fatigue, the airline groups' hiring policies, the government's airline oversight personnel, and comparisons of individual airlines' operational policies.
Waterborne disease outbreak detection: an integrated approach using health administrative databases.
Coly, S; Vincent, N; Vaissiere, E; Charras-Garrido, M; Gallay, A; Ducrot, C; Mouly, D
2017-08-01
Hundreds of waterborne disease outbreaks (WBDO) of acute gastroenteritis (AGI) due to contaminated tap water are reported in developed countries each year. Such outbreaks are probably under-detected. The aim of our study was to develop an integrated approach to detect and study clusters of AGI in geographical areas with homogeneous exposure to drinking water. Data for the number of AGI cases are available at the municipality level while exposure to tap water depends on drinking water networks (DWN). These two geographical units do not systematically overlap. This study proposed to develop an algorithm which would match the most relevant grouping of municipalities with a specific DWN, in order that tap water exposure can be taken into account when investigating future disease outbreaks. A space-time detection method was applied to the grouping of municipalities. Seven hundred and fourteen new geographical areas (groupings of municipalities) were obtained compared with the 1,310 municipalities and the 1,706 DWN. Eleven potential WBDO were identified in these groupings of municipalities. For ten of them, additional environmental investigations identified at least one event that could have caused microbiological contamination of DWN in the days previous to the occurrence of a reported WBDO.
Özarslan, Evren; Koay, Cheng Guan; Shepherd, Timothy M; Komlosh, Michal E; İrfanoğlu, M Okan; Pierpaoli, Carlo; Basser, Peter J
2013-09-01
Diffusion-weighted magnetic resonance (MR) signals reflect information about underlying tissue microstructure and cytoarchitecture. We propose a quantitative, efficient, and robust mathematical and physical framework for representing diffusion-weighted MR imaging (MRI) data obtained in "q-space," and the corresponding "mean apparent propagator (MAP)" describing molecular displacements in "r-space." We also define and map novel quantitative descriptors of diffusion that can be computed robustly using this MAP-MRI framework. We describe efficient analytical representation of the three-dimensional q-space MR signal in a series expansion of basis functions that accurately describes diffusion in many complex geometries. The lowest order term in this expansion contains a diffusion tensor that characterizes the Gaussian displacement distribution, equivalent to diffusion tensor MRI (DTI). Inclusion of higher order terms enables the reconstruction of the true average propagator whose projection onto the unit "displacement" sphere provides an orientational distribution function (ODF) that contains only the orientational dependence of the diffusion process. The representation characterizes novel features of diffusion anisotropy and the non-Gaussian character of the three-dimensional diffusion process. Other important measures this representation provides include the return-to-the-origin probability (RTOP) and its variants for diffusion in one and two dimensions: the return-to-the-plane probability (RTPP) and the return-to-the-axis probability (RTAP), respectively. These zero-net-displacement probabilities measure the mean compartment (pore) volume and cross-sectional area in distributions of isolated pores irrespective of the pore shape. MAP-MRI represents a new comprehensive framework to model the three-dimensional q-space signal and transform it into diffusion propagators. Experiments on an excised marmoset brain specimen demonstrate that MAP-MRI provides several novel, quantifiable parameters that capture previously obscured intrinsic features of nervous tissue microstructure. This should prove helpful for investigating the functional organization of normal and pathologic nervous tissue. Copyright © 2013 Elsevier Inc. All rights reserved.
A Financial Market Model Incorporating Herd Behaviour
2016-01-01
Herd behaviour in financial markets is a recurring phenomenon that exacerbates asset price volatility, and is considered a possible contributor to market fragility. While numerous studies investigate herd behaviour in financial markets, it is often considered without reference to the pricing of financial instruments or other market dynamics. Here, a trader interaction model based upon informational cascades in the presence of information thresholds is used to construct a new model of asset price returns that allows for both quiescent and herd-like regimes. Agent interaction is modelled using a stochastic pulse-coupled network, parametrised by information thresholds and a network coupling probability. Agents may possess either one or two information thresholds that, in each case, determine the number of distinct states an agent may occupy before trading takes place. In the case where agents possess two thresholds (labelled as the finite state-space model, corresponding to agents’ accumulating information over a bounded state-space), and where coupling strength is maximal, an asymptotic expression for the cascade-size probability is derived and shown to follow a power law when a critical value of network coupling probability is attained. For a range of model parameters, a mixture of negative binomial distributions is used to approximate the cascade-size distribution. This approximation is subsequently used to express the volatility of model price returns in terms of the model parameter which controls the network coupling probability. In the case where agents possess a single pulse-coupling threshold (labelled as the semi-infinite state-space model corresponding to agents’ accumulating information over an unbounded state-space), numerical evidence is presented that demonstrates volatility clustering and long-memory patterns in the volatility of asset returns. Finally, output from the model is compared to both the distribution of historical stock returns and the market price of an equity index option. PMID:27007236
Miladinovic, Branko; Kumar, Ambuj; Mhaskar, Rahul; Djulbegovic, Benjamin
2014-10-21
Our objective was to understand how often 'breakthroughs,' that is, treatments that significantly improve health outcomes, can be developed. We applied weighted adaptive kernel density estimation to construct the probability density function for observed treatment effects from five publicly funded cohorts and one privately funded group. 820 trials involving 1064 comparisons and enrolling 331,004 patients were conducted by five publicly funded cooperative groups. 40 cancer trials involving 50 comparisons and enrolling a total of 19,889 patients were conducted by GlaxoSmithKline. We calculated that the probability of detecting a treatment with large effects is 10% (5-25%), and that the probability of detecting a treatment with very large treatment effects is 2% (0.3-10%). Researchers themselves judged that they discovered a new, breakthrough intervention in 16% of trials. We propose these figures as the benchmarks against which future development of 'breakthrough' treatments should be measured. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
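A minimal sketch of a weighted Gaussian kernel density estimate, and of reading off a tail probability from it, is shown below; the effect sizes, weights, bandwidth, and "large effect" threshold are all invented stand-ins, since the paper's adaptive bandwidth selection and actual trial data are not reproduced here.

```python
import numpy as np

def weighted_kde(samples, weights, grid, bandwidth):
    """Weighted Gaussian kernel density estimate evaluated on `grid`."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    z = (grid[:, None] - np.asarray(samples)[None, :]) / bandwidth
    kernels = np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)
    return (kernels * w[None, :]).sum(axis=1) / bandwidth

# Invented stand-ins: trial effects as log hazard ratios, weighted by an
# arbitrary precision proxy.
rng = np.random.default_rng(3)
effects = rng.normal(loc=-0.05, scale=0.15, size=860)
weights = rng.uniform(0.5, 2.0, size=860)
grid = np.linspace(-1.0, 1.0, 2001)
density = weighted_kde(effects, weights, grid, bandwidth=0.05)

mask = grid < -0.5                      # assumed 'large effect' threshold
print(f"P(large effect) ~ {np.trapz(density[mask], grid[mask]):.3f}")
```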
ERIC Educational Resources Information Center
Day, Roger P.; And Others
A quasi-experimental design with two experimental groups and one control group was used to evaluate the use of two books in the Quantitative Literacy Series, "Exploring Data" and "Exploring Probability." Group X teachers were those who had attended a workshop on the use of the materials and were using the materials during the…
Maximum likelihood estimation for the double-count method with independent observers
Manly, Bryan F.J.; McDonald, Lyman L.; Garner, Gerald W.
1996-01-01
Data collected under a double-count protocol during line transect surveys were analyzed using new maximum likelihood methods combined with Akaike's information criterion to provide estimates of the abundance of polar bear (Ursus maritimus Phipps) in a pilot study off the coast of Alaska. Visibility biases were corrected by modeling the detection probabilities using logistic regression functions. Independent variables that influenced the detection probabilities included perpendicular distance of bear groups from the flight line and the number of individuals in the groups. A series of models were considered which vary from (1) the simplest, where the probability of detection was the same for both observers and was not affected by either distance from the flight line or group size, to (2) models where probability of detection is different for the two observers and depends on both distance from the transect and group size. Estimation procedures are developed for the case when additional variables may affect detection probabilities. The methods are illustrated using data from the pilot polar bear survey and some recommendations are given for design of a survey over the larger Chukchi Sea between Russia and the United States.
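The structure of such a model is easy to sketch: give each observer a logistic detection probability in the covariates and maximize the likelihood of the observed capture histories, conditional on detection by at least one observer. Everything below (covariates, coefficients, sample size) is simulated for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, X, seen1, seen2):
    """Two-observer likelihood, conditional on detection by at least one
    observer; each observer has a logistic model in the covariates."""
    p1 = np.clip(expit(X @ params[:3]), 1e-9, 1 - 1e-9)
    p2 = np.clip(expit(X @ params[3:]), 1e-9, 1 - 1e-9)
    p_any = 1.0 - (1.0 - p1) * (1.0 - p2)
    ll = (seen1 * np.log(p1) + (1 - seen1) * np.log(1 - p1)
          + seen2 * np.log(p2) + (1 - seen2) * np.log(1 - p2)
          - np.log(p_any))
    return -ll.sum()

# Simulated survey; covariates and coefficients are assumptions.
rng = np.random.default_rng(11)
m = 400
dist = rng.uniform(0.0, 2.0, m)            # perpendicular distance (km)
size = rng.poisson(2, m) + 1               # bears per group
X = np.column_stack([np.ones(m), dist, size])
s1 = rng.random(m) < expit(X @ np.array([1.5, -1.2, 0.4]))
s2 = rng.random(m) < expit(X @ np.array([1.0, -1.0, 0.5]))
det = s1 | s2                              # groups seen at least once
res = minimize(neg_log_lik, np.zeros(6),
               args=(X[det], s1[det].astype(float), s2[det].astype(float)),
               method="BFGS")
print("estimated (b1, b2):", res.x.round(2))
```

Abundance then follows by Horvitz-Thompson weighting each detected group by 1/p_any, which is the step the visibility-bias correction feeds.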
The 2013–2016 induced earthquakes in Harper and Sumner Counties, southern Kansas
Rubinstein, Justin L.; Ellsworth, William L.; Dougherty, Sara L.
2018-01-01
We examine the first four years (2013–2016) of the ongoing seismicity in southern Kansas using high‐precision locations derived from a local seismometer network. The earthquakes occur almost exclusively in the shallow crystalline basement, below the wastewater injection horizon of the Arbuckle Group at the base of the sedimentary section. Multiple lines of evidence lead us to conclude that disposal of wastewater from the production of oil and gas by deep injection is the probable cause for the surge of seismicity that began in 2013. First, the seismicity correlates in space and time with the injection. We observe increases in seismicity subsequent to increases in injection and decreases in seismicity in response to decreases in injection. Second, the earthquake‐rate change is statistically improbable to be of natural origin. From 1974 through the time of the injection increase in 2012, no ML 4 or larger earthquakes occurred in the study area, while six occurred between 2012 and 2016. The probability of this rate change occurring randomly is ∼0.16%. Third, the other potential industrial drivers of seismicity (hydraulic fracturing and oil production) do not correlate in space or time with seismicity. Local geological conditions are important in determining whether injection operations will induce seismicity, as shown by the absence of seismicity near the largest injection operations in the southwest portion of our study area. In addition to local operations, the presence of seismicity 10+ km from large injection wells indicates that regional injection operations also need to be considered to understand the effects of injection on seismicity.
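The rate-change argument can be illustrated with a simple Poisson calculation (a hedged sketch only; the paper's ∼0.16% figure comes from its own statistical test, and the counts below are paraphrased from the abstract):

```python
from scipy.stats import poisson

# Counts paraphrased from the abstract: six ML >= 4 events in 2012-2016,
# none in 1974-2012. For illustration, assume a uniform Poisson process
# over the full record and ask how surprising the clustering is.
events_total, years_total, years_window = 6, 43.0, 4.5
rate = events_total / years_total                  # events/yr if uniform
p_cluster = poisson.sf(5, mu=rate * years_window)  # P(>= 6 events in window)
print(f"P(>=6 events in the last {years_window} yr | uniform rate) = {p_cluster:.2e}")
```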
There Once Was a 9-Block ...--A Middle-School Design for Probability and Statistics
ERIC Educational Resources Information Center
Abrahamson, Dor; Janusz, Ruth M.; Wilensky, Uri
2006-01-01
ProbLab is a probability-and-statistics unit developed at the Center for Connected Learning and Computer-Based Modeling, Northwestern University. Students analyze the combinatorial space of the 9-block, a 3-by-3 grid of squares, in which each square can be either green or blue. All 512 possible 9-blocks are constructed and assembled in a "bar…
NASA Technical Reports Server (NTRS)
Lambert, Winifred; Wheeler, Mark
2004-01-01
The 45th Weather Squadron (45 WS) forecasters at Cape Canaveral Air Force Station (CCAFS) in Florida include a probability of thunderstorm occurrence in their daily morning briefings. This information is used by personnel involved in determining the possibility of violating Launch Commit Criteria, evaluating Flight Rules for the Space Shuttle, and daily planning for ground operation activities on Kennedy Space Center (KSC)/CCAFS. Much of the current lightning probability forecast is based on a subjective analysis of model and observational data. The forecasters requested that a lightning probability forecast tool based on statistical analysis of historical warm-season (May - September) data be developed in order to increase the objectivity of the daily thunderstorm probability forecast. The tool is a set of statistical lightning forecast equations that provide a lightning occurrence probability for the day by 1100 UTC (0700 EDT) during the warm season. This study used 15 years (1989-2003) of warm season data to develop the objective forecast equations. The local CCAFS 1000 UTC sounding was used to calculate stability parameters for equation predictors. The Cloud-to-Ground Lightning Surveillance System (CGLSS) data were used to determine lightning occurrence for each day. Local informal analyses have found the CGLSS data to be a more reliable indicator of lightning in the area than surface observations. This work was based on the results from two earlier research projects. Everitt (1999) used surface observations and rawinsonde data to develop logistic regression equations that forecast the daily thunderstorm probability at CCAFS. The Everitt (1999) equations showed an improvement in skill over the Neumann-Pfeffer thunderstorm index (Neumann 1971), which uses multiple linear regression, and also persistence and climatology forecasts. Lericos et al. (2002) developed lightning distributions over the Florida peninsula based on specific flow regimes. The flow regimes were inferred from the average wind direction in the 1000-700 mb layer at Miami (MIA), Tampa (TBW), and Jacksonville (JAX), Florida, and the lightning data were from the National Lightning Detection Network. The results suggested that the daily flow regime may be an important predictor of lightning occurrence on KSC/CCAFS.
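A logistic forecast equation of the kind described has the generic form p = 1/(1+exp(-η)), with η a linear combination of sounding-derived predictors. The sketch below is purely illustrative; the coefficients and predictor set are hypothetical, not the 45 WS equations:

```python
import math

def lightning_probability(k_index, persistence, flow_regime_clim):
    """Hedged example of a daily logistic lightning-probability equation.

    Predictors: a stability index from the 1000 UTC sounding, yesterday's
    lightning occurrence (0/1), and the climatological lightning frequency
    of the day's flow regime. Coefficients are invented.
    """
    eta = -4.0 + 0.12 * k_index + 1.1 * persistence + 2.5 * flow_regime_clim
    return 1.0 / (1.0 + math.exp(-eta))

print(f"{lightning_probability(30.0, 1, 0.45):.2f}")
```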
Development of a funding, cost, and spending model for satellite projects
NASA Technical Reports Server (NTRS)
Johnson, Jesse P.
1989-01-01
The need for a predictive budget/funding model is obvious. The current models used by the Resource Analysis Office (RAO) are used to predict the total costs of satellite projects. An effort to extend the modeling capabilities from total budget analysis to total budget and budget outlays over time analysis was conducted. A statistics-based, data-driven methodology was used to derive and develop the model. The budget data for the last 18 GSFC-sponsored satellite projects were analyzed and used to build a funding model which would describe the historical spending patterns. This raw data consisted of dollars spent in that specific year and their 1989 dollar equivalent. This data was converted to the standard format used by the RAO group and placed in a database. A simple statistical analysis was performed to calculate the gross statistics associated with project length and project cost and the conditional statistics on project length and project cost. The modeling approach used is derived from the theory of embedded statistics, which states that properly analyzed data will produce the underlying generating function. The process of funding large scale projects over extended periods of time is described by Life Cycle Cost Models (LCCM). The data was analyzed to find a model in the generic form of a LCCM. The model developed is based on a Weibull function whose parameters are found by both nonlinear optimization and nonlinear regression. In order to use this model it is necessary to transform the problem from a dollar/time space to a percentage of total budget/time space. This transformation is equivalent to moving to a probability space. By using the basic rules of probability, the validity of both the optimization and the regression steps is ensured. This statistically significant model is then integrated and inverted. The resulting output represents a project schedule which relates the amount of money spent to the percentage of project completion.
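A minimal sketch of the Weibull life-cycle-cost idea, assuming the cumulative budget fraction follows a Weibull CDF fitted by nonlinear regression (the spending profile below is invented, not the 18-project GSFC data):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_cdf(t, lam, k):
    """Cumulative fraction of total budget spent by normalized time t."""
    return 1.0 - np.exp(-(t / lam) ** k)

# Hypothetical profile: normalized project time vs fraction of budget spent.
t = np.linspace(0.1, 1.0, 10)
spent = np.array([0.02, 0.07, 0.16, 0.28, 0.43, 0.58, 0.72, 0.84, 0.93, 0.98])

(lam, k), _ = curve_fit(weibull_cdf, t, spent, p0=[0.6, 2.0])
print(f"lambda={lam:.3f}, k={k:.3f}")

# Inverting the fitted CDF gives the "schedule" view mentioned above:
frac = 0.5
t_half = lam * (-np.log(1 - frac)) ** (1 / k)
print(f"50% of budget spent at t={t_half:.2f} of project duration")
```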
NASA Technical Reports Server (NTRS)
Rodgers, E. B.; Seale, D. B.; Boraas, M. E.; Sommer, C. V.
1989-01-01
The probable sources and implications of microbial contamination on the proposed Space Station are discussed. Because of the limited availability of material, facilities and time on the Space Station, we are exploring the feasibility of replacing traditional incubation methods for assessing microbial contamination with rapid, automated methods. Some possibilities include: ATP measurement, microscopy and telecommunications, and molecular techniques such as DNA probes or monoclonal antibodies. Some of the important ecological factors that could alter microbes in space include microgravity, exposure to radiation, and antibiotic resistance.
A Deterministic Approach to Active Debris Removal Target Selection
NASA Astrophysics Data System (ADS)
Lidtke, A.; Lewis, H.; Armellin, R.
2014-09-01
As concerns about the sustainability of spaceflight increase, many decisions with widespread economic, political and legal consequences are being considered based on space debris simulations showing that Active Debris Removal (ADR) may be necessary. The debris environment predictions are based on low-accuracy ephemerides and propagators. This raises doubts about the accuracy of those prognoses themselves, but also about the potential ADR target-lists that are produced. Target selection is considered highly important, as removal of many objects will increase the overall mission cost. Selecting the most likely candidates as soon as possible would be desirable, as it would enable accurate mission design and allow thorough evaluation of in-orbit validations, which are likely to occur in the near future, before any large investments are made and implementations realized. One of the primary factors that should be used in ADR target selection is the accumulated collision probability of every object. A conjunction detection algorithm, based on the smart sieve method, has been developed. Another algorithm is then applied to the found conjunctions to compute the maximum and true probabilities of collisions taking place. The entire framework has been verified against the Conjunction Analysis Tools in AGI's Systems Toolkit, and a relative probability error smaller than 1.5% has been achieved in the final maximum collision probability. Two target-lists are produced based on the ranking of the objects according to the probability that they will take part in any collision over the simulated time window. These probabilities are computed using the maximum probability approach, which is time-invariant, and estimates of the true collision probability that were computed with covariance information. The top-priority targets are compared, and the impacts of the data accuracy and its decay are highlighted. General conclusions regarding the importance of Space Surveillance and Tracking for the purpose of ADR are also drawn, and a deterministic method for ADR target selection, which could reduce the number of ADR missions to be performed, is proposed.
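The accumulation step can be sketched directly: if an object has per-conjunction collision probabilities p_i over the window, the probability that it takes part in at least one collision is 1-∏(1-p_i), and objects are ranked by that value. A hedged toy example (object IDs and probabilities invented):

```python
import numpy as np

# Hypothetical per-conjunction maximum collision probabilities per object.
conjunctions = {
    "OBJ-A": [1e-4, 5e-5, 2e-4],
    "OBJ-B": [8e-4],
    "OBJ-C": [1e-5, 1e-5, 3e-5, 2e-5],
}

def accumulated_probability(p_list):
    """P(object takes part in at least one collision over the window)."""
    return 1.0 - np.prod([1.0 - p for p in p_list])

ranking = sorted(conjunctions,
                 key=lambda o: accumulated_probability(conjunctions[o]),
                 reverse=True)
for obj in ranking:
    print(obj, f"{accumulated_probability(conjunctions[obj]):.2e}")
```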
NASA Astrophysics Data System (ADS)
Gao, Haixia; Li, Ting; Xiao, Changming
2016-05-01
When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state, passing through a series of intermediate nonequilibrium states. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution over these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The state of largest probability is the equilibrium state, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability; the same conclusion also holds in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can persist, then the velocity with which a nonequilibrium state returns to equilibrium can be determined through the reciprocal of the derivative of this probability. This shows that the farther the state is from equilibrium, the faster the returning velocity; as the system nears its equilibrium state, the velocity becomes smaller and smaller, finally tending to 0 when the system reaches equilibrium.
NASA Astrophysics Data System (ADS)
Regnier, David; Lacroix, Denis; Scamps, Guillaume; Hashimoto, Yukio
2018-03-01
In a mean-field description of superfluidity, particle number and gauge angle are treated as quasiclassical conjugate variables. This level of description was recently used to describe nuclear reactions around the Coulomb barrier. Important effects of the relative gauge angle between two identical superfluid nuclei (symmetric collisions) on transfer probabilities and fusion barriers have been uncovered. A theory making contact with experiments should at least average over different initial relative gauge angles. In the present work, we propose a new approach to obtain the multiple pair-transfer probabilities between superfluid systems. This method, called the phase-space combinatorial (PSC) technique, relies on both phase-space averaging and combinatorial arguments to infer the full pair-transfer probability distribution at the cost of multiple mean-field calculations only. After benchmarking this approach in a schematic model, we apply it to the collision 20O+20O at various energies below the Coulomb barrier. The predictions for one-pair transfer are similar to results obtained with an approximate projection method, whereas significant differences are found for two-pair transfer. Finally, we investigated the applicability of the PSC method to the contact between nonidentical superfluid systems. A generalization of the method is proposed and applied to the schematic model, showing that the pair-transfer probabilities are reasonably reproduced. The applicability of the PSC method to asymmetric nuclear collisions is investigated for the 14O+20O collision, and it turns out that unrealistically small single- and multiple-pair-transfer probabilities are obtained. This is explained by the fact that the relative gauge angle plays a minor role in the particle transfer process in this case, compared to other mechanisms such as equilibration of the charge/mass ratio. We conclude that the best ground for probing gauge-angle effects in nuclear reactions and/or for applying the proposed PSC approach to pair transfer is collisions of identical open-shell spherical nuclei.
The Efficacy of Using Diagrams When Solving Probability Word Problems in College
ERIC Educational Resources Information Center
Beitzel, Brian D.; Staley, Richard K.
2015-01-01
Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
First laser measurements to space debris in Poland
NASA Astrophysics Data System (ADS)
Lejba, Paweł; Suchodolski, Tomasz; Michałek, Piotr; Bartoszak, Jacek; Schillak, Stanisław; Zapaśnik, Stanisław
2018-05-01
The Borowiec Satellite Laser Ranging station (BORL 7811, Borowiec), part of the Space Research Centre of the Polish Academy of Sciences (SRC PAS), went through modernization in 2014-2015. One of the main tasks of the modernization was the installation of a high-energy laser module dedicated to space debris tracking. Surelite III by Continuum is a Nd:YAG pulse laser with a 10 Hz repetition rate, a pulse width of 3-5 ns and a pulse energy of 450 mJ for green (532 nm). This new laser unit was integrated with the SLR system at Borowiec performing standard satellite tracking. In 2016 BORL 7811 participated actively in the observational campaigns for space debris targets in the LEO region managed by the Space Debris Study Group (SDSG) of the International Laser Ranging Service (ILRS). Currently, the Borowiec station regularly tracks 36 space debris objects in the LEO regime, including typical rocket bodies (Russian/Chinese) and cooperative targets like the inactive TOPEX/Poseidon, ENVISAT, OICETS and others. In this paper the first results of space debris laser measurements obtained by the Borowiec station in the period August 2016 - January 2017 are presented. The results obtained by the SRC PAS Borowiec station confirm the rotation of the defunct TOPEX/Poseidon satellite, which spins with a period of approximately 10 s. The novelty of this work is the presentation of sample results for the Chinese CZ-2C R/B target (NORAD catalogue number 31114), which is (probably) equipped with retroreflectors. Laser ranging to space debris will be a highly desirable capability in the coming years, especially in the context of Space Surveillance and Tracking (SST) activity. Some targets are very easy to track, like the defunct ENVISAT or TOPEX/Poseidon. On the other hand, there is a large population of LEO targets with diverse orbital and physical parameters that are challenging for laser ranging, such as small irregular debris and rocket boosters.
Quantum mechanics: The Bayesian theory generalized to the space of Hermitian matrices
NASA Astrophysics Data System (ADS)
Benavoli, Alessio; Facchini, Alessandro; Zaffalon, Marco
2016-10-01
We consider the problem of gambling on a quantum experiment and enforce rational behavior by a few rules. These rules yield, in the classical case, the Bayesian theory of probability via duality theorems. In our quantum setting, they yield the Bayesian theory generalized to the space of Hermitian matrices. This very theory is quantum mechanics: in fact, we derive all four of its postulates from the generalized Bayesian theory. This implies that quantum mechanics is self-consistent. It also leads us to reinterpret the main operations in quantum mechanics as probability rules: Bayes' rule (measurement), marginalization (partial tracing), independence (tensor product). To say it with a slogan, we obtain that quantum mechanics is the Bayesian theory in the complex numbers.
The extension of the thermal-vacuum test optimization program to multiple flights
NASA Technical Reports Server (NTRS)
Williams, R. E.; Byrd, J.
1981-01-01
The thermal vacuum test optimization model, originally developed to optimize a test program based on predicted flight performance with a single flight option in mind, is extended to consider reflight, as in Space Shuttle missions. The concept of 'utility', developed under the name of 'availability', is used to follow performance through the various options encountered when the reflight and retrievability capabilities of the Space Shuttle are available. Also, a 'lost value' model is modified to produce a measure of the probability of a mission's success in achieving a desired utility using a minimal-cost test strategy. The resulting matrix of probabilities and their associated costs provides a means for project management to evaluate various test and reflight strategies.
ERIC Educational Resources Information Center
Lincoln, Don
2013-01-01
They say that there is no such thing as a stupid question. In a pedagogically pure sense, that's probably true. But some questions do seem to flirt dangerously close to being really quite ridiculous. One such question might well be, "How many dimensions of space are there?" I mean, it's pretty obvious that there are three:…
Contaminant ions and waves in the space station environment
NASA Technical Reports Server (NTRS)
Murphy, G. B.
1988-01-01
The probable plasma (ions and electrons) and plasma wave environment that will exist in the vicinity of the Space Station and how this environment may affect the operation of proposed experiments are discussed. Differences between quiescent operational periods and non-operational periods are also addressed. Areas which need further work are identified and a course of action suggested.
Campus Officials Seek Building Efficiencies, One Square Foot at a Time
ERIC Educational Resources Information Center
Carlson, Scott
2009-01-01
Space is a serious, expensive business on college campuses. Following a decade-long building boom, a crippling recession, a spike in energy prices (with further increases probable), and in some regions fierce competition for a shrinking pool of students, the stakes of managing campus space have never been higher. Students, it is often assumed,…
Financial issues for commercial space ventures: Paying for the dreams
NASA Technical Reports Server (NTRS)
Egan, J. J.
1984-01-01
Various financial issues involved in commercial space enterprise are discussed. Particular emphasis is placed on the materials processing area: the current state of business plan and financial developments, what is needed for enhanced probability of success of future materials development efforts in attracting financial backing, and finally, the risks involved in this entire business area.
NASA Astrophysics Data System (ADS)
Bialas, A.
2006-04-01
A method to estimate moments of the phase-space density from event-by-event fluctuations is reviewed and its accuracy analyzed. The relation of these measurements to the determination of the entropy of the system is discussed. This is a summary of results obtained recently together with W. Czyz and K. Zalewski.
ERIC Educational Resources Information Center
Abrahamson, Dor
2006-01-01
This snapshot introduces a computer-based representation and activity that enables students to simultaneously "see" the combinatorial space of a stochastic device (e.g., dice, spinner, coins) and its outcome distribution. The author argues that the "ambiguous" representation fosters student insight into probability. [Snapshots are subject to peer…
Roehle, Robert; Wieske, Viktoria; Schuetz, Georg M; Gueret, Pascal; Andreini, Daniele; Meijboom, Willem Bob; Pontone, Gianluca; Garcia, Mario; Alkadhi, Hatem; Honoris, Lily; Hausleiter, Jörg; Bettencourt, Nuno; Zimmermann, Elke; Leschka, Sebastian; Gerber, Bernhard; Rochitte, Carlos; Schoepf, U Joseph; Shabestari, Abbas Arjmand; Nørgaard, Bjarne; Sato, Akira; Knuuti, Juhani; Meijs, Matthijs F L; Brodoefel, Harald; Jenkins, Shona M M; Øvrehus, Kristian Altern; Diederichsen, Axel Cosmus Pyndt; Hamdan, Ashraf; Halvorsen, Bjørn Arild; Mendoza Rodriguez, Vladimir; Wan, Yung Liang; Rixe, Johannes; Sheikh, Mehraj; Langer, Christoph; Ghostine, Said; Martuscelli, Eugenio; Niinuma, Hiroyuki; Scholte, Arthur; Nikolaou, Konstantin; Ulimoen, Geir; Zhang, Zhaoqi; Mickley, Hans; Nieman, Koen; Kaufmann, Philipp A; Buechel, Ronny Ralf; Herzog, Bernhard A; Clouse, Melvin; Halon, David A; Leipsic, Jonathan; Bush, David; Jakamy, Reda; Sun, Kai; Yang, Lin; Johnson, Thorsten; Laissy, Jean-Pierre; Marcus, Roy; Muraglia, Simone; Tardif, Jean-Claude; Chow, Benjamin; Paul, Narinder; Maintz, David; Hoe, John; de Roos, Albert; Haase, Robert; Laule, Michael; Schlattmann, Peter; Dewey, Marc
2018-03-19
To analyse the implementation, applicability and accuracy of the pretest probability calculation provided by NICE clinical guideline 95 for decision making about imaging in patients with chest pain of recent onset. The definitions for pretest probability calculation in the original Duke clinical score and the NICE guideline were compared. We also calculated the agreement and disagreement in pretest probability and the resulting imaging and management groups based on individual patient data from the Collaborative Meta-Analysis of Cardiac CT (CoMe-CCT). 4,673 individual patient data from the CoMe-CCT Consortium were analysed. Major differences in definitions in the Duke clinical score and NICE guideline were found for the predictors age and number of risk factors. Pretest probability calculation using guideline criteria was only possible for 30.8 % (1,439/4,673) of patients despite availability of all required data due to ambiguity in guideline definitions for risk factors and age groups. Agreement regarding patient management groups was found in only 70 % (366/523) of patients in whom pretest probability calculation was possible according to both models. Our results suggest that pretest probability calculation for clinical decision making about cardiac imaging as implemented in the NICE clinical guideline for patients has relevant limitations. • Duke clinical score is not implemented correctly in NICE guideline 95. • Pretest probability assessment in NICE guideline 95 is impossible for most patients. • Improved clinical decision making requires accurate pretest probability calculation. • These refinements are essential for appropriate use of cardiac CT.
NASA Astrophysics Data System (ADS)
Privault, Nicolas
2016-05-01
We construct differential forms of all orders and a covariant derivative together with its adjoint on the probability space of a standard Poisson process, using derivation operators. In this framework we derive a de Rham-Hodge-Kodaira decomposition as well as Weitzenböck and Clark-Ocone formulas for random differential forms. As in the Wiener space setting, this construction provides two distinct approaches to the vanishing of harmonic differential forms.
Advanced planning activity. [for interplanetary flight and space exploration
NASA Technical Reports Server (NTRS)
1974-01-01
Selected mission concepts for interplanetary exploration through 1985 were examined, including: (1) Jupiter orbiter performance characteristics; (2) solar electric propulsion missions to Mercury, Venus, Neptune, and Uranus; (3) space shuttle planetary missions; (4) Pioneer entry probes to Saturn and Uranus; (5) rendezvous with Comet Kohoutek and Comet Encke; (6) space tug capabilities; and (7) a Pioneer mission to Mars in 1979. Mission options, limitations, and performance predictions are assessed, along with probable configurational, boost, and propulsion requirements.
Space Shuttle Debris Transport
NASA Technical Reports Server (NTRS)
Gomez, Reynaldo J., III
2010-01-01
This slide presentation reviews the assessment of debris damage to the Space Shuttle and the use of computation in Space Shuttle applications. The presentation reviews the sources of debris, a mechanism for determining the probability of damaging debris impacting the Shuttle, the tools used, the elimination of potentially damaging debris sources, the use of computation to assess damage while in flight, and a chart showing the applications that have been run on increasingly powerful computers to simulate the Shuttle and the debris transport.
Hubble Space Telescope Snapshot Search for Planetary Nebulae in Globular Clusters of the Local Group
NASA Astrophysics Data System (ADS)
Bond, Howard E.
2015-04-01
Single stars in ancient globular clusters (GCs) are believed incapable of producing planetary nebulae (PNs), because their post-asymptotic-giant-branch evolutionary timescales are slower than the dissipation timescales for PNs. Nevertheless, four PNs are known in Galactic GCs. Their existence likely requires more exotic evolutionary channels, including stellar mergers and common-envelope binary interactions. I carried out a snapshot imaging search with the Hubble Space Telescope (HST) for PNs in bright Local Group GCs outside the Milky Way. I used a filter covering the 5007 Å nebular emission line of [O iii], and another one in the nearby continuum, to image 66 GCs. Inclusion of archival HST frames brought the total number of extragalactic GCs imaged at 5007 Å to 75, whose total luminosity slightly exceeds that of the entire Galactic GC system. I found no convincing PNs in these clusters, aside from one PN in a young M31 cluster misclassified as a GC, and two PNs at such large angular separations from an M31 GC that membership is doubtful. In a ground-based spectroscopic survey of 274 old GCs in M31, Jacoby et al. found three candidate PNs. My HST images of one of them suggest that the [O iii] emission actually arises from ambient interstellar medium rather than a PN; for the other two candidates, there are broadband archival UV HST images that show bright, blue point sources that are probably the PNs. In a literature search, I also identified five further PN candidates lying near old GCs in M31, for which follow-up observations are necessary to confirm their membership. The rates of incidence of PNs are similar, and small but nonzero, throughout the GCs of the Local Group. Based on observations with the NASA/ESA Hubble Space Telescope obtained at the Space Telescope Science Institute, and from the data archive at STScI, which are operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.
Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell; Markley, F. Landis
2010-01-01
When facing a conjunction between space objects, decision makers must choose whether or not to maneuver for collision avoidance. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method, and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve desired missed detection rates, but the frequentist method's false alarm performance is inferior to the Bayesian method's.
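A minimal sketch of Wald's sequential probability ratio test as it might be applied here (thresholds and the simulated likelihood ratios are illustrative, not the paper's implementation):

```python
import numpy as np

def sprt(llr_increments, alpha=1e-3, beta=1e-3):
    """Wald's sequential probability ratio test.

    llr_increments: per-observation log-likelihood ratios of H1 (collision)
    versus H0 (safe miss distance). alpha/beta are target false-alarm and
    missed-detection rates; Wald's approximate thresholds are used.
    """
    upper = np.log((1 - beta) / alpha)   # accept H1 -> maneuver
    lower = np.log(beta / (1 - alpha))   # accept H0 -> no maneuver
    s = 0.0
    for n, inc in enumerate(llr_increments, start=1):
        s += inc
        if s >= upper:
            return "maneuver", n
        if s <= lower:
            return "no maneuver", n
    return "undecided", len(llr_increments)

# Fake tracking passes that, on average, support H1:
rng = np.random.default_rng(5)
print(sprt(rng.normal(0.5, 1.0, size=200)))
```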
Calculation of transmission probability by solving an eigenvalue problem
NASA Astrophysics Data System (ADS)
Bubin, Sergiy; Varga, Kálmán
2010-11-01
The electron transmission probability in nanodevices is calculated by solving an eigenvalue problem. The eigenvalues are the transmission probabilities and the number of nonzero eigenvalues is equal to the number of open quantum transmission eigenchannels. The number of open eigenchannels is typically a few dozen at most, thus the computational cost amounts to the calculation of a few outer eigenvalues of a complex Hermitian matrix (the transmission matrix). The method is implemented on a real space grid basis providing an alternative to localized atomic orbital based quantum transport calculations. Numerical examples are presented to illustrate the efficiency of the method.
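A hedged numerical sketch: given a complex Hermitian "transmission matrix" (here a random stand-in, since building it from the device Green's function is beyond this note), only a few outer eigenvalues are needed, one per open channel:

```python
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(3)
n = 400
# Stand-in complex Hermitian positive-semidefinite matrix; normalized so
# its eigenvalues lie in [0, 1], like transmission probabilities.
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
T = A @ A.conj().T
T /= np.linalg.norm(T, 2)

# Only a few outer eigenvalues are needed: one per open channel.
vals = eigsh(T, k=6, which="LA", return_eigenvectors=False)
print(np.sort(vals)[::-1])   # transmission probabilities of the top channels
```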
Small violations of Bell inequalities for multipartite pure random states
NASA Astrophysics Data System (ADS)
Drumond, Raphael C.; Duarte, Cristhiano; Oliveira, Roberto I.
2018-05-01
For any finite number of parts, measurements, and outcomes in a Bell scenario, we estimate the probability of random N-qudit pure states to substantially violate any Bell inequality with uniformly bounded coefficients. We prove that under some conditions on the local dimension, the probability to find any significant amount of violation goes to zero exponentially fast as the number of parts goes to infinity. In addition, we also prove that if the number of parts is at least 3, this probability also goes to zero as the local Hilbert space dimension goes to infinity.
Lightning Strike Peak Current Probabilities as Related to Space Shuttle Operations
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Vaughan, William W.
2000-01-01
A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return-to-launch-site. A literature search was conducted for previous work concerning the occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.
Is there substructure around M87?
NASA Astrophysics Data System (ADS)
Oldham, L. J.; Evans, N. W.
2016-10-01
We present a general method to identify infalling substructure in discrete data sets with position and line-of-sight velocity data. We exploit the fact that galaxies falling on to a brightest cluster galaxy (BCG) in a virialized cluster, or dwarf satellites falling on to a central galaxy like the Milky Way, follow nearly radial orbits. If the orbits are exactly radial, we show how to find the probability distribution for a satellite's energy, given a tracer density for the satellite population, by solving an Abel integral equation. This is an extension of Eddington's classical formula for the isotropic distribution function. When applied to a system of galaxies, clustering in energy space can then be quantified using the Kullback-Leibler divergence, and groups of objects can be identified which, though separated in the sky, may be falling in on the same orbit. This method is tested using mock data and applied to the satellite galaxy population around M87, the BCG in Virgo, and a number of associations that are found, which may represent infalling galaxy groups.
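The clustering step can be sketched with a discrete Kullback-Leibler divergence between per-satellite energy distributions; the distributions below are invented stand-ins for the Abel-inversion output described above:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(p || q) on a common energy grid."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical per-satellite energy posteriors on a shared grid:
grid = np.linspace(-1.0, 0.0, 200)
sat1 = np.exp(-0.5 * ((grid + 0.40) / 0.05) ** 2)
sat2 = np.exp(-0.5 * ((grid + 0.41) / 0.05) ** 2)   # nearly the same energy
sat3 = np.exp(-0.5 * ((grid + 0.80) / 0.05) ** 2)   # a different orbit

print(kl_divergence(sat1, sat2))   # small -> candidates for a common group
print(kl_divergence(sat1, sat3))   # large -> unrelated
```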
NASA Technical Reports Server (NTRS)
2005-01-01
RCW 79 is seen in the southern Milky Way, 17,200 light-years from Earth in the constellation Centaurus. The bubble is 70 light-years in diameter, and probably took about one million years to form from the radiation and winds of hot young stars. The balloon of gas and dust is an example of stimulated star formation. Such stars are born when the hot bubble expands into the interstellar gas and dust around it. RCW 79 has spawned at least two groups of new stars along the edge of the large bubble. Some are visible inside the small bubble in the lower left corner. Another group of baby stars appears near the opening at the top. NASA's Spitzer Space Telescope easily detects infrared light from the dust particles in RCW 79. The young stars within RCW 79 radiate ultraviolet light that excites molecules of dust within the bubble. This causes the dust grains to emit infrared light that is detected by Spitzer and seen here as the extended red features.
Klonner, Günther; Fischer, Stefan; Essl, Franz; Dullinger, Stefan
2016-01-01
The search for traits that make alien species invasive has mostly concentrated on comparing successful invaders and different comparison groups with respect to average trait values. By contrast, little attention has been paid to trait variability among invaders. Here, we combine an analysis of trait differences between invasive and non-invasive species with a comparison of multidimensional trait variability within these two species groups. We collected data on biological and distributional traits for 1402 species of the native, non-woody vascular plant flora of Austria. We then compared the subsets of species recorded and not recorded as invasive aliens anywhere in the world, respectively, first, with respect to the sampled traits using univariate and multiple regression models; and, second, with respect to their multidimensional trait diversity by calculating functional richness and dispersion metrics. Attributes related to competitiveness (strategy type, nitrogen indicator value), habitat use (agricultural and ruderal habitats, occurrence under the montane belt), and propagule pressure (frequency) were most closely associated with invasiveness. However, even the best multiple model, including interactions, only explained a moderate fraction of the differences in invasive success. In addition, multidimensional variability in trait space was even larger among invasive than among non-invasive species. This pronounced variability suggests that invasive success has a considerable idiosyncratic component and is probably highly context specific. We conclude that basing risk assessment protocols on species trait profiles will probably face hardly reducible uncertainties.
Gong, Xingchu; Chen, Huali; Chen, Teng; Qu, Haibin
2014-01-01
The quality by design (QbD) concept is a paradigm for the improvement of botanical injection quality control. In this work, the water precipitation process for the manufacturing of Xueshuantong injection, a botanical injection made from Notoginseng Radix et Rhizoma, was optimized using a design space approach as an example. Saponin recovery and total saponin purity (TSP) in the supernatant were identified as the critical quality attributes (CQAs) of water precipitation using a risk assessment covering all the processes of Xueshuantong injection. An Ishikawa diagram and fractional factorial design experiments were applied to determine the critical process parameters (CPPs). Dry matter content of concentrated extract (DMCC), amount of water added (AWA), and stirring speed (SS) were identified as CPPs. Box-Behnken designed experiments were carried out to develop models between CPPs and process CQAs. Determination coefficients were higher than 0.86 for all the models. High TSP in the supernatant can be obtained when DMCC is low and SS is high. Saponin recoveries decreased as DMCC increased; incomplete collection of supernatant was the main reason for the loss of saponins. The design space was calculated using a Monte-Carlo simulation method with an acceptable probability of 0.90. The recommended normal operation region is located at a DMCC of 0.38-0.41 g/g, an AWA of 3.7-4.9 g/g, and an SS of 280-350 rpm, with a probability of more than 0.919 of attaining the CQA criteria. Verification experiments showed that operating DMCC, SS, and AWA within the design space attains the CQA criteria with high probability.
A tool for simulating collision probabilities of animals with marine renewable energy devices.
Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise
2017-01-01
The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal as well as of the device must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained; numerical details and potential issues and limitations in the application of the resulting probability distributions are highlighted.
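A toy Monte Carlo version of the idea, reduced to a single transit plane (all geometry below is hypothetical, not the tidal-kite case study):

```python
import numpy as np

rng = np.random.default_rng(4)

# An animal transits the device's vertical plane at a random depth and
# lateral offset; the kite occupies a random phase on its circular path.
n = 200_000
animal_depth = rng.uniform(0.0, 40.0, n)       # m below surface
animal_offset = rng.uniform(-50.0, 50.0, n)    # m across the site
kite_center_depth = 20.0                       # m
kite_path_radius = 12.0                        # m, radius of the kite's figure
kite_size = 3.0                                # m, effective body radius
phase = rng.uniform(0, 2 * np.pi, n)           # kite position at transit time
kite_y = kite_path_radius * np.cos(phase)
kite_z = kite_center_depth + kite_path_radius * np.sin(phase)

hit = np.hypot(animal_offset - kite_y, animal_depth - kite_z) < kite_size
print("P(collision | transit) ~", hit.mean())
```

The full 4-D method replaces these uniform draws with transient, three-dimensional trajectories of both the animal and the device.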
International Space Station (ISS) S-Band Corona Discharge Anomaly Consultation
NASA Technical Reports Server (NTRS)
Kichak, Robert A.; Leidecker, Henning; Battel, Steven; Ruitberg, Arthur; Sank, Victor
2008-01-01
The Assembly and Contingency Radio Frequency Group (ACRFG) onboard the International Space Station (ISS) is used for command and control communications and transmits (45 dBm or 32 watts) and receives at S-band. The system is nominally pressurized with gaseous helium (He) and nitrogen (N2) at 8 pounds per square inch absolute (psia). MacDonald, Dettwiler and Associates Ltd. (MDA) was engaged to analyze the operational characteristics of this unit in an effort to determine if the anomalous behavior was a result of a corona event. Based on this analysis, MDA did not recommend continued use of this ACRFG. The NESC was requested to provide expert support in the area of high-voltage corona and multipactoring in an S-Band RF system and to assess the probability of corona occurring in the ACRFG during the planned EVA. The NESC recommended minimal continued use of S/N 002 ACRFG until a replacement unit can be installed. Following replacement, S/N 002 will be subjected to destructive failure analysis in an effort to determine the proximate and root cause(s) of the anomalous behavior.
Primordial black holes and uncertainties in the choice of the window function
NASA Astrophysics Data System (ADS)
Ando, Kenta; Inomata, Keisuke; Kawasaki, Masahiro
2018-05-01
Primordial black holes (PBHs) can be produced by perturbations that exit the horizon during the inflationary phase. While inflation models predict the power spectrum of the perturbations in Fourier space, the PBH abundance depends on the probability distribution function of density perturbations in real space. To estimate the PBH abundance in a given inflation model, we must relate the power spectrum in Fourier space to the probability density function in real space by coarse-graining the perturbations with a window function. However, there is uncertainty about which window function should be used, and this choice can change the relation between the PBH abundance and the power spectrum. This is particularly important for PBHs with mass 30 M⊙, which could account for the LIGO events, because the required power spectrum is severely constrained by the observations. In this paper, we investigate how large an influence the uncertainty in the choice of window function has on the power spectrum required for LIGO PBHs. As a result, we find that this uncertainty significantly affects the prediction for the stochastic gravitational waves induced by the second-order effect of the perturbations. In particular, the pulsar timing array constraints on the produced gravitational waves could disappear for the real-space top-hat window function.
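The coarse-graining step can be sketched as σ²(R) = ∫ dln k P(k) W(kR)², where the choice of W is exactly the uncertainty at issue. A hedged comparison with a toy peaked spectrum (not a specific inflation model):

```python
import numpy as np

def W_gauss(x):
    """Gaussian window in Fourier space."""
    return np.exp(-x**2 / 2)

def W_tophat_real(x):
    """Fourier transform of the real-space top-hat window."""
    return np.where(x < 1e-4, 1.0, 3 * (np.sin(x) - x * np.cos(x)) / x**3)

def sigma2(power, R, window, kmin=1e-2, kmax=1e2, n=4000):
    """Coarse-grained variance: integral of P(k) W(kR)^2 over d ln k."""
    k = np.logspace(np.log10(kmin), np.log10(kmax), n)
    return np.trapz(power(k) * window(k * R) ** 2, np.log(k))

# Toy dimensionless power spectrum with a narrow peak at k = 3:
power = lambda k: 1e-2 * np.exp(-0.5 * ((np.log(k) - np.log(3.0)) / 0.3) ** 2)

R = 1.0 / 3.0   # smoothing scale ~ the scale of the peak
print("Gaussian window:", sigma2(power, R, W_gauss))
print("top-hat window :", sigma2(power, R, W_tophat_real))
```

The two windows give different smoothed variances for the same P(k), and hence different inferred PBH abundances, which is the source of the uncertainty discussed above.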
Distinguishability notion based on Wootters statistical distance: Application to discrete maps
NASA Astrophysics Data System (ADS)
Gomez, Ignacio S.; Portesi, M.; Lamberti, P. W.
2017-08-01
We study the distinguishability notion given by Wootters for states represented by probability density functions. This notion has the particularity that it can also be used to define a statistical distance in chaotic unidimensional maps. Based on that definition, we provide a metric d̄ for an arbitrary discrete map. Moreover, from d̄ we associate a metric space with each invariant density of a given map, which turns out to be the set of all distinguished points when the number of iterations of the map tends to infinity. We also give a characterization of the wandering set of a map in terms of the metric d̄, which allows us to identify the dissipative regions in the phase space. We illustrate the results for the logistic and circle maps, numerically and analytically, and we obtain d̄ and the wandering set for some characteristic values of their parameters. Finally, an extension of the associated metric space to arbitrary probability distributions (not necessarily invariant densities) is given along with some consequences. The statistical properties of distributions given by histograms are characterized in terms of the cardinality of the associated metric space. For two conjugate variables, the uncertainty principle is expressed in terms of the diameters of the metric spaces associated with those variables.
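For discrete distributions, Wootters' statistical distance is the angle d(p,q) = arccos Σ_i √(p_i q_i) between square-root probability vectors. A hedged sketch applying it to histograms of logistic-map orbits (parameters illustrative):

```python
import numpy as np

def wootters_distance(p, q):
    """Wootters statistical distance: the angle between the square-root
    vectors of two discrete probability distributions on the unit sphere."""
    p = np.asarray(p, float); q = np.asarray(q, float)
    p /= p.sum(); q /= q.sum()
    fidelity = np.sum(np.sqrt(p * q))
    return float(np.arccos(np.clip(fidelity, -1.0, 1.0)))

def logistic_orbit(x0, r=4.0, n=10000):
    """Iterates of the logistic map x -> r x (1 - x)."""
    xs = np.empty(n); x = x0
    for i in range(n):
        x = r * x * (1 - x); xs[i] = x
    return xs

h1, _ = np.histogram(logistic_orbit(0.2), bins=50, range=(0, 1))
h2, _ = np.histogram(logistic_orbit(0.200001), bins=50, range=(0, 1))
print(wootters_distance(h1, h2))
```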
NASA Astrophysics Data System (ADS)
Butler, G. V.
1981-04-01
Early space station designs are considered, taking into account Herman Oberth's first space station, the London Daily Mail study, the first major space station design developed during the Moon mission, and the Manned Orbiting Laboratory program of the DOD. Attention is given to Skylab, new space station studies, the Shuttle and Spacelab, communication satellites, solar power satellites, a 30-meter-diameter radiometer for geological measurements and agricultural assessments, the mining of the Moon, and questions of international cooperation. It is thought very probable that there will be very large space stations at some time in the future. For the more immediate future, however, a step-by-step development starting with Spacelab stations of 3-4 men is envisaged.
Dynamical Epidemic Suppression Using Stochastic Prediction and Control
2004-10-28
The initial probability density function (PDF), p: D ⊂ R² → R, is evolved by the stochastic Frobenius-Perron operator. To analyze the qualitative change induced by noise, we apply the technique of the stochastic Frobenius-Perron operator [L. Billings et al., Phys. Rev. Lett.], approximating it by a transition matrix that describes the probability of transport from one region of phase space to another.
Victor A. Rudis
2000-01-01
Scant information exists about the spatial extent of human impact on forest resource supplies, i.e., depreciative and nonforest uses. I used observations of ground-sampled land use and intrusions on forest land to map the probability of resource use and human impact for broad areas. Data came from a seven-state survey region (Alabama, Arkansas, Louisiana, Mississippi,...
The NASA/MSFC Coherent Lidar Technology Advisory Team
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.
1999-01-01
The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified as: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in Jan. 1998, July 1998, and Jan. 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds. The meetings have been very productive. Topics discussed include the SPARCLE technology validation plan including pre-launch end-to-end testing, the space-based wind mission roadmap beyond SPARCLE and its implications on the resultant technology development, the current values and proposed future advancement in lidar system efficiency, and the difference between using single-mode fiber optical mixing vs. the traditional free space optical mixing.
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Huff, J. L.; Cucinotta, F. A.
2011-01-01
Future long-term space travel will face challenges from radiation concerns, as the space environment poses health risks to humans from radiation with high biological effectiveness and adverse post-flight long-term effects. Solar particle events may dramatically affect crew performance, while Galactic Cosmic Rays will induce a chronic exposure to high-linear-energy-transfer (LET) particles. These types of radiation, not present at ground level, can increase the probability of a fatal cancer later in an astronaut's life. No feasible shielding from radiation in space is possible, especially for the heavy-ion component, as suggested solutions would require a dramatic increase in the mass of the mission. Our research group focuses on fundamental research and strategic analysis leading to better shielding design and to a better understanding of the biological mechanisms of radiation damage. We present our recent effort to model DNA damage and tissue damage using computational models based on the physics of heavy-ion radiation, DNA structure, and DNA damage and repair in human cells. Our particular areas of expertise include clustered DNA damage from high-LET radiation, the visualization of DSBs (DNA double strand breaks) via DNA damage foci, image analysis and the statistics of the foci for different experimental situations, chromosomal aberration formation through DSB misrepair, the kinetics of DSB repair leading to a model-derived spectrum of chromosomal aberrations, and, finally, the simulation of human tissue and the pattern of apoptotic cell damage. This compendium of theoretical and experimental data sheds light on the complex nature of radiation interacting with human DNA, cells and tissues, which can lead to mutagenesis and carcinogenesis later in human life after the space mission.
Wavefronts, actions and caustics determined by the probability density of an Airy beam
NASA Astrophysics Data System (ADS)
Espíndola-Ramos, Ernesto; Silva-Ortigoza, Gilberto; Sosa-Sánchez, Citlalli Teresa; Julián-Macías, Israel; de Jesús Cabrera-Rosas, Omar; Ortega-Vidals, Paula; Alejandro Juárez-Reyes, Salvador; González-Juárez, Adriana; Silva-Ortigoza, Ramón
2018-07-01
The main contribution of the present work is to use the probability density of an Airy beam to identify its maxima with the family of caustics associated with the wavefronts determined by the level curves of a one-parameter family of solutions to the Hamilton–Jacobi equation with a given potential. To this end, we give a classical mechanics characterization of a solution of the one-dimensional Schrödinger equation in free space determined by a complete integral of the Hamilton–Jacobi and Laplace equations in free space. That is, with this type of solution, we associate a two-parameter family of wavefronts in the spacetime, which are the level curves of a one-parameter family of solutions to the Hamilton–Jacobi equation with a determined potential, and a one-parameter family of caustics. The general results are applied to an Airy beam to show that the maxima of its probability density provide a discrete set of: caustics, wavefronts and potentials. The results presented here are a natural generalization of those obtained by Berry and Balazs in 1979 for an Airy beam. Finally, we remark that, in a natural manner, each maxima of the probability density of an Airy beam determines a Hamiltonian system.
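The Berry-Balazs correspondence that this work generalizes can be checked numerically: in the dimensionless units of the 1979 paper, the Airy packet density |Ai(x - (t/2)²)|² has its maxima riding the parabolic caustic x = (t/2)² + a_k, where a_k are the maxima of Ai². A short sketch:

```python
import numpy as np
from scipy.special import airy

# |psi(x,t)|^2 = Ai(x - (t/2)^2)^2 for the Berry-Balazs Airy packet
# (units with the packet parameter, hbar and mass set to 1).
x = np.linspace(-20, 5, 4000)
for t in (0.0, 2.0, 4.0):
    ai = airy(x - (t / 2) ** 2)[0]       # airy() returns (Ai, Ai', Bi, Bi')
    x_peak = x[np.argmax(ai ** 2)]       # main maximum of the density
    caustic = (t / 2) ** 2 - 1.0188      # parabola shifted by Ai's first max
    print(f"t={t}: peak at x={x_peak:.2f}, caustic at x={caustic:.2f}")
```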
The rational status of quantum cognition.
Pothos, Emmanuel M; Busemeyer, Jerome R; Shiffrin, Richard M; Yearsley, James M
2017-07-01
Classic probability theory (CPT) is generally considered the rational way to make inferences, but there have been some empirical findings showing a divergence between reasoning and the principles of classical probability theory (CPT), inviting the conclusion that humans are irrational. Perhaps the most famous of these findings is the conjunction fallacy (CF). Recently, the CF has been shown consistent with the principles of an alternative probabilistic framework, quantum probability theory (QPT). Does this imply that QPT is irrational or does QPT provide an alternative interpretation of rationality? Our presentation consists of 3 parts. First, we examine the putative rational status of QPT using the same argument as used to establish the rationality of CPT, the Dutch Book (DB) argument, according to which reasoners should not commit to bets guaranteeing a loss. We prove the rational status of QPT by formulating it as a particular case of an extended form of CPT, with separate probability spaces produced by changing context. Second, we empirically examine the key requirement for whether a CF can be rational or not; the results show that participants indeed behave rationally, at least relative to the representations they employ. Finally, we consider whether the conditions for the CF to be rational are applicable in the outside (nonmental) world. Our discussion provides a general and alternative perspective for rational probabilistic inference, based on the idea that contextuality requires either reasoning in separate CPT probability spaces or reasoning with QPT principles. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
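The QPT account of the conjunction fallacy can be sketched with noncommuting projectors in a two-dimensional real Hilbert space: judging "feminist, then bank teller" sequentially can yield a higher probability than judging "bank teller" directly. The angles below are illustrative, not fitted to data:

```python
import numpy as np

def proj(theta):
    """Rank-1 projector onto direction theta in a real 2-d Hilbert space."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # belief state evoked by the "Linda" story
F = proj(np.deg2rad(20))            # "feminist": nearly aligned with the story
B = proj(np.deg2rad(80))            # "bank teller": nearly orthogonal to it

p_bank = np.linalg.norm(B @ psi) ** 2                  # direct judgment
p_fem_then_bank = np.linalg.norm(B @ (F @ psi)) ** 2   # sequential conjunction
print(p_bank, p_fem_then_bank)      # sequential > direct: the "fallacy"
```

Because F and B do not commute, the sequential judgment amounts to reasoning in a changed context, which is exactly the contextuality mechanism the abstract invokes.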
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon
2015-01-01
This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
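A hedged toy version of the false-positive side of such a calculation, assuming independent sensors and a k-of-n voting trigger (the probabilities and the 2-of-3 architecture are invented, echoing the paper's use of notional reliability data):

```python
from math import comb

def k_of_n(p, k, n):
    """P(at least k of n independent sensors vote 'abort')."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Per-sensor false-positive probability with and without sensor data
# qualification (SDQ) screening out suspect readings; numbers are made up.
for p, label in [(1e-3, "no SDQ "), (2e-4, "with SDQ")]:
    print(label, "trigger FP probability:", k_of_n(p, 2, 3))
```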
Structural study of quasi-one-dimensional vanadium pyroxene LiVSi{sub 2}O{sub 6} single crystals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ishii, Yuto; Matsushita, Yoshitaka; Oda, Migaku
Single crystals of quasi-one-dimensional vanadium pyroxene LiVSi2O6 were synthesized and the crystal structures at 293 K and 113 K were studied using X-ray diffraction experiments. We found a structural phase transition from the room-temperature crystal structure with space group C2/c to a low-temperature structure with space group P21/c, resulting from a rotational displacement of SiO4 tetrahedra. The temperature dependence of the magnetic susceptibility shows a broad maximum around 116 K, suggesting an opening of the Haldane gap expected for one-dimensional antiferromagnets with S=1. However, an antiferromagnetic long-range order developed below 24 K, probably caused by a weak inter-chain magnetic coupling in the compound. - Graphical abstract: Low-temperature crystal structure of LiVSi2O6 and an orbital arrangement within the V-O zig-zag chain along the c-axis. - Highlights: • A low-temperature structure of LiVSi2O6 was determined by single-crystal X-ray diffraction measurements. • The origin of the structural transition is a rotational displacement of SiO4 tetrahedra. • The uniform orbital overlap in the V-O zigzag chain makes the system a quasi-one-dimensional antiferromagnet.
Environmental Equity and the Role of Public Policy: Experiences in the Rijnmond Region
NASA Astrophysics Data System (ADS)
Kruize, Hanneke; Driessen, Peter P. J.; Glasbergen, Pieter; van Egmond, Klaas (N. D.)
2007-10-01
This study of environmental equity uses secondary quantitative data to analyze socioeconomic disparities in environmental conditions in the Rijnmond region of the Netherlands. The disparities of selected environmental indicators—exposure to traffic noise (road, rail, and air), NO2, external safety risks, and the availability of public green space—are analyzed both separately and in combination. Not only exposures to environmental burdens (“bads”) were investigated, but also access to environmental benefits (“goods”). Additionally, we held interviews and reviewed documents to grasp the mechanisms underlying the environmental equity situation, with an emphasis on the role of public policy. Environmental equity is not a priority in public policy for the greater Rotterdam region known as the Rijnmond region, yet environmental standards have been established to provide a minimum environmental quality to all local residents. In general, environmental quality has improved in this region, and the accumulation of negative environmental outcomes (“bads”) has been limited. However, environmental standards for road traffic noise and NO2 are being exceeded, probably because of the pressure on space and the traffic intensity. We found an association of environmental “bads” with income for rail traffic noise and availability of public green space. In the absence of regulation, positive environmental outcomes (“goods”) are mainly left up to market forces. Consequently, higher-income groups generally have more access to environmental “goods” than lower-income groups.
Low-Rank Discriminant Embedding for Multiview Learning.
Li, Jingjing; Wu, Yue; Zhao, Jidong; Lu, Ke
2017-11-01
This paper focuses on the specific problem of multiview learning where samples have the same feature set but different probability distributions, e.g., different viewpoints or different modalities. Since samples lying in different distributions cannot be compared directly, this paper aims to learn a latent subspace shared by multiple views, assuming that the input views are generated from this latent subspace. Previous approaches usually learn the common subspace by either maximizing the empirical likelihood or preserving the geometric structure. However, considering the complementarity between the two objectives, this paper proposes a novel approach, named low-rank discriminant embedding (LRDE), for multiview learning that takes full advantage of both. By further considering the duality between data points and features of a multiview scene, i.e., data points can be grouped based on their distribution over features, while features can be grouped based on their distribution over the data points, LRDE not only deploys low-rank constraints on both the sample level and the feature level to dig out the shared factors across different views, but also preserves geometric information in both the ambient sample space and the embedding feature space by designing a novel graph structure under the framework of graph embedding. Finally, LRDE jointly optimizes low-rank representation and graph embedding in a unified framework. Comprehensive experiments in both multiview and pairwise manners demonstrate that LRDE performs much better than previous approaches proposed in the recent literature.
Space logistics simulation: Launch-on-time
NASA Technical Reports Server (NTRS)
Nii, Kendall M.
1990-01-01
During 1989-1990 the Center for Space Construction developed the Launch-On-Time (L-O-T) Model to help assess and improve the likelihood of successfully supporting space construction requiring multiple logistic delivery flights. The model chose a reference by which the L-O-T probability and improvements to the L-O-T probability can be judged. The measure of improvement was chosen as the percent reduction in E(S(sub N)), the total expected amount of unscheduled 'hold' time. We have also previously developed an approach to determining the reduction in E(S(sub N)) by reducing some of the causes of unscheduled holds and increasing the speed at which the problems causing the holds may be 'fixed.' We provided a mathematical (binary linear programming) model for measuring the percent reduction in E(S(sub N)) given such improvements. In this presentation we exercise the model, draw some conclusions about the methods used and the data available and needed, and make suggestions for areas of improvement in 'real world' application of the model.
NASA Technical Reports Server (NTRS)
Piascik, Robert S.; Prosser, William H.
2011-01-01
The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the findings and recommendations from the NESC assessment.
NASA Technical Reports Server (NTRS)
1972-01-01
The Accident Model Document is one of three documents of the Preliminary Safety Analysis Report (PSAR) - Reactor System as applied to a Space Base Program. Potential terrestrial nuclear hazards involving the zirconium hydride reactor-Brayton power module are identified for all phases of the Space Base program. The accidents/events that give rise to the hazards are defined, and abort sequence trees are developed to determine the sequence of events leading to each hazard and the associated probabilities of occurrence. Source terms are calculated to determine the magnitude of the hazards. The above data are used in the mission accident analysis to determine the most probable and significant accidents/events in each mission phase. The only significant hazards during the prelaunch and launch ascent phases of the mission are those which arise from criticality accidents. Fission product inventories during this time period were found to be very low due to very limited low-power acceptance testing.
NASA Technical Reports Server (NTRS)
Piascik, Robert S.; Prosser, William H.
2011-01-01
The Director of the NASA Engineering and Safety Center (NESC) requested an independent assessment of the anomalous gaseous hydrogen (GH2) flow incident on the Space Shuttle Program (SSP) Orbiter Vehicle (OV)-105 during the Space Transportation System (STS)-126 mission. The main propulsion system (MPS) engine #2 GH2 flow control valve (FCV) LV-57 transitioned from the low towards the high flow position without being commanded. Post-flight examination revealed that the FCV LV-57 poppet had experienced a fatigue failure that liberated a section of the poppet flange. The NESC assessment provided a peer review of the computational fluid dynamics (CFD), stress analysis, and impact testing. A probability of detection (POD) study was requested by the SSP Orbiter Project for the eddy current (EC) nondestructive evaluation (NDE) techniques that were developed to inspect the flight FCV poppets. This report contains the Appendices to the main report.
NASA Astrophysics Data System (ADS)
Auslander, Joseph Simcha
We begin by defining the concept of `open' Markov processes, which are continuous-time Markov chains where probability can flow in and out through certain `boundary' states. We study open Markov processes which in the absence of such boundary flows admit equilibrium states satisfying detailed balance, meaning that the net flow of probability vanishes between all pairs of states. External couplings which fix the probabilities of boundary states can maintain such systems in non-equilibrium steady states in which non-zero probability currents flow. We show that these non-equilibrium steady states minimize a quadratic form which we call 'dissipation.' This is closely related to Prigogine's principle of minimum entropy production. We bound the rate of change of the entropy of a driven non-equilibrium steady state relative to the underlying equilibrium state in terms of the flow of probability through the boundary of the process. We then consider open Markov processes as morphisms in a symmetric monoidal category by splitting up their boundary states into certain sets of `inputs' and `outputs.' Composition corresponds to gluing the outputs of one such open Markov process onto the inputs of another so that the probability flowing out of the first process is equal to the probability flowing into the second. Tensoring in this category corresponds to placing two such systems side by side. We construct a `black-box' functor characterizing the behavior of an open Markov process in terms of the space of possible steady state probabilities and probability currents along the boundary. The fact that this is a functor means that the behavior of a composite open Markov process can be computed by composing the behaviors of the open Markov processes from which it is composed. We prove a similar black-boxing theorem for reaction networks whose dynamics are given by the non-linear rate equation. Along the way we describe a more general category of open dynamical systems where composition corresponds to gluing together open dynamical systems.
Thaker, Maria; Vanak, Abi T; Owen, Cailey R; Ogden, Monika B; Niemann, Sophie M; Slotow, Rob
2011-02-01
Studies that focus on single predator-prey interactions can be inadequate for understanding antipredator responses in multi-predator systems. Yet there is still a general lack of information about the strategies of prey to minimize predation risk from multiple predators at the landscape level. Here we examined the distribution of seven African ungulate species in the fenced Karongwe Game Reserve (KGR), South Africa, as a function of predation risk from all large carnivore species (lion, leopard, cheetah, African wild dog, and spotted hyena). Using observed kill data, we generated ungulate-specific predictions of relative predation risk and of the riskiness of habitats. To determine how ungulates minimize predation risk at the landscape level, we explicitly tested five hypotheses consisting of strategies that reduce the probability of encountering predators and the probability of being killed. All ungulate species avoided risky habitats, and most selected safer habitats, thus reducing their probability of being killed. To reduce the probability of encountering predators, most of the smaller prey species (impala, warthog, waterbuck, kudu) avoided the space use of all predators, while the larger species (wildebeest, zebra, giraffe) only avoided areas where lion and leopard space use were high. The strength of avoidance of predator space use generally did not correspond to the relative predation threat from those predators. Instead, ungulates used a simpler behavioral rule of avoiding the activity areas of sit-and-pursue predators (lion and leopard), but not those of cursorial predators (cheetah and African wild dog). In general, selection and avoidance of habitats was stronger than avoidance of the predator activity areas. We expect similar decision rules to drive the distribution pattern of ungulates in other African savannas and in other multi-predator systems, especially where predators differ in their hunting modes.
NASA Technical Reports Server (NTRS)
Vitali, Roberto; Lutomski, Michael G.
2004-01-01
National Aeronautics and Space Administration's (NASA) International Space Station (ISS) Program uses Probabilistic Risk Assessment (PRA) as part of its Continuous Risk Management Process. It is used as a decision and management support tool not only to quantify risk for specific conditions, but, more importantly, to compare different operational and management options, determine the lowest-risk option, and provide rationale for management decisions. This paper presents the derivation of the probability distributions used to quantify the failure rates and the probabilities of failure of the basic events employed in the PRA model of the ISS. The paper shows how a Bayesian approach was used with different sources of data, including the actual ISS on-orbit failures, to enhance confidence in the results of the PRA. As time progresses and more meaningful data are gathered from on-orbit failures, an increasingly accurate failure rate probability distribution for the basic events of the ISS PRA model can be obtained. The ISS PRA has been developed by mapping the ISS critical systems, such as propulsion, thermal control, or power generation, into event sequence diagrams and fault trees. The lowest level of indenture of the fault trees was the orbital replacement unit (ORU). The ORU level was chosen consistent with the level of statistically meaningful data that could be obtained from the aerospace industry and from experts in the field. For example, data were gathered for the solenoid valves present in the propulsion system of the ISS. However, valves themselves are composed of parts, and the individual failure of these parts was not accounted for in the PRA model; in other words, the failure of a spring within a valve was considered a failure of the valve itself.
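As a concrete illustration of this kind of Bayesian updating, the sketch below applies a conjugate Gamma-Poisson update to a single failure rate. The gamma prior and the on-orbit evidence are invented placeholders, not ISS PRA data, and the conjugate choice is a standard convention rather than the paper's stated method.

    from scipy import stats

    # Gamma prior on an ORU failure rate (failures per hour), e.g. elicited
    # from industry data; all numbers below are fictitious.
    alpha_prior, beta_prior = 2.0, 4.0e5      # shape, rate
    k_failures, hours = 1, 6.0e4              # observed on-orbit failures, exposure

    # Conjugate update for Poisson-distributed failure counts:
    # posterior is Gamma(alpha + k, beta + T)
    alpha_post = alpha_prior + k_failures
    beta_post = beta_prior + hours

    posterior = stats.gamma(a=alpha_post, scale=1.0 / beta_post)
    print("posterior mean rate:", posterior.mean())
    print("90% credible interval:", posterior.interval(0.90))

As more on-orbit exposure accumulates, beta_post grows and the posterior tightens, which is the convergence behavior described above.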
Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt
2017-11-01
When a location is cued, targets appearing at that location are detected more quickly. When a target feature is cued, targets bearing that feature are detected more quickly. These attentional cueing effects are only superficially similar. More detailed analyses find distinct temporal and accuracy profiles for the two different types of cues. This pattern parallels work with probability manipulations, where both feature and spatial probability are known to affect detection accuracy and reaction times. However, little has been done by way of comparing these effects. Are probability manipulations on space and features distinct? In a series of five experiments, we systematically varied spatial probability and feature probability along two dimensions (orientation or color). In addition, we decomposed response times into initiation and movement components. Targets appearing at the probable location were reported more quickly and more accurately regardless of whether the report was based on orientation or color. On the other hand, when either color probability or orientation probability was manipulated, response time and accuracy improvements were specific for that probable feature dimension. Decomposition of the response time benefits demonstrated that spatial probability only affected initiation times, whereas manipulations of feature probability affected both initiation and movement times. As detection was made more difficult, the two effects further diverged, with spatial probability disproportionately affecting initiation times and feature probability disproportionately affecting accuracy. In conclusion, all manipulations of probability, whether spatial or featural, affect detection. However, only feature probability affects perceptual precision, and precision effects are specific to the probable attribute.
Quantitative evaluation of Alzheimer's disease
NASA Astrophysics Data System (ADS)
Duchesne, S.; Frisoni, G. B.
2009-02-01
We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, into which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance of eigencoordinates for a given subject from the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots and Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
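A rough sketch of a DEF-like computation follows, using scikit-learn on mock data; the feature matrices, the component count, and the use of LDA coefficients as salience weights are all assumptions for illustration, not the authors' pipeline.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(3)
    X_ref = rng.standard_normal((150, 50))     # stand-in for reference MRI appearance features
    pca = PCA(n_components=10).fit(X_ref)      # reference eigenspace

    X_ctrl = pca.transform(rng.standard_normal((75, 50)))       # mock CTRL subjects
    X_ad = pca.transform(rng.standard_normal((75, 50)) + 0.4)   # mock probable AD subjects

    lda = LinearDiscriminantAnalysis().fit(np.vstack([X_ctrl, X_ad]), [0] * 75 + [1] * 75)
    w = np.abs(lda.coef_[0])                   # salience of each eigencoordinate
    mu = X_ctrl.mean(axis=0)

    def def_score(x):
        """Weighted distance from the CTRL mean along the separating directions."""
        return np.sqrt(np.sum(w * (x - mu) ** 2))

    print("mean DEF, CTRL:", np.mean([def_score(x) for x in X_ctrl]))
    print("mean DEF, AD:  ", np.mean([def_score(x) for x in X_ad]))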
Chicoli, A.; Butail, S.; Lun, Y.; Bak-Coleman, J.; Coombs, S.; Paley, D.A.
2014-01-01
To assess how flow affects school structure and threat detection, the startle responses of solitary giant danio Devario aequipinnatus and of small groups to visual looming stimuli were compared in flow and no-flow conditions. The instantaneous position and heading of each D. aequipinnatus were extracted from high-speed videos. Behavioural results indicate that (1) school structure is altered in flow such that D. aequipinnatus orient upstream while spanning out in a crosswise direction, (2) the probability of at least one D. aequipinnatus detecting the visual looming stimulus is higher in flow than in no flow for both solitary D. aequipinnatus and groups of eight D. aequipinnatus, but (3) the probability of three or more individuals responding is higher in no flow than in flow. Taken together, these results indicate a higher probability of stimulus detection in flow but a higher probability of internal transmission of information in no flow. Finally, the results were well predicted by a computational model of collective fright response that included the probability of direct detection (based on signal detection theory) and indirect detection (i.e. via interactions between group members) of threatening stimuli. This model provides a new theoretical framework for analysing the collective transfer of information among groups of fishes and other organisms. PMID:24773538
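The direct-detection ingredient of such a model can be sketched in a few lines; the per-fish detection probabilities below are made-up values, and a real signal-detection treatment would derive them from hit and false-alarm rates rather than assume them.

    # P(at least one of n independent fish detects the looming stimulus),
    # the simplest building block of a collective detection model.
    def p_any_detect(p, n):
        return 1.0 - (1.0 - p) ** n

    for p, n in [(0.4, 1), (0.4, 8), (0.6, 8)]:
        print(f"per-fish p={p}, group size n={n}: {p_any_detect(p, n):.3f}")

Even this toy calculation reproduces the qualitative point that groups outperform solitary observers at detecting a threat, before any indirect (social) transmission is modeled.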
NASA Astrophysics Data System (ADS)
Gandomkar, Ziba; Brennan, Patrick C.; Mello-Thoms, Claudia
2017-03-01
Mitotic count is helpful in determining the aggressiveness of breast cancer. In previous studies, it was shown that the agreement among pathologists for grading mitotic index is fairly modest, as mitoses have a large variety of appearances and can be mistaken for other similar objects. In this study, we determined local and contextual features that differ significantly between easily identifiable mitoses and challenging ones. The images were obtained from the Mitosis-Atypia 2014 challenge. In total, the dataset contained 453 mitotic figures. Two pathologists annotated each mitotic figure. In case of disagreement, an opinion from a third pathologist was requested. The mitoses were grouped into three categories: those recognized as “a true mitosis” by both pathologists, those labelled as “a true mitosis” by only one of the first two readers and also the third pathologist, and those annotated as “probably a mitosis” by all readers or the majority of them. After color unmixing, the mitoses were segmented from the H channel. Shape-based features along with intensity-based and textural features were extracted from the H channel, the blue-ratio channel, and five different color spaces. Holistic features describing each image were also considered. The Kruskal-Wallis H test was used to identify significantly different features. Multiple comparisons were done using the rank-based version of the Tukey-Kramer test. The results indicated that there are local and global features which differ significantly among the groups. In addition, variations between mitoses in different groups were captured in the features from the HSL and LCH color spaces more than in the other ones.
Effects of dynamical grouping on cooperation in N-person evolutionary snowdrift game
NASA Astrophysics Data System (ADS)
Ji, M.; Xu, C.; Hui, P. M.
2011-09-01
A population typically consists of agents that continually distribute themselves into different groups at different times. This dynamic grouping has recently been shown to be essential in explaining many features observed in human activities, including social, economic, and military activities. We study the effects of dynamic grouping on the level of cooperation in a modified evolutionary N-person snowdrift game. Due to the formation of dynamical groups, the competition takes place in groups of different sizes at different times, and players of different strategies are mixed by the grouping dynamics. It is found that the level of cooperation is greatly enhanced by the dynamic grouping of agents, when compared with a static population of the same size. As the parameter β, which characterizes the relative importance of the reward and cost, increases, the fraction of cooperative players fC increases and it is possible to achieve a fully cooperative state. Analytically, we present a dynamical equation that incorporates the effects of the competing game and the group size distribution. The distribution of cooperators in different groups is assumed to be binomial, which is confirmed by simulations. Results from the analytic equation are in good agreement with numerical results from simulations. We also present detailed simulation results for fC over the parameter space spanned by the probabilities of group coalescence νm and group fragmentation νp in the grouping dynamics. A high νm and a low νp promote cooperation, and a favorable reward characterized by a high β leads to a fully cooperative state.
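The grouping dynamics alone are easy to simulate; the sketch below implements only the coalescence/fragmentation step with the probabilities νm and νp. The payoff and strategy-update parts of the game are omitted, and all parameter values are arbitrary.

    import random

    # One step of the dynamic grouping: with probability nu_m two randomly
    # chosen groups merge; with probability nu_p a randomly chosen group
    # fragments into isolated agents. Total population size is conserved.
    def step(groups, nu_m, nu_p):
        if len(groups) > 1 and random.random() < nu_m:
            a, b = random.sample(range(len(groups)), 2)
            groups[a] += groups[b]
            groups.pop(b)
        if random.random() < nu_p:
            g = groups.pop(random.randrange(len(groups)))
            groups.extend([1] * g)
        return groups

    random.seed(0)
    groups = [1] * 100                 # start from 100 isolated agents
    for _ in range(10000):
        groups = step(groups, nu_m=0.3, nu_p=0.05)
    print("groups:", len(groups), "largest group:", max(groups))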
About Schrödinger Equation on Fractal Curves Imbedding in R3
NASA Astrophysics Data System (ADS)
Golmankhaneh, Alireza Khalili; Golmankhaneh, Ali Khalili; Baleanu, Dumitru
2015-04-01
In this paper we introduce quantum mechanics on fractal time-space. In the suggested formalism, time and space vary on a Cantor set and a von Koch curve, respectively. Using the Feynman path method in quantum mechanics and Fα-calculus, we find the Schrödinger equation on fractal time-space. The fractal Hamiltonian and momentum operators are indicated. Moreover, the continuity equation and the probability density are given in view of Fα-calculus.
Joint search and sensor management for geosynchronous satellites
NASA Astrophysics Data System (ADS)
Zatezalo, A.; El-Fallah, A.; Mahler, R.; Mehra, R. K.; Pham, K.
2008-04-01
Joint search and sensor management for space situational awareness presents daunting scientific and practical challenges, as it requires a simultaneous search for new space objects and catalog updates of the current ones. We demonstrate a new approach to joint search and sensor management by utilizing the Posterior Expected Number of Targets (PENT) as the objective function, an observation model for a space-based EO/IR sensor, and a Probability Hypothesis Density Particle Filter (PHD-PF) tracker. Simulation and results using actual geosynchronous satellites are presented.
Atmospheric constraint statistics for the Space Shuttle mission planning
NASA Technical Reports Server (NTRS)
Smith, O. E.
1983-01-01
The procedures used to establish statistics of atmospheric constraints of interest for Space Shuttle mission planning are presented. The statistics considered are the frequency of occurrence, runs, and time-conditional probabilities of several atmospheric constraints for each of the Space Shuttle mission phases. The mission phases considered are (1) prelaunch, (2) launch operations, (3) return to launch site, (4) abort-once-around landing, and (5) end-of-mission landing. Previously announced in STAR as N82-33417
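These three kinds of statistics are straightforward to compute from a time series of constraint exceedances; the sketch below does so for a synthetic boolean series, where the 20% exceedance rate and hourly sampling are invented for illustration.

    import numpy as np

    rng = np.random.default_rng(5)
    v = rng.random(1000) < 0.2               # synthetic hourly constraint violations

    freq = v.mean()                          # frequency of occurrence
    padded = np.concatenate(([0], v.astype(int), [0]))
    edges = np.flatnonzero(np.diff(padded))  # run starts and ends, interleaved
    run_lengths = edges[1::2] - edges[::2]
    # time-conditional probability: P(violated at t+1 | violated at t)
    p_cond = (v[1:] & v[:-1]).sum() / v[:-1].sum()

    print(f"frequency {freq:.2f}, mean run {run_lengths.mean():.1f} h, "
          f"P(next hour | this hour) {p_cond:.2f}")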
Arms control and The President's Strategic Defense Initiative
NASA Astrophysics Data System (ADS)
Bon, J. J.
1985-04-01
The President's Strategic Defense Initiative (SDI) provides the hope for eliminating the threat from ballistic missiles. This study evaluates the impact of SDI on existing and future arms control agreements. Because new or modified space-related treaties are a probable result of the SDI, this study concludes that the best single strategy for arms control negotiations is to preserve overall US interests and maintain open technological options vice severely limiting any space technology that might some day become part of a space-based defensive system.
On the violation of the invariance of the light speed in theoretical investigations
NASA Astrophysics Data System (ADS)
Chubykalo, A.; Espinoza, A.; Gonzalez-Sanchez, A.; Gutiérrez Rodríguez, A.
2017-11-01
In this review, we analyze some of the most important theoretical attempts to challenge the invariance of the light speed postulated by the Special Theory of Relativity (STR). Most of those studies, however, show that STR has great stability with respect to various kinds of modifications of its axioms. This stability is probably due to the fact that these modifications involve not so much a violation of the physical postulate of the invariance of the speed of light as its mathematical extension, resorting to a more general affine space. In these modifications, we refer to more general transformation groups, including a scale transformation of the speed of light and time, c′ = γc, t′ = γ⁻¹t.
Solar observations carried out at the INAF - Catania Astrophysical Observatory
NASA Astrophysics Data System (ADS)
Zuccarello, F.; Contarino, L.; Romano, P.
2011-10-01
Solar observations at the INAF - Catania Astrophysical Observatory are carried out by means of an equatorial spar, which includes: a Cook refractor, used to make daily drawings of sunspot groups from visual observations; a 150-mm refractor with an Hα Lyot filter for chromospheric observations; a 150-mm refractor feeding an Hα Halle filter for limb observations of the chromosphere. The photospheric and chromospheric data are daily distributed to several international Solar Data Centers. Recently, a program of Flare Warning has been implemented, with the aim of determining the probability that an active region yields a flare on the basis of its characteristics deduced from optical observations. Some science results obtained by means of solar data acquired at the INAF - Catania Astrophysical Observatory, as well as by space-instrument data, are briefly described.
Student Solution Manual for Mathematical Methods for Physics and Engineering Third Edition
NASA Astrophysics Data System (ADS)
Riley, K. F.; Hobson, M. P.
2006-03-01
Preface; 1. Preliminary algebra; 2. Preliminary calculus; 3. Complex numbers and hyperbolic functions; 4. Series and limits; 5. Partial differentiation; 6. Multiple integrals; 7. Vector algebra; 8. Matrices and vector spaces; 9. Normal modes; 10. Vector calculus; 11. Line, surface and volume integrals; 12. Fourier series; 13. Integral transforms; 14. First-order ordinary differential equations; 15. Higher-order ordinary differential equations; 16. Series solutions of ordinary differential equations; 17. Eigenfunction methods for differential equations; 18. Special functions; 19. Quantum operators; 20. Partial differential equations: general and particular; 21. Partial differential equations: separation of variables; 22. Calculus of variations; 23. Integral equations; 24. Complex variables; 25. Application of complex variables; 26. Tensors; 27. Numerical methods; 28. Group theory; 29. Representation theory; 30. Probability; 31. Statistics.
Disorder in Ag7GeSe5I, a superionic conductor: temperature-dependent anharmonic structural study.
Albert, Stéphanie; Pillet, Sébastien; Lecomte, Claude; Pradel, Annie; Ribes, Michel
2008-02-01
A temperature-dependent structural investigation of the substituted argyrodite Ag7GeSe5I has been carried out on a single crystal from 15 to 475 K, in steps of 50 K, and correlated to its conductivity properties. The argyrodite crystallizes in a cubic cell with the F4̄3m space group. The crystal structure exhibits high static and dynamic disorder, which has been efficiently accounted for using a combination of (i) a Gram-Charlier development of the Debye-Waller factors for iodine and silver, and (ii) a split-atom model for Ag+ ions. An increased delocalization of the mobile d10 Ag+ cations with temperature has been clearly shown by inspection of the joint probability-density functions; the corresponding diffusion pathways have been determined.
Heer, D M
1986-01-01
The impact of the number, order, and spacing of siblings on child and adult outcomes has been the topic of research by scholars in 4 separate fields (human biology, psychology, sociology, and economics), and the barriers to communication between academic disciplines are strong. Also, most researchers have had to work with data sets gathered for other purposes, which has resulted in a relative inadequacy of research. Social scientists have 3 theories concerning the relationship between the number, order, and spacing of siblings and child and adult outcomes: that an increase in the number of siblings or a decrease in the spacing between them dilutes the time and material resources that parents can give to each child, and that these resource dilutions hinder the outcome for each child; that account must be taken not only of parental resources but also of the resources given to each child by his/her siblings; and that there is no causal relationship between the number, order, and spacing of siblings and child outcomes, so that any apparent relationships are spurious. In light of these theories, the question arises as to how the sibling variables should be measured. The most important aspect of sibling number is that it is a variable over time. Yet the proper measurement of sibling number has an additional complication: according to all existing theories, the ages of the other siblings are relevant for the outcome for the given child. All of the relevant information is available only when it is possible to construct a matrix in which the rows present the age of the given child and the columns the age grouping of the siblings for whom a count of sibling number will be made. Many such matrices could be developed, some much more elaborate than others. For illustrative purposes, Table 1 presents the matrix of the number of siblings for a child who is the first-born among 5 children, all of whom are spaced exactly 3 years apart and all of whom are financially dependent only up to exact age 21. Table 2 presents the matrix for the last-born child among 5 children with characteristics identical to those in Table 1. It can be inferred from these tables that the oldest child in the family, as compared to the youngest child, probably will suffer from a diminution of parental resources, most likely financial resources, in adolescence. The youngest will suffer from a reduction of parental resources, probably time resources, in infancy and early childhood. Research concerned with the consequences of the number and spacing of children should be based on data sets for which some version of this matrix can be constructed.
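The matrix construction described here is mechanical enough to sketch in code. The five children spaced three years apart and the dependency cutoff at exact age 21 follow the example in the text, while the age bands are invented for illustration.

    # Sibling-number matrix for the first-born of five children spaced exactly
    # three years apart: rows are the index child's age, columns count the
    # siblings falling in each (illustrative) age band, counting only
    # financially dependent siblings (ages 0-20, i.e. up to exact age 21).
    births = [0, 3, 6, 9, 12]            # birth years relative to the first-born
    index_child = 0
    bands = [(0, 5), (6, 11), (12, 17), (18, 20)]

    for age in range(21):
        year = births[index_child] + age
        sib_ages = [year - b for i, b in enumerate(births)
                    if i != index_child and 0 <= year - b < 21]
        row = [sum(lo <= a <= hi for a in sib_ages) for lo, hi in bands]
        print(age, row)

Changing index_child to 4 reproduces the last-born case, whose early rows are populated by older siblings rather than empty.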
NASA Astrophysics Data System (ADS)
Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.
2018-02-01
Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
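For readers unfamiliar with subset simulation, here is a compact, self-contained sketch of the plain version of the algorithm on a toy limit state with standard normal inputs. The limit state g, the failure threshold, the level probability p0, and the proposal scale are all illustrative choices, and the random-set bounding layer of the paper is not reproduced.

    import numpy as np

    rng = np.random.default_rng(0)

    def g(x):                    # toy limit state: "failure" when the sum is large
        return x.sum(axis=-1)

    d, b = 10, 12.0              # dimension and failure threshold: P[g >= b] ~ 7e-5
    n, p0 = 1000, 0.1            # samples per level, conditional level probability

    x = rng.standard_normal((n, d))
    y = g(x)
    prob, level = 1.0, 0
    while True:
        thresh = np.quantile(y, 1 - p0)          # intermediate threshold
        if thresh >= b or level > 20:
            prob *= np.mean(y >= b) if thresh >= b else 0.0
            break
        prob *= p0
        seeds = x[y >= thresh]
        chains = []
        per_seed = int(np.ceil(n / len(seeds)))
        for s in seeds:                          # Metropolis moves within the subset
            cur = s.copy()
            for _ in range(per_seed):
                cand = cur + 0.5 * rng.standard_normal(d)
                # accept by the standard-normal density ratio, keeping the move
                # only if it stays inside the current intermediate subset
                if rng.random() < np.exp(0.5 * (cur @ cur - cand @ cand)) and g(cand) >= thresh:
                    cur = cand
                chains.append(cur.copy())
        x = np.asarray(chains[:n])
        y = g(x)
        level += 1

    print("estimated failure probability:", prob)

The estimate is a product of conditional probabilities p0 per level times the final exceedance fraction, which is what lets the method reach small failure probabilities with modest sample sizes.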
Pretest probability estimation in the evaluation of patients with possible deep vein thrombosis.
Vinson, David R; Patel, Jason P; Irving, Cedric S
2011-07-01
An estimation of pretest probability is integral to the proper interpretation of a negative compression ultrasound in the diagnostic assessment of lower-extremity deep vein thrombosis. We sought to determine the rate, method, and predictors of pretest probability estimation in such patients. This cross-sectional study of outpatients was conducted in a suburban community hospital in 2006. Estimation of pretest probability was done by enzyme-linked immunosorbent assay d-dimer, Wells criteria, and unstructured clinical impression. Using logistic regression analysis, we measured predictors of documented risk assessment. A cohort analysis was undertaken to compare 3-month thromboembolic outcomes between risk groups. Among 524 cases, 289 (55.2%) underwent pretest probability estimation using the following methods: enzyme-linked immunosorbent assay d-dimer (228; 43.5%), clinical impression (106; 20.2%), and Wells criteria (24; 4.6%), with 69 (13.2%) patients undergoing a combination of at least two methods. Patient factors were not predictive of pretest probability estimation, but the specialty of the clinician was predictive; emergency physicians (P < .0001) and specialty clinicians (P = .001) were less likely than primary care clinicians to perform risk assessment. Thromboembolic events within 3 months were experienced by 0 of 52 patients in the explicitly low-risk group, 4 (1.8%) of 219 in the explicitly moderate- to high-risk group, and 1 (0.4%) of 226 in the group that did not undergo explicit risk assessment. Negative ultrasounds in the workup of deep vein thrombosis are commonly interpreted in isolation apart from pretest probability estimations. Risk assessments varied by physician specialties. Opportunities exist for improvement in the diagnostic evaluation of these patients.
Space Station laboratory module power loading analysis
NASA Astrophysics Data System (ADS)
Fu, S. J.
1994-07-01
The electrical power system of Space Station Freedom is an isolated electrical power generation and distribution network designed to meet the demands of a large number of electrical loads. An algorithm is developed to determine the power bus loading status under normal operating conditions to ensure the supply meets demand. The probabilities of power availability for payload operations (experiments) are also derived.
NASA Technical Reports Server (NTRS)
Kirsten, P. W.; Richardson, D. F.; Wilson, C. M.
1983-01-01
Aerodynamic performance and stability and control data obtained from the first five reentries of the Space Shuttle orbiter are given. Flight results are compared to predicted data from Mach 26.4 to Mach 0.4. Differences between flight and predicted data, as well as probable causes for the discrepancies, are given.
ERIC Educational Resources Information Center
Ruckle, L. J.; Belloni, M.; Robinett, R. W.
2012-01-01
The biharmonic oscillator and the asymmetric linear well are two confining power-law-type potentials for which complete bound-state solutions are possible in both classical and quantum mechanics. We examine these problems in detail, beginning with studies of their trajectories in position and momentum space, evaluation of the classical probability…
ERIC Educational Resources Information Center
Patching, Geoffrey R.; Englund, Mats P.; Hellstrom, Ake
2012-01-01
Despite the importance of both response probability and response time for testing models of choice, there is a dearth of chronometric studies examining systematic asymmetries that occur over time- and space-orders in the method of paired comparisons. In this study, systematic asymmetries in discriminating the magnitude of paired visual stimuli are…
NASA Technical Reports Server (NTRS)
1979-01-01
This specification establishes the natural and induced environments to which the power extension package may be exposed during ground operations and space operations with the shuttle system. Space induced environments are applicable at the Orbiter attach point interface location. All probable environments are systematically listed according to each ground and mission phase.
A new method for detecting small and dim targets in starry background
NASA Astrophysics Data System (ADS)
Yao, Rui; Zhang, Yanning; Jiang, Lei
2011-08-01
Detection of small visible optical space targets is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of such targets is very low because of the influence of the imaging device itself. Random noise and background movement also increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we build a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (scale-invariant feature transform) features from the image frames, calculate the transform relationship between them, and use this relationship to compensate for the whole visual field's movement. Thirdly, the influence of stars is removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found using the OTSU method. Finally, we calculate a statistical quantity to judge, for every pixel position in the image, whether a target is present. Theoretical analysis shows the relationship between false alarm probability and detection probability at different SNRs. The experimental results show that this method can detect targets efficiently, even when a target passes through stars.
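Two of these steps, interframe differencing and OTSU segmentation, can be sketched directly with OpenCV; the file names are placeholders, and the SIFT-based motion compensation and the final statistical detection test are assumed to have been done separately.

    import cv2

    # Difference two registered frames so static stars largely cancel, then
    # separate candidate targets from noise with an Otsu threshold.
    prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)   # placeholder inputs
    curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    diff = cv2.absdiff(curr, prev)
    _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    num, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    for i in range(1, num):                                 # label 0 is background
        if stats[i, cv2.CC_STAT_AREA] <= 9:                 # keep only small blobs
            print("candidate target near", centroids[i])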
Global tracking of space debris via CPHD and consensus
NASA Astrophysics Data System (ADS)
Wei, Baishen; Nener, Brett; Liu, Weifeng; Ma, Liang
2017-05-01
Space debris tracking is of great importance for safe operation of spacecraft. This paper presents an algorithm that achieves global tracking of space debris with a multi-sensor network. The sensor network has unknown and possibly time-varying topology. A consensus algorithm is used to effectively counteract the effects of data incest. Gaussian Mixture-Cardinalized Probability Hypothesis Density (GM-CPHD) filtering is used to estimate the state of the space debris. As an example of the method, 45 clusters of sensors are used to achieve global tracking. The performance of the proposed approach is demonstrated by simulation experiments.
Vacuum Decay via Lorentzian Wormholes
NASA Astrophysics Data System (ADS)
Rosales, J. L.
We speculate about the space-time description due to the presence of Lorentzian wormholes (handles in space-time joining two distant regions or other universes) in quantum gravity. The semiclassical rate of production of these Lorentzian wormholes in Reissner-Nordström space-times is calculated as a result of the spontaneous decay of vacuum due to a real tunneling configuration. In the magnetic case it depends only on the value of the field-theoretical fine structure constant. We predict that the quantum probability corresponding to the nucleation of such geodesically complete space-times should actually be negligible in our physical Universe.
Probability of coincidental similarity among the orbits of small bodies - I. Pairing
NASA Astrophysics Data System (ADS)
Jopek, Tadeusz Jan; Bronikowska, Małgorzata
2017-09-01
The probability of coincidental clustering among the orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters or the size of the identified group; it is different for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation; therefore, it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample, we have assessed the probability of random pairing among several orbital populations of different sizes. We have found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. Also, for the user's convenience, we have obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small value of the probability of coincidental similarity between two orbits.
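In the spirit of the simulation-based assessment described here, a toy Monte Carlo estimate of the chance of a coincidental pair is sketched below. The uniform sampling of five orbital elements and the crude Euclidean similarity function are placeholders; real studies draw synthetic orbits from observed distributions and use established D-criteria such as Southworth-Hawkins.

    import numpy as np

    rng = np.random.default_rng(1)

    def D(o1, o2):
        # placeholder similarity metric on scaled orbital elements
        return np.linalg.norm(o1 - o2)

    def has_pair(sample, threshold):
        for i in range(len(sample)):
            for j in range(i + 1, len(sample)):
                if D(sample[i], sample[j]) < threshold:
                    return True
        return False

    N, trials, threshold = 200, 200, 0.05
    hits = sum(has_pair(rng.random((N, 5)), threshold) for _ in range(trials))
    print("P(coincidental pair) ~", hits / trials)

Repeating this for several values of N shows how the false-pairing probability grows with sample size, which is the dependence quantified in the paper's formulae.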
Behavioral connectivity among bighorn sheep suggests potential for disease spread
Borg, Nathan J.; Mitchell, Michael S.; Lukacs, Paul M.; Mack, Curt M.; Waits, Lisette P.; Krausman, Paul R.
2017-01-01
Connectivity is important for population persistence and can reduce the potential for inbreeding depression. Connectivity between populations can also facilitate disease transmission; respiratory diseases are one of the most important factors affecting populations of bighorn sheep (Ovis canadensis). The mechanisms of connectivity in populations of bighorn sheep likely have implications for spread of disease, but the behaviors leading to connectivity between bighorn sheep groups are not well understood. From 2007–2012, we radio-collared and monitored 56 bighorn sheep in the Salmon River canyon in central Idaho. We used cluster analysis to define social groups of bighorn sheep and then estimated connectivity between these groups using a multi-state mark-recapture model. Social groups of bighorn sheep were spatially segregated and linearly distributed along the Salmon River canyon. Monthly probabilities of movement between adjacent male and female groups ranged from 0.08 (±0.004 SE) to 0.76 (±0.068) for males and 0.05 (±0.132) to 0.24 (±0.034) for females. Movements of males were extensive and probabilities of movement were considerably higher during the rut. Probabilities of movement for females were typically smaller than those of males and did not change seasonally. Whereas adjacent groups of bighorn sheep along the Salmon River canyon were well connected, connectivity between groups north and south of the Salmon River was limited. The novel application of a multi-state model to a population of bighorn sheep allowed us to estimate the probability of movement between adjacent social groups and approximate the level of connectivity across the population. Our results suggest high movement rates of males during the rut are the most likely to result in transmission of pathogens among both male and female groups. Potential for disease spread among female groups was smaller but non-trivial. Land managers can plan grazing of domestic sheep for spring and summer months when males are relatively inactive. Removal or quarantine of social groups may reduce probability of disease transmission in populations of bighorn sheep consisting of linearly distributed social groups.
Duranton, Charlotte; Range, Friederike; Virányi, Zsófia
2017-07-01
Dogs are renowned for being skilful at using human-given communicative cues such as pointing. Results are contradictory, however, when it comes to dogs' following human gaze, probably due to methodological discrepancies. Here we investigated whether dogs follow human gaze to one of two food locations better than into distant space even after comparable pre-training. In Experiments 1 and 2, the gazing direction of dogs was recorded in a gaze-following into distant space task and in an object-choice task where no choice was allowed, in order to allow a direct comparison between tasks, varying the ostensive nature of the gazes. We found that dogs only followed repeated ostensive human gaze into distant space, whereas they followed all gaze cues in the object-choice task. Dogs followed human gaze better in the object-choice task than when there was no obvious target to look at. In Experiment 3, dogs were tested in another object-choice task and were allowed to approach a container. Ostensive cues facilitated the dogs' gaze-following as well as their choices: we found that dogs in the ostensive group chose the indicated container at chance level, whereas they avoided this container in the non-ostensive group. We propose that dogs may perceive the object-choice task as a competition over food and may interpret non-ostensive gaze as an intentional cue that indicates the experimenter's interest in the food location she has looked at. Whether ostensive cues simply mitigate the competitive perception of this situation or they alter how dogs interpret communicative gaze needs further investigation. Our findings also show that following gaze with one's gaze and actually choosing one of the two containers in an object-choice task need to be considered as different variables. The present study clarifies a number of questions related to gaze-following in dogs and adds to a growing body of evidence showing that human ostensive cues can strongly modify dog behaviour.
van Lamsweerde, Amanda E; Beck, Melissa R
2015-12-01
In this study, we investigated whether the ability to learn probability information is affected by the type of representation held in visual working memory. Across 4 experiments, participants detected changes to displays of coloured shapes. While participants detected changes in 1 dimension (e.g., colour), a feature from a second, nonchanging dimension (e.g., shape) predicted which object was most likely to change. In Experiments 1 and 3, items could be grouped by similarity in the changing dimension across items (e.g., colours and shapes were repeated in the display), while in Experiments 2 and 4 items could not be grouped by similarity (all features were unique). Probability information from the predictive dimension was learned and used to increase performance, but only when all of the features within a display were unique (Experiments 2 and 4). When it was possible to group by feature similarity in the changing dimension (e.g., 2 blue objects appeared within an array), participants were unable to learn probability information and use it to improve performance (Experiments 1 and 3). The results suggest that probability information can be learned in a dimension that is not explicitly task-relevant, but only when the probability information is represented with the changing dimension in visual working memory.
Mapping probabilities of extreme continental water storage changes from space gravimetry
NASA Astrophysics Data System (ADS)
Kusche, J.; Eicker, A.; Forootan, E.; Springer, A.; Longuevergne, L.
2016-12-01
Using data from the Gravity Recovery and Climate Experiment (GRACE) mission, we derive statistically robust 'hotspot' regions with a high probability of peak anomalous (i.e., relative to the seasonal cycle) water storage (up to a 0.7 m one-in-five-year return level) and flux (up to 0.14 m/month). Analysis of, and comparison with, up to 32 years of ERA-Interim reanalysis fields reveals generally good agreement of these hotspot regions with GRACE results, and that most exceptions are located in the Tropics. However, a simulation experiment reveals that the differences observed by GRACE are statistically significant, and further error analysis suggests that by around the year 2020 it will be possible to detect temporal changes in the frequency of extreme total fluxes (i.e., combined effects of mainly precipitation and floods) for at least 10-20% of the continental area, assuming a continuation of GRACE by its follow-on GRACE-FO. J. Kusche et al. (2016): Mapping probabilities of extreme continental water storage changes from space gravimetry, Geophysical Research Letters, accepted online, doi:10.1002/2016GL069538
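As a minimal illustration of the return-level language used above, the sketch below computes an empirical one-in-five-year return level from yearly maxima of a synthetic anomaly series; the Gumbel-distributed numbers simply stand in for GRACE storage anomalies and carry no physical meaning.

    import numpy as np

    rng = np.random.default_rng(4)
    # 14 years x 12 months of synthetic storage anomalies, in metres
    monthly_anomaly = rng.gumbel(0.1, 0.15, size=(14, 12))
    annual_max = monthly_anomaly.max(axis=1)

    T = 5
    level = np.quantile(annual_max, 1 - 1 / T)   # empirical 1-in-T-year level
    print(f"one-in-{T}-year return level: {level:.2f} m")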
Objective Lightning Probability Forecasts for East-Central Florida Airports
NASA Technical Reports Server (NTRS)
Crawford, Winfred C.
2013-01-01
The forecasters at the National Weather Service in Melbourne, FL, (NWS MLB) identified a need to make more accurate lightning forecasts to help alleviate delays due to thunderstorms in the vicinity of several commercial airports in central Florida at which they are responsible for issuing terminal aerodrome forecasts. Such forecasts would also provide safer ground operations around terminals, and would be of value to Center Weather Service Units serving air traffic controllers in Florida. To improve the forecast, the AMU was tasked to develop an objective lightning probability forecast tool for the airports using data from the National Lightning Detection Network (NLDN). The resulting forecast tool is similar to that developed by the AMU to support space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) for use by the 45th Weather Squadron (45 WS) in previous tasks (Lambert and Wheeler 2005, Lambert 2007). The lightning probability forecasts are valid for the time periods and areas needed by the NWS MLB forecasters in the warm season months, defined in this task as May-September.
Dobramysl, U; Holcman, D
2018-02-15
Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
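A brute-force check of the forward problem is easy to sketch: release Brownian particles from a point source and bin which absorbing window they hit, with the rest of the boundary reflecting. The geometry, the time step, and the reflecting box that truncates the half-plane are all arbitrary choices made for tractability; the paper's actual procedure is designed precisely to avoid this kind of whole-trajectory tracking.

    import numpy as np

    rng = np.random.default_rng(2)
    source = np.array([0.3, 1.0])
    windows = [(-1.1, -0.9), (0.9, 1.1)]     # absorbing intervals on the x-axis
    sqrt_dt = np.sqrt(1e-3)
    n = 200
    counts = [0, 0]

    for _ in range(n):
        p = source.copy()
        absorbed = False
        while not absorbed:
            p += sqrt_dt * rng.standard_normal(2)
            # reflecting box keeps the walk in a bounded region
            if p[0] > 2.0: p[0] = 4.0 - p[0]
            if p[0] < -2.0: p[0] = -4.0 - p[0]
            if p[1] > 2.0: p[1] = 4.0 - p[1]
            if p[1] <= 0.0:
                hit = [k for k, (lo, hi) in enumerate(windows) if lo <= p[0] <= hi]
                if hit:
                    counts[hit[0]] += 1
                    absorbed = True
                else:
                    p[1] = -p[1]             # reflect on the rest of the boundary

    print("estimated splitting probabilities:", [c / n for c in counts])

Moving the source toward one window skews the split toward it, which is the directional information the inverse problem exploits.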
We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact
Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.
2014-01-01
What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073
The statistics of Pearce element diagrams and the Chayes closure problem
NASA Astrophysics Data System (ADS)
Nicholls, J.
1988-05-01
Pearce element ratios are defined as having a constituent in their denominator that is conserved in a system undergoing change. The presence of a conserved element in the denominator simplifies the statistics of such ratios and renders them subject to statistical tests, especially tests of the significance of the correlation coefficient between Pearce element ratios. Pearce element ratio diagrams provide unambiguous tests of petrologic hypotheses because they are based on the stoichiometry of rock-forming minerals. There are three ways to recognize a conserved element: 1. The petrologic behavior of the element can be used to select conserved ones. They are usually the incompatible elements. 2. The ratio of two conserved elements will be constant in a comagmatic suite. 3. An element ratio diagram that is not constructed with a conserved element in the denominator will have a trend with a near-zero intercept. The last two criteria can be tested statistically. The significance of the slope, intercept and correlation coefficient can be tested by estimating the probability of obtaining the observed values from a random population of arrays. This population of arrays must satisfy two criteria: 1. The population must contain at least one array that has the means and variances of the array of analytical data for the rock suite. 2. Arrays with the means and variances of the data must not be so abundant in the population that nearly every array selected at random has the properties of the data. The population of random closed arrays can be obtained from a population of open arrays whose elements are randomly selected from probability distributions. The means and variances of these probability distributions are themselves selected from probability distributions which have means and variances equal to a hypothetical open array that would give the means and variances of the data on closure. This hypothetical open array is called the Chayes array. Alternatively, the population of random closed arrays can be drawn from the compositional space available to rock-forming processes. The minerals comprising the available space can be described with one additive component per mineral phase and a small number of exchange components. This space is called Thompson space. Statistics based on either space lead to the conclusion that Pearce element ratios are statistically valid and that Pearce element diagrams depict the processes that create chemical inhomogeneities in igneous rock suites.
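A minimal Monte Carlo version of this significance test is sketched below: correlate two Pearce-style ratios computed from random closed (constant-sum) arrays to build a null distribution, then compare an observed correlation against it. The means, variances, sample size, and observed value are placeholders, and the crude normal open arrays stand in for the Chayes-array construction.

    import numpy as np

    rng = np.random.default_rng(6)

    def random_closed_array(n):
        # open array of three "elements", then closure to a constant sum of 100
        open_arr = rng.normal([50, 10, 5], [5, 2, 1], size=(n, 3)).clip(min=0.1)
        return 100 * open_arr / open_arr.sum(axis=1, keepdims=True)

    def ratio_corr(arr):
        # correlation between two ratios sharing the conserved denominator z
        x, y, z = arr.T
        return np.corrcoef(x / z, y / z)[0, 1]

    null = np.array([ratio_corr(random_closed_array(30)) for _ in range(2000)])
    r_obs = 0.85                       # illustrative observed correlation
    print("p-value ~", np.mean(np.abs(null) >= abs(r_obs)))

Note that the null correlations are far from zero even for random data, which is exactly the closure-induced spurious correlation the Chayes problem refers to.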
Pérez, Germán M; Salomón, Luis A; Montero-Cabrera, Luis A; de la Vega, José M García; Mascini, Marcello
2016-05-01
A novel heuristic using an iterative select-and-purge strategy is proposed. It combines statistical techniques for sampling and classification by rigid molecular docking through an inverse virtual screening scheme. This approach aims at the de novo discovery of short peptides that may act as docking receptors for small target molecules when there are no data available about known association complexes between them. The algorithm performs an unbiased stochastic exploration of the sample space, acting as a binary classifier when analyzing the entire peptide population. It uses a novel and effective criterion for weighting the likelihood of a given peptide to form an association complex with a particular ligand molecule based on amino acid sequences. The exploratory analysis relies on chemical information about peptide composition, sequence patterns, and association free energies (docking scores) in order to converge to those peptides forming the association complexes with the highest affinities. Statistical estimations support these results, providing an association probability and improving prediction accuracy even in cases where only a fraction of all possible combinations are sampled. The false positive/false negative ratio was also improved with this method. A simple rigid-body docking approach together with the proper information about amino acid sequences was used. The methodology was applied in a retrospective docking study to all 8000 possible tripeptide combinations using the 20 natural amino acids, screened against a training set of 77 different ligands with diverse functional groups. Afterward, all tripeptides were screened against a test set of 82 ligands, also containing different functional groups. Results show that our integrated methodology is capable of finding a representative group of the top-scoring tripeptides. The associated probability of identifying the best receptor, or a group of the top-ranked receptors, is more than double and about 10 times higher, respectively, when compared to classical random sampling methods.
Kelsall, Helen Louise; McKenzie, Dean Philip; Forbes, Andrew Benjamin; Roberts, Minainyo Helen; Urquhart, Donna Michelle; Sim, Malcolm Ross
2014-04-01
Occupational activities such as lifting loads, working in constrained spaces, and training increase the risk of pain-related musculoskeletal disorders (MSDs) in military veterans. Few studies have investigated MSDs and psychological disorders in veterans, and previous studies had limitations. This cross-sectional study compared pain-related MSDs and psychological comorbidity and well-being between 1381 male Australian 1990-1991 Gulf War veterans (veterans) and a military comparison group (n=1377, of whom 39.6% were serving and 32.7% had previously deployed). At a medical assessment in 2000-2002, reported doctor-diagnosed arthritis or rheumatism, back or neck problems, joint problems, and soft tissue disorders were rated by medical practitioners as nonmedical, unlikely, possible, or probable diagnoses. Only probable MSDs were analysed. Psychological disorders in the past 12 months were measured using the Composite International Diagnostic Interview. The Short-Form Health Survey (SF-12) assessed 4-week physical and mental well-being. Almost one-quarter of veterans (24.5%) and the comparison group (22.4%) reported an MSD. Having any or a specific MSD was associated with depression and posttraumatic stress disorder (PTSD), but not alcohol disorders. Physical and mental well-being was poorer in those with an MSD compared to those without, in both study groups (e.g., veterans with any MSD, difference in SF-12 physical component summary scale medians = -10.49; 95% confidence interval -12.40, -8.57), and in those with MSD and psychological comorbidity compared with MSD alone. Comorbidity of any MSD and psychological disorder was more common in veterans, but MSDs were associated with depression, PTSD, and poorer well-being in both groups. Psychological comorbidity needs consideration in MSD management. Longitudinal studies are needed to assess directionality and causality. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
The COLA Collision Avoidance Method
NASA Astrophysics Data System (ADS)
Assmann, K.; Berger, J.; Grothkopp, S.
2009-03-01
In the following we present a collision avoidance method named COLA. The method has been designed to predict collisions of Earth-orbiting spacecraft on arbitrary orbits, including orbit changes, with other space-borne objects. The point in time of a collision and the collision probability are determined. To guarantee effective processing, the COLA method uses a modular design and is composed of several components which are either developed within this work or derived from existing algorithms: a filtering module, the close approach determination, the collision detection, and the collision probability calculation. A software tool which implements the COLA method has been verified using various test cases built from sample missions. This software has been implemented in the C++ programming language and serves as a universal collision detection tool at LSE Space Engineering & Operations AG.
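The abstract does not detail COLA's probability algorithm; the sketch below shows one common formulation of the collision probability calculation, integrating the relative-position Gaussian over the combined hard-body circle in the 2-D encounter plane. The miss distance, covariance, and radius values are illustrative.

```python
import numpy as np

def collision_probability(miss_xy, cov, hard_body_radius, n=201):
    """2-D encounter-plane collision probability: integrate the relative-position
    Gaussian (mean = miss vector, covariance = combined position covariance)
    over the combined hard-body circle."""
    inv = np.linalg.inv(cov)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    xs = np.linspace(-hard_body_radius, hard_body_radius, n)
    dx = xs[1] - xs[0]
    X, Y = np.meshgrid(xs, xs)
    inside = X**2 + Y**2 <= hard_body_radius**2       # circular hard-body region
    d = np.stack([X - miss_xy[0], Y - miss_xy[1]], axis=-1)
    pdf = norm * np.exp(-0.5 * np.einsum("...i,ij,...j", d, inv, d))
    return float((pdf * inside).sum() * dx * dx)      # Riemann-sum integration

# Example: 200 m miss distance, 50 m combined radius, anisotropic covariance (m^2).
p = collision_probability((200.0, 0.0), np.diag([100.0**2, 40.0**2]), 50.0)
```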
NASA Astrophysics Data System (ADS)
Tipler, F. J.
1982-10-01
An assessment is presented of the probability of the existence of intelligent extraterrestrial life in view of biological evolutionary constraints, in order to furnish some perspective for the hopes and claims of search for extraterrestrial intelligence (SETI) enthusiasts. Attention is given to a hypothetical extraterrestrial civilization's exploration/colonization of interstellar space by means of von Neumann machine-like, endlessly self-replicating space probes which would eventually reach the planetary systems of all stars in the Galaxy. Upon reaching a hospitable planet, these probes would be able to replicate the biology of their creator species. It is suggested that the fundamental technological feasibility of such schemes, and their geometrically progressive coverage of the Galaxy, would make actual colonization of the Earth by extraterrestrials so probable as to destroy the hopes of SETI backers for occasional contact.
Constraints on the pre-impact orbits of Solar system giant impactors
NASA Astrophysics Data System (ADS)
Jackson, Alan P.; Gabriel, Travis S. J.; Asphaug, Erik I.
2018-03-01
We provide a fast method for computing constraints on impactor pre-impact orbits, applying this to the late giant impacts in the Solar system. These constraints can be used to make quick, broad comparisons of different collision scenarios, identifying some immediately as low-probability events, and narrowing the parameter space in which to target follow-up studies with expensive N-body simulations. We benchmark our parameter space predictions, finding good agreement with existing N-body studies for the Moon. We suggest that high-velocity impact scenarios in the inner Solar system, including all currently proposed single impact scenarios for the formation of Mercury, should be disfavoured. This leaves a multiple hit-and-run scenario as the most probable currently proposed for the formation of Mercury.
NASA Technical Reports Server (NTRS)
Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael
2012-01-01
There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading: overly conservative or optimistic. 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) Aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failure (MTBF); (b) Epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.
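A minimal sketch of how the two uncertainty categories might be combined in a spares assessment: an epistemic distribution over the true MTBF (here lognormal, an assumption) is mixed with aleatory Poisson failure counts to give the probability that a given spares level suffices. This is not the PACT method itself, whose details the abstract does not give.

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_spares_sufficient(mtbf_median, error_factor, mission_hours, spares, n=100_000):
    """P(failure demand <= spares), mixing epistemic uncertainty about the true
    MTBF with aleatory Poisson variability in the number of failures."""
    # Epistemic draw: lognormal MTBF with error factor = 95th percentile / median
    # (a common, assumed parameterization; z(0.95) = 1.645).
    sigma = np.log(error_factor) / 1.645
    mtbf = rng.lognormal(np.log(mtbf_median), sigma, size=n)
    # Aleatory draw: Poisson failure count over the mission, given that MTBF.
    failures = rng.poisson(mission_hours / mtbf)
    return float(np.mean(failures <= spares))

# Hypothetical ORU: median MTBF 50,000 h, error factor 3, one year on orbit, 2 spares.
p = prob_spares_sufficient(50_000.0, 3.0, 8760.0, spares=2)
```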
Multiscale/Multifunctional Probabilistic Composite Fatigue
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail along the highest-probability path, the engine two-stage rotor will fail by fracture at the rim, and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.
Contextual Awareness for Robust Robot Autonomy
2013-12-30
are some obstacles that are either undetectable or only partially detectable to the sensors. Transparent obstacles are particularly challenging for... the probability that a certain object will be found in a certain space (e.g., the probability that coffee is found in the kitchen). While searching... with which it finds such objects, to find abnormal situations (e.g., if someone is actively hiding all the coffee from the CoBots). One of the most
Phase transitions in Nowak Sznajd opinion dynamics
NASA Astrophysics Data System (ADS)
Wołoszyn, Maciej; Stauffer, Dietrich; Kułakowski, Krzysztof
2007-05-01
The Nowak modification of the Sznajd opinion dynamics model on the square lattice assumes that with probability β opinions flip due to mass-media advertising, from down to up and vice versa. In addition, with probability α the Sznajd rule applies: a neighbour pair agreeing in its two opinions convinces all six of its neighbours of that opinion. Our Monte Carlo simulations and mean-field theory find sharp phase transitions in the parameter space.
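A minimal Monte Carlo sketch of the dynamics as described: with probability β a site's opinion is flipped by advertising, and with probability α an agreeing pair convinces its six neighbours. The pair orientation (horizontal), update order, and parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def nowak_sznajd_step(s, alpha, beta):
    """One update of an L x L lattice of opinions (+/-1), periodic boundaries."""
    L = s.shape[0]
    i, j = rng.integers(L, size=2)
    # Mass-media advertising: flip the chosen site's opinion with probability beta.
    if rng.random() < beta:
        s[i, j] = -s[i, j]
    # Sznajd rule: a horizontally agreeing pair convinces its six neighbours.
    jr = (j + 1) % L
    if rng.random() < alpha and s[i, j] == s[i, jr]:
        for ni, nj in [(i - 1, j), (i + 1, j), (i, j - 1),
                       (i - 1, jr), (i + 1, jr), (i, jr + 1)]:
            s[ni % L, nj % L] = s[i, j]
    return s

s = rng.choice([-1, 1], size=(50, 50))
for _ in range(200_000):
    s = nowak_sznajd_step(s, alpha=0.9, beta=0.01)
magnetization = s.mean()  # order parameter tracked across the (alpha, beta) plane
```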
The photon-plasmon transitions and diagnostics of the space plasma turbulence
NASA Astrophysics Data System (ADS)
Glushkov, Alexander; Glushkov, Alexander; Khetselius, Olga
We present a new approach to treating space plasma turbulence, based on using photon-plasmon transitions to obtain diagnostic data. The theoretical definition of the characteristics of these transitions is carried out within a consistent theoretical approach based on the Gell-Mann and Low formalism (the energy approach in QED theory). We apply it to the calculation of such transitions in positronium (Ps) with emission of a photon and a Langmuir quantum. It is well known that the hfs states of positronium differ in spin S, lifetime t, and mode of annihilation. As a rule, the probabilities of the cascade radiation transitions exceed the annihilation probability. The ortho-Ps atom has a metastable state 2^3S_1, and the probability of the two-photon radiation transition from this state into the 1^3S_1 state (1.8•10^-3 s^-1) is significantly less than the probability of three-photon annihilation directly from the 2^3S_1 level (8.9•10^5 s^-1); i.e., it is usually supposed that ortho-Ps annihilates from the 2^3S_1 state. A different situation may take place in plasma, where a competing process of destruction of the metastable level arises: the photon-plasmon transition 2^3S_1-1^3S_1 with emission of a photon and a Langmuir quantum. In this paper we carry out the calculation of the probability of the Ps photon-plasmon transition and propose to use it for diagnostics of space plasma (dusty plasma, etc.). A standard S-matrix calculation, using an expression for the tensor of dielectric permeability of the isotropic space plasma and dispersion relationships for transverse and Langmuir waves [3], yields the corresponding probability P(ph-pl). The numerical value of P(ph-pl) is 5.2•10^6•UL s^-1, where UL is the density of the Langmuir wave energy. Our value correlates with the estimate available in the literature [3]: P(ph-pl) = 6•10^6•UL s^-1. Comparison of the obtained probability with the lifetime t(3) gives the condition for predominance of the photon-plasmon transition over three-photon annihilation. It is demonstrated how the considered transition may control the population of the 2^3S_1 level and the search for the long-lived Ps state, which is further used for diagnostics of space plasma turbulence. Finally, the experimental realization of the indicated method is discussed. References: 1. L.N. Ivanov, V.S. Letokhov, Com. Mod. Phys. D: At. Mol. Phys. 4, 169 (1985); A.V. Glushkov, L.N. Ivanov, Phys. Lett. A 170, 36 (1992); Preprint of Institute for Spectroscopy of RAS, N AS-2, Troitsk (1992); L.N. Ivanov, E.P. Ivanova, L.V. Knight, Phys. Rev. A 48, 4365 (1993); A.V. Glushkov, E.P. Ivanova, J. Quant. Spectr. Rad. Tr. (US) 36, 127 (1986); 2. A.V. Glushkov, S.V. Malin et al., Bound Vol. Paris-Meudon Observ., 1995; J. Techn. Phys. 38, 211, 219 (1997); In: New projects and new lines of research in nuclear physics, Eds. G. Fazio and F. Hanappe, Singapore: World Scientific, 2003, pp. 242-250; Int. J. Quant. Chem. 99, 889 (2004); 104, 512 (2005). 3. V.I. Gol'dansky, Physical Chemistry of Positron and Positronium, N.Y., 1976; S.A. Kaplan, V.N. Tsytovich, Plasma Astrophysics, Moscow, 1987; V.I. Gol'dansky, V.S. Letokhov, JETP 67, 533 (1974).
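Using the rates quoted above, the dominance condition reduces to a one-line computation (UL in the units used in the abstract):

```python
# Condition for the photon-plasmon channel to dominate three-photon annihilation
# of ortho-Ps from the 2^3S_1 level, using the rates quoted in the abstract.
P_3GAMMA = 8.9e5        # three-photon annihilation rate from 2^3S_1, s^-1
P_PH_PL_PER_UL = 5.2e6  # photon-plasmon rate per unit Langmuir energy density, s^-1

u_l_threshold = P_3GAMMA / P_PH_PL_PER_UL  # ~0.17 in the abstract's units
print(f"photon-plasmon transition dominates for UL > {u_l_threshold:.2f}")
```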
Vatanparast, Rodina; Lantz, Sarah; Ward, Kristine; Crilley, Pamela Ann; Styler, Michael
2012-11-01
The initial diagnosis of heparin-induced thrombocytopenia (HIT) is made on clinical grounds because the assays with the highest sensitivity (e.g., heparin-platelet factor 4 antibody enzyme-linked immunosorbent assay [ELISA]) and specificity (e.g., serotonin release assay) may not be readily available. The 4Ts pretest scoring system was developed and validated by Lo et al in the Journal of Thrombosis and Haemostasis in 2006. The scoring system considers the degree and timing of thrombocytopenia, thrombosis, and the possibility of other etiologies. Based on the 4T score, patients can be categorized as having a high, intermediate, or low probability of having HIT. We conducted a retrospective study of 100 consecutive patients who were tested for HIT during their hospitalization at Hahnemann University Hospital (Philadelphia, PA) in 2009. Of the 100 patients analyzed, 72, 23, and 5 patients had 4T pretest probability scores of low, intermediate, and high, respectively. A positive HIT ELISA (optical density > 1.0 unit) was detected in 0 of 72 patients (0%) in the low probability group, in 5 of 23 patients (22%) in the intermediate probability group, and in 2 of 5 patients (40%) in the high probability group. The average turnaround time for the HIT ELISA was 4 to 5 days. Fourteen (19%) of the 72 patients with a low pretest probability of HIT were treated with a direct thrombin inhibitor. Ten (71%) of those 14 patients had a major complication of bleeding requiring blood transfusion support. In this retrospective study, a low 4T score showed 100% correlation with a negative HIT antibody assay. We recommend incorporating the 4T scoring system into institutional core measures when assessing a patient with suspected HIT, selecting only patients with intermediate to high probability for therapeutic intervention, which may translate into reduced morbidity and lower health care costs.
A model of hygiene practices and consumption patterns in the consumer phase.
Christensen, Bjarke B; Rosenquist, Hanne; Sommer, Helle M; Nielsen, Niels L; Fagt, Sisse; Andersen, Niels L; Nørrung, Birgit
2005-02-01
A mathematical model is presented which addresses individual hygiene practices during food preparation and consumption patterns in private homes. Further, the model links food preparers and consumers based on their relationship to household types. For different age and gender groups, the model estimates (i) the probability of ingesting a meal where precautions have not been taken to avoid the transfer of microorganisms from raw food to the final meal (a risk meal), exemplified by the event that the cutting board was not washed during food preparation, and (ii) the probability of ingesting a risk meal in a private home where chicken was the prepared food item (a chicken risk meal). Chicken was included in the model because chickens are believed to be the major source of human exposure to the foodborne pathogen Campylobacter. Monte Carlo simulations showed that the probability of ingesting a risk meal was highest for young males (aged 18-29 years) and lowest for the elderly above 60 years of age. Children aged 0-4 years had a higher probability of ingesting a risk meal than children aged 5-17 years. This difference between age and gender groups was ascribed to the variations in the hygiene levels of food preparers. By including the probability of ingesting a chicken meal at home, simulations revealed that all age groups, except the group above 60 years of age, had approximately the same probability of ingesting a chicken risk meal, with the probability for females slightly higher than that for males. The simulated results show that the probability of ingesting a chicken risk meal at home depends not only on the hygiene practices of the persons preparing the food, but also on the consumption patterns of consumers and the relationship between people preparing and ingesting food. This finding supports the need to include information on consumer behavior and preparation hygiene in the consumer phase of exposure assessments.
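A minimal sketch of the two estimated quantities, assuming hypothetical per-group hygiene-failure probabilities and a hypothetical chicken-meal frequency; the paper derives such inputs from survey data and links preparers to consumers via household types, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative per-group inputs (placeholders, not the paper's estimates):
# probability the preparer fails to wash the cutting board, by consumer group.
P_UNWASHED = {"male 18-29": 0.25, "female 18-29": 0.18, "60+": 0.08}
P_CHICKEN_MEAL = 0.15  # assumed share of home meals in which chicken is prepared

def simulate(group, n_meals=1_000_000):
    unwashed = rng.random(n_meals) < P_UNWASHED[group]  # hygiene failure occurs
    chicken = rng.random(n_meals) < P_CHICKEN_MEAL      # meal is a chicken meal
    p_risk_meal = unwashed.mean()                       # (i) risk meal
    p_chicken_risk_meal = (unwashed & chicken).mean()   # (ii) chicken risk meal
    return p_risk_meal, p_chicken_risk_meal

for group in P_UNWASHED:
    print(group, simulate(group))
```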
Man-made space debris - Does it restrict free access to space
NASA Technical Reports Server (NTRS)
Wolfe, M.; Chobotov, V.; Kessler, D.; Reynolds, R.
1981-01-01
Consideration is given to the hazards posed by existing and future man-made space debris to spacecraft operations. The components of the hazard are identified as those fragments resulting from spacecraft explosions and spent stages which can be tracked, those fragments which are too small to be tracked at their present distances, and future debris which, if present trends in spacecraft design and operation continue, may lead to an unacceptably high probability of collision with operational spacecraft within a decade. It is argued that a coordinated effort must be undertaken by all space users to evaluate means of space debris control in order to allow for the future unrestricted use of near-Earth space. A plan for immediate action to forestall the space debris problem through activities in the areas of education, debris monitoring and collection technology, space vehicle design, space operational procedures and practices, and space policies and treaties is proposed.
Stationary properties of maximum-entropy random walks.
Dixit, Purushottam D
2015-10-01
Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
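For the unconstrained special case on an undirected graph, the maximum path entropy transition probabilities and stationary distribution have a well-known closed form in terms of the leading eigenpair of the adjacency matrix; the sketch below computes them for a small path graph (the paper's state- and path-dependent constraints are not included here).

```python
import numpy as np

def merw(adjacency):
    """Maximum-entropy random walk (unconstrained case):
    P_ij = A_ij * phi_j / (lam * phi_i), stationary pi_i proportional to phi_i**2,
    where (lam, phi) is the leading (Perron) eigenpair of the adjacency matrix."""
    vals, vecs = np.linalg.eigh(adjacency)
    lam, phi = vals[-1], np.abs(vecs[:, -1])  # leading eigenpair, made positive
    P = adjacency * phi[None, :] / (lam * phi[:, None])
    pi = phi**2 / np.sum(phi**2)
    return P, pi

# 5-node path graph: the MERW stationary distribution concentrates toward the
# middle nodes, unlike the degree-proportional ordinary random walk, illustrating
# how path multiplicity reshapes state probabilities.
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
P, pi = merw(A)
assert np.allclose(P.sum(axis=1), 1.0)  # rows are valid transition probabilities
```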
Probability of detection of defects in coatings with electronic shearography
NASA Astrophysics Data System (ADS)
Maddux, Gary A.; Horton, Charles M.; Lansing, Matthew D.; Gnacek, William J.; Newton, Patrick L.
1994-07-01
The goal of this research was to utilize statistical methods to evaluate the probability of detection (POD) of defects in coatings using electronic shearography. The coating system utilized in the POD studies was to be the paint system currently utilized on the external casings of the NASA Space Transportation System (STS) Revised Solid Rocket Motor (RSRM) boosters. The population of samples was to be large enough to determine the minimum defect size for a 90 percent probability of detection at 95 percent confidence (90/95 POD) on these coatings. Also, the best methods to excite coatings on aerospace components to induce deformations for measurement by electronic shearography were to be determined.
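For reference, the sample size behind a 90/95 POD demonstration with zero missed detections follows from a one-line binomial computation (the classic "29 of 29" criterion; the report may use a different POD model, such as hit/miss regression):

```python
import math

# Smallest n such that, if true POD were only 90%, detecting all n defects
# would still have probability <= 5%: 0.90**n <= 0.05.
pod, confidence = 0.90, 0.95
n = math.ceil(math.log(1.0 - confidence) / math.log(pod))
print(n)  # 29: all 29 defects of a given size must be detected
```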
NASA Technical Reports Server (NTRS)
Deiwert, G. S.; Yoshikawa, K. K.
1975-01-01
A semiclassical model proposed by Pearson and Hansen (1974) for computing collision-induced transition probabilities in diatomic molecules is tested by the direct-simulation Monte Carlo method. Specifically, this model is described by point centers of repulsion for collision dynamics, and the resulting classical trajectories are used in conjunction with the Schroedinger equation for a rigid-rotator harmonic oscillator to compute the rotational energy transition probabilities necessary to evaluate the rotation-translation exchange phenomena. It is assumed that a single, average energy spacing exists between the initial state and possible final states for a given collision.
NASA Astrophysics Data System (ADS)
Ballı, C.; Acar, M.; Caglar, F.; Tan, E.; Onol, B.; Karan, H.; Unal, Y. S.
2012-04-01
The main focus of this study is to compare the 24-hour WRF model and HYSPLIT performances against observations in terms of concentrations using the FMS technique, and to determine the probabilities of the spread of the modeled concentrations. In this study, the 0.25-degree grid size ECMWF operational model data set is used to generate 24-hour forecasts of atmospheric fields with the WRF model. Each daily forecast is started at both 00 UTC and 12 UTC for the months of January and July of 2009. The model area of interest is downscaled by a ratio of 3, from 9 km resolution down to 1 km resolution. 45 vertical levels were structured for the 3 nested domains, on which Istanbul is centered. After the WRF model was run for these four sets of simulations, the dispersion of particles is analyzed using the HYSPLIT model. 30,000 particles, with an initial delivery of 5,000 particles to the atmosphere, are released at 10 m over Istanbul. The concentration analyses are performed for the nested domains in the order of the mother domain only, domains 1 and 2, and all three nested domains, named WRFD1, WRFD12, and WRFD123, respectively. The Figure of Merit in Space (FMS) method is applied to the HYSPLIT results obtained from the WRF model in order to perform the spatial analysis and compare them to the concentrations calculated from ECMWF Interim data. FMS can be regarded as the statistical coefficient of this spatial analysis, so high FMS values indicate high agreement between observations and model results. Since FMS is the ratio of the intersection of the areas to their union, it is not possible to deduce whether the model over- or under-predicts, but it is a good indicator of the spread of the concentration in space. In this study, we have used percentage values of FMS for fixed times (January and July 2009) and a fixed concentration level. The FMS analysis is applied to the three domain structures defined above, WRFD1, WRFD12, and WRFD123, with FMS values calculated for a threshold of 1 pg m^-3. The FMS results verify that the WRF model wind velocity results are in good agreement with ECMWF ERA-Interim data at the 10 m level. The FMS values show that the probabilities of 13 days exceed 50% for the July average, whereas in January only 4 days exceed 50%; this indicates that July model forecasts may give better results than January forecasts. Moreover, we have calculated the probabilities of the concentration spread for both July and January and detected different spreads between the 12 UTC and 00 UTC initializations, with the 12 UTC results showing higher probabilities than 00 UTC. According to the January 00 UTC and 12 UTC model results, the dominant direction of particle spread is southwesterly. Consistently, the higher-probability concentrations are seen in the Black Sea region extending to the northern neighbors of Turkey, with a probability of approximately 20%. We also observed a secondary dominant particle dispersion in the northeast direction, with a probability of 25%, extending to the northern Aegean Sea and the coast of Greece. Since Istanbul is the hypothetical origin of the particle release, the highest probability of concentrations is seen at this location. In July, for 00 UTC, the highest-probability spread is toward the south. Because the predominant wind direction in summer is northeasterly in the northwestern part of Turkey, the north Aegean and Marmara Seas are affected by particles with a 40% chance. Although further south this probability decreases to 25-30%, Central and Western Anatolia and the border of Greece are still at higher risk. As a result, our analyses indicate that if there is an explosion in the Istanbul area, the high-risk regions depend on the season. If it occurs in winter, the transported hazardous particles might affect the northern part of Turkey and its neighbors, while in summer the southern and western parts of Turkey are under threat. Key words: Turkey, FMS and probability analyses, concentration analysis, WRF, HYSPLIT models.
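A minimal sketch of the FMS statistic as described, the intersection over union of the model and reference concentration footprints above a threshold, here applied to synthetic fields; the grid, fields, and threshold handling are illustrative.

```python
import numpy as np

def figure_of_merit_in_space(model_conc, ref_conc, threshold=1.0):
    """FMS (%) for a fixed time and concentration threshold: the area where both
    model and reference exceed the threshold, divided by the area where either
    does (intersection over union of the two footprints)."""
    model_area = model_conc >= threshold  # e.g. threshold = 1 pg m^-3
    ref_area = ref_conc >= threshold
    union = np.logical_or(model_area, ref_area).sum()
    if union == 0:
        return 0.0
    inter = np.logical_and(model_area, ref_area).sum()
    return 100.0 * inter / union

# Illustrative 2-D concentration fields on a common grid (pg m^-3):
rng = np.random.default_rng(4)
wrf_hysplit = rng.lognormal(0.0, 1.0, size=(100, 100))
ecmwf_ref = rng.lognormal(0.0, 1.0, size=(100, 100))
fms = figure_of_merit_in_space(wrf_hysplit, ecmwf_ref, threshold=1.0)
```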
NASA Astrophysics Data System (ADS)
Schmincke, Hans-Ulrich; Rausch, Juanita; Kutterolf, Steffen; Freundt, Armin
2010-10-01
We analyzed bare human footprints in Holocene tuff preserved in two pits in the Acahualinca barrio in the northern outskirts of Managua (Nicaragua). Lithology, volcanology, and age of the deposits are discussed in a companion paper (Schmincke et al. Bull Volcanol doi:
Assessment of zero gravity effects on space worker health and safety
NASA Technical Reports Server (NTRS)
1980-01-01
One objective of the study is to assess the effects of all currently known deviations from normal of medical, physiological, and biochemical parameters which appear to be due to the zero gravity (zero-g) environment and to the acceleration and deceleration to be experienced, as outlined in the reference Solar Power Satellite (SPS) design, by space workers. Study results include identification of possible health or safety effects on space workers, either immediate or delayed, due to the zero gravity environment and acceleration and deceleration; estimation of the probability that an individual will be adversely affected; description of the possible consequences to work efficiency in persons adversely affected; and description of the possible/probable consequences to the immediate and future health of individuals exposed to this environment. A research plan which addresses the uncertainties in current knowledge regarding the health and safety hazards to exposed SPS space workers is presented. Although most adverse effects experienced during space flight soon disappeared upon return to the Earth's environment, there remains a definite concern for the long-term effects on SPS space workers who might spend as much as half their time in space during a possible five-year career period. The proposed 90-day-up/90-day-down cycle, coupled with the fact that most of the effects of weightlessness may persist throughout the flight, along with the realization that recovery may occupy much of the terrestrial stay, may keep the SPS workers in a deviant physical condition or state of flux for 60 to 100% of their five-year career.
Trap configuration and spacing influences parameter estimates in spatial capture-recapture models
Sun, Catherine C.; Fuller, Angela K.; Royle, J. Andrew
2014-01-01
An increasing number of studies employ spatial capture-recapture models to estimate population size, but there has been limited research on how different spatial sampling designs and trap configurations influence parameter estimators. Spatial capture-recapture models provide an advantage over non-spatial models by explicitly accounting for heterogeneous detection probabilities among individuals that arise due to the spatial organization of individuals relative to sampling devices. We simulated black bear (Ursus americanus) populations and spatial capture-recapture data to evaluate the influence of trap configuration and trap spacing on estimates of population size and a spatial scale parameter, sigma, that relates to home range size. We varied detection probability and home range size, and considered three trap configurations common to large-mammal mark-recapture studies: regular spacing, clustered, and a temporal sequence of different cluster configurations (i.e., trap relocation). We explored trap spacing and number of traps per cluster by varying the number of traps. The clustered arrangement performed well when detection rates were low, and provides for easier field implementation than the sequential trap arrangement. However, performance differences between trap configurations diminished as home range size increased. Our simulations suggest it is important to consider trap spacing relative to home range sizes, with traps ideally spaced no more than twice the spatial scale parameter. While spatial capture-recapture models can accommodate different sampling designs and still estimate parameters with accuracy and precision, our simulations demonstrate that aspects of sampling design, namely trap configuration and spacing, must consider study area size, ranges of individual movement, and home range sizes in the study population.
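A minimal sketch of the kind of simulation described, generating spatial capture-recapture data from a half-normal detection function on a regular trap grid; all parameter values are illustrative rather than those of the black bear simulations.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_scr(n_animals=60, extent=20.0, trap_spacing=2.0,
                 g0=0.2, sigma=1.5, n_occasions=5):
    """Simulate spatial capture-recapture data with a half-normal detection
    function, p = g0 * exp(-d^2 / (2 sigma^2)), a standard SCR form. Per the
    simulation findings, trap_spacing should ideally stay below ~2 * sigma."""
    centers = rng.uniform(0.0, extent, size=(n_animals, 2))        # activity centers
    g = np.arange(trap_spacing, extent, trap_spacing)
    traps = np.array([(x, y) for x in g for y in g])               # regular grid
    d2 = ((centers[:, None, :] - traps[None, :, :]) ** 2).sum(-1)  # squared distances
    p = g0 * np.exp(-d2 / (2.0 * sigma**2))                        # detection prob.
    captures = rng.random((n_occasions, n_animals, traps.shape[0])) < p
    return captures  # occasions x animals x traps capture history

hist = simulate_scr()
print(hist.any(axis=(0, 2)).sum(), "of 60 animals detected at least once")
```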
Space Radiation Cancer Risks and Uncertainties for Different Mission Time Periods
NASA Technical Reports Server (NTRS)
Kim,Myung-Hee Y.; Cucinotta, Francis A.
2012-01-01
Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than several hundred MeV), and galactic cosmic rays (GCR), which include high-energy protons and high charge and energy (HZE) nuclei. For long-duration missions, space radiation presents significant health risks, including cancer mortality. Probabilistic risk assessment (PRA) is essential for radiation protection of crews on long-term space missions outside the protection of the Earth's magnetic field and for optimization of mission planning and costs. For the assessment of organ dosimetric quantities and cancer risks, the particle spectra at each critical body organ must be characterized. In implementing a PRA approach, a statistical model of SPE fluence was developed, because the individual SPE occurrences themselves are random in nature while the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important, especially at high energies, for assessing the cancer risk associated with energetic particles in large events. An overall cumulative probability of a GCR environment for a specified mission period was estimated for the temporal characterization of the GCR environment, represented by the deceleration potential (theta). Finally, this probabilistic approach to space radiation cancer risk was coupled with a model of the radiobiological factors and uncertainties in projecting cancer risks. Probabilities of fatal cancer risk and 95% confidence intervals will be reported for various periods of space missions.
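A minimal sketch of the aleatory part of such an SPE model: Poisson occurrences whose monthly rate depends on the phase within the solar activity cycle. The active/quiet rates and cycle timing below are assumptions for illustration, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_spe_counts(mission_months, start_phase_months, n_trials=10_000):
    """Aleatory SPE model: a Poisson process whose monthly rate depends on the
    phase within a nominal 11-year (132-month) solar cycle."""
    cycle = 132
    months = (start_phase_months + np.arange(mission_months)) % cycle
    # Assumed rates: more frequent large SPEs during the active phase of the cycle.
    rate = np.where((months > 24) & (months < 96), 0.5, 0.05)  # events per month
    counts = rng.poisson(rate, size=(n_trials, mission_months)).sum(axis=1)
    return counts

counts = simulate_spe_counts(mission_months=30, start_phase_months=36)
print("P(at least one large SPE) ~", np.mean(counts >= 1))
```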