Probabilistic Rock Slope Engineering.
1984-06-01
PROBABILISTIC ROCK SLOPE ENGINEERING, by Stanley M. Miller, Geotechnical Engineer, 509 E. Calle Avenue, Tucson, Arizona 85705. U.S. Army Corps of Engineers report (Work Unit 31755). Cites personal communications with J. P. Savely, Inspiration Consolidated Copper Co., Inspiration, Ariz. (1980), and with R. D. Call, Pincock, Allen, and
A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography
2010-04-01
Distribution unlimited. A global probabilistic fiber tracking approach, inspired by the voting procedure provided by the Hough transform, is introduced, with criteria for aligning curves and particularly tracts.
Quantum-like Modeling of Cognition
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2015-09-01
This paper begins with a historical review of the mutual influence of physics and psychology, from Freud's invention of psychic energy, inspired by Boltzmann's thermodynamics, to the enrichment quantum physics gained from psychology through the notion of complementarity (the invention of Niels Bohr, who was inspired by William James). We also consider the resonance of the correspondence between Wolfgang Pauli and Carl Jung in both physics and psychology. Then we turn to the problem of developing mathematical models for laws of thought, starting with Boolean logic and progressing towards the foundations of classical probability theory. Interestingly, the laws of classical logic and probability are routinely violated not only by quantum statistical phenomena but by cognitive phenomena as well. This is yet another common feature between quantum physics and psychology. In particular, cognitive data can exhibit a kind of probabilistic interference effect. This similarity with quantum physics convinced a multi-disciplinary group of scientists (physicists, psychologists, economists, sociologists) to apply the mathematical apparatus of quantum mechanics to the modeling of cognition. We illustrate this activity by considering a few concrete phenomena: the order and disjunction effects, recognition of ambiguous figures, and categorization-decision making. In Appendix 1 we briefly present the essentials of the theory of contextual probability and a method of representing contextual probabilities by complex probability amplitudes (solution of the ``inverse Born's problem'') based on a quantum-like representation algorithm (QLRA).
Magnetic Tunnel Junction Mimics Stochastic Cortical Spiking Neurons
NASA Astrophysics Data System (ADS)
Sengupta, Abhronil; Panda, Priyadarshini; Wijesinghe, Parami; Kim, Yusung; Roy, Kaushik
2016-07-01
Brain-inspired computing architectures attempt to mimic the computations performed by the neurons and synapses in the human brain in order to achieve its efficiency in learning and cognitive tasks. In this work, we demonstrate the mapping of the probabilistic spiking nature of pyramidal neurons in the cortex to the stochastic switching behavior of a Magnetic Tunnel Junction (MTJ) in the presence of thermal noise. We present results to illustrate the efficiency of neuromorphic systems based on such probabilistic neurons for pattern recognition tasks in the presence of lateral inhibition and homeostasis. Such stochastic MTJ neurons can also potentially provide a direct mapping to the probabilistic computing elements in Belief Networks for performing regenerative tasks.
NASA Astrophysics Data System (ADS)
Serb, Alexander; Bill, Johannes; Khiat, Ali; Berdan, Radu; Legenstein, Robert; Prodromakis, Themis
2016-09-01
In an increasingly data-rich world, the need for computing systems that can not only process but ideally also interpret big data is becoming ever more pressing. Brain-inspired concepts have shown great promise towards addressing this need. Here we demonstrate unsupervised learning in a probabilistic neural network that utilizes metal-oxide memristive devices as multi-state synapses. Our approach can be exploited for processing unlabelled data and can adapt to time-varying clusters that underlie incoming data by supporting the capability of reversible unsupervised learning. The potential of this work is showcased through the demonstration of successful learning in the presence of corrupted input data and probabilistic neurons, thus paving the way towards robust big-data processors.
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
NASA Astrophysics Data System (ADS)
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
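The factor-based representation described in this abstract can be sketched in a few lines of Python; the rain/flood variables, probability values, and function names below are illustrative assumptions for a toy example, not taken from the presentation:

```python
# Toy factor graph: joint(rain, flood) = P(rain) * P(flood | rain).
# Multiplying local factors yields the joint distribution; conditioning
# on an observation and renormalizing yields the posterior.
p_rain = {True: 0.3, False: 0.7}                      # prior factor
p_flood_given_rain = {True: {True: 0.6, False: 0.4},  # conditional factor
                      False: {True: 0.1, False: 0.9}}

def joint(rain, flood):
    # Product of the local factors for one assignment of all variables.
    return p_rain[rain] * p_flood_given_rain[rain][flood]

# Posterior P(rain | flood=True) by exact enumeration.
evidence = {r: joint(r, True) for r in (True, False)}
z = sum(evidence.values())
posterior = {r: v / z for r, v in evidence.items()}
print(posterior[True])  # → 0.72
```

The same assimilation-of-data pattern (multiply factors, renormalize) scales to larger graphs, where the algorithms mentioned in the abstract exploit the graph structure to avoid full enumeration.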
Global Optimization of Interplanetary Trajectories in the Presence of Realistic Mission Constraints
NASA Technical Reports Server (NTRS)
Hinckley, David; Englander, Jacob; Hitt, Darren
2015-01-01
Single-trial evaluations. Trial creation by phase-wise GA-style or DE-inspired recombination. A bin repository structure that requires an initialization period. Non-exclusionary kill distance and a population-collapse mechanic. Main loop: create trials (with a probabilistic switch between GA and DE creation types), locally optimize, submit to the repository, and repeat.
A probabilistic approach to randomness in geometric configuration of scalable origami structures
NASA Astrophysics Data System (ADS)
Liu, Ke; Paulino, Glaucio; Gardoni, Paolo
2015-03-01
Origami, an ancient paper folding art, has inspired many solutions to modern engineering challenges. The demand for actual engineering applications motivates further investigation in this field. Although rooted in the historic art form, many applications of origami are based on newly designed origami patterns that match the specific requirements of an engineering problem. The application of origami to structural design problems ranges from the micro-structure of materials to large-scale deployable shells. For instance, some origami-inspired designs have unique properties such as a negative Poisson's ratio and flat foldability. However, origami structures are typically constrained by strict mathematical geometric relationships, which in reality can easily be violated due to, for example, random imperfections introduced during manufacturing or non-uniform deformations under working conditions (e.g. due to non-uniform thermal effects). Therefore, the effects of uncertainties in origami-like structures need to be studied in further detail in order to provide a practical guide for scalable origami-inspired engineering designs. Through reliability and probabilistic analysis, we investigate the effect of randomness in origami structures on their mechanical properties. Dislocations of the vertices of an origami structure have different impacts on different mechanical properties, and different origami designs can have different sensitivities to imperfections. Thus we aim to provide a preliminary understanding of the structural behavior of some common scalable origami structures subject to randomness in their geometric configurations, in order to help transition the technology toward practical applications of origami engineering.
Supramolecular domains in mixed peptide self-assembled monolayers on gold nanoparticles.
Duchesne, Laurence; Wells, Geoff; Fernig, David G; Harris, Sarah A; Lévy, Raphaël
2008-09-01
Self-organization in mixed self-assembled monolayers of small molecules provides a route towards nanoparticles with complex molecular structures. Inspired by structural biology, a strategy based on chemical cross-linking is introduced to probe proximity between functional peptides embedded in a mixed self-assembled monolayer at the surface of a nanoparticle. The physical basis of the proximity measurement is a transition from intramolecular to intermolecular cross-linking as the functional peptides get closer. Experimental investigations of a binary peptide self-assembled monolayer show that this transition happens at an extremely low molar ratio of the functional versus matrix peptide. Molecular dynamics simulations of the peptide self-assembled monolayer are used to calculate the volume explored by the reactive groups. Comparison of the experimental results with a probabilistic model demonstrates that the peptides are not randomly distributed at the surface of the nanoparticle, but rather self-organize into supramolecular domains.
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C#, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
ERIC Educational Resources Information Center
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-01-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…
Physics-based Morphology Analysis and Adjoint Optimization of Flexible Flapping Wings
2016-08-30
Computational models of flapping flight have been developed to understand the underlying physics of flexible wings in flying insects and birds, working towards bio-inspired wing designs with superior aerodynamic performance.
ERIC Educational Resources Information Center
Pine, William E.; Taylor, William W. L.
1991-01-01
Describes a science project, Interactive Space Physics Ionosphere Radio Experiments (INSPIRE), that allows students to work with physicists to address unanswered questions about the physics of space. (ZWH)
NASA Astrophysics Data System (ADS)
Hoffmann, K.; Srouji, R. G.; Hansen, S. O.
2017-12-01
The technology development within the structural design of long-span bridges in Norwegian fjords has created a need to reformulate the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces the uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.
Perspective: Stochastic magnetic devices for cognitive computing
NASA Astrophysics Data System (ADS)
Roy, Kaushik; Sengupta, Abhronil; Shim, Yong
2018-06-01
Stochastic switching of nanomagnets can potentially enable probabilistic cognitive hardware consisting of noisy neural and synaptic components. Furthermore, computational paradigms inspired by the Ising computing model require stochasticity to achieve near-optimality in solutions to various types of combinatorial optimization problems such as the Graph Coloring Problem or the Travelling Salesman Problem. Achieving optimal solutions to such problems is computationally exhaustive and requires natural annealing to arrive at near-optimal solutions. Stochastic switching of devices also finds use in applications involving Deep Belief Networks and Bayesian inference. In this article, we provide a multi-disciplinary perspective across the stack of devices, circuits, and algorithms to illustrate how the stochastic switching dynamics of spintronic devices in the presence of thermal noise can provide a direct mapping to the computational units of such probabilistic intelligent systems.
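The mapping from thermal-noise-driven switching to a probabilistic neural unit can be illustrated with a small sketch; the sigmoid form and all parameter values below are common modeling abstractions of thermally assisted MTJ switching, not values from the article:

```python
import math
import random

def switching_probability(i_input, i_ref=1.0, beta=4.0):
    # Sigmoid map from input current to switching probability -- a common
    # abstraction of an MTJ under thermal noise (parameters hypothetical).
    return 1.0 / (1.0 + math.exp(-beta * (i_input - i_ref)))

def stochastic_neuron(i_input):
    # Emits a spike (True) with the switching probability.
    return random.random() < switching_probability(i_input)

# Empirical firing rate tracks the switching probability.
random.seed(0)
rate = sum(stochastic_neuron(1.5) for _ in range(10000)) / 10000
print(rate)  # close to switching_probability(1.5) ≈ 0.88
```

In hardware, the analogous knob is the input current through the device; here the random draw stands in for the thermal fluctuations.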
Yurtkuran, Alkın; Emel, Erdal
2016-01-01
The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed in which, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinct characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
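The kind of acceptance rule described here can be sketched as follows; the initial acceptance probability `p0` and the quadratic decay schedule are illustrative guesses, since the paper's exact parameters are not given in the abstract:

```python
import random

def accept(new_cost, old_cost, iteration, max_iter, p0=0.3):
    # Better solutions are always kept; worse candidates are accepted with
    # a probability that decays nonlinearly over the search. p0 and the
    # quadratic schedule are illustrative, not the paper's exact rule.
    if new_cost <= old_cost:
        return True
    return random.random() < p0 * (1.0 - iteration / max_iter) ** 2

# Toy usage: random-perturbation search on f(x) = x**2.
random.seed(42)
x, cost, max_iter = 5.0, 25.0, 2000
for it in range(max_iter):
    cand = x + random.uniform(-0.5, 0.5)
    if accept(cand * cand, cost, it, max_iter):
        x, cost = cand, cand * cand
```

Early in the search the rule behaves diversely (worse moves are sometimes kept); late in the search it approaches the greedy selection of standard ABC.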
To help address the Food Quality Protection Act of 1996, a physically-based, two-stage Monte Carlo probabilistic model has been developed to quantify and analyze aggregate exposure and dose to pesticides via multiple routes and pathways. To illustrate model capabilities and ide...
ERIC Educational Resources Information Center
Tischhauser, Karen
2015-01-01
Students need inspiration to write. Assigning is not teaching. In order to inspire students to write fiction worth reading, teachers must take them through the process of writing. Physical objects inspire good writing with depth. In this article, the reader will be taken through the process of inspiring young writers through the use of boxes.…
Compiling probabilistic, bio-inspired circuits on a field programmable analog array
Marr, Bo; Hasler, Jennifer
2014-01-01
A field programmable analog array (FPAA) is presented as an energy- and computational-efficiency engine: a mixed-mode processor for which functions can be compiled at significantly lower energy cost using probabilistic computing circuits. More specifically, it is shown that the core computation of any dynamical system can be computed on the FPAA at significantly less energy per operation than in a digital implementation. A stochastic system that is dynamically controllable via voltage-controlled amplifier and comparator thresholds is implemented, which computes Bernoulli random variables. From Bernoulli variables it is shown that exponentially distributed random variables, and random variables of an arbitrary distribution, can be computed. The Gillespie algorithm is simulated to show the utility of this system by calculating the trajectory of a biological system computed stochastically with this probabilistic hardware, where over a 127X performance improvement over current software approaches is shown. The relevance of this approach extends to any dynamical system. The initial circuits and ideas for this work were generated at the 2008 Telluride Neuromorphic Workshop. PMID:24847199
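The chain from uniform/Bernoulli primitives to exponential waiting times to a Gillespie simulation can be sketched in software (a software analogue of the analog hardware; the single decay reaction and its rate below are hypothetical):

```python
import math
import random

def exponential_from_uniform(rate, u=None):
    # Inverse-transform sampling: an Exp(rate) variate from a uniform draw,
    # the standard construction for deriving waiting times from simpler
    # random primitives.
    u = random.random() if u is None else u
    return -math.log(1.0 - u) / rate

def gillespie_decay(n0, k, t_end):
    # Minimal Gillespie SSA for the decay reaction X -> 0 with rate k
    # (illustrative sketch; the paper runs such trajectories on hardware).
    t, n, trajectory = 0.0, n0, [(0.0, n0)]
    while n > 0 and t < t_end:
        t += exponential_from_uniform(k * n)  # waiting time to next event
        n -= 1
        trajectory.append((t, n))
    return trajectory

random.seed(1)
traj = gillespie_decay(100, 0.5, 100.0)
```

Each loop iteration draws one exponential waiting time whose rate is the current total propensity, which is exactly the step the probabilistic circuits accelerate.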
NASA Astrophysics Data System (ADS)
2004-01-01
Physics on Stage: Physics on Stage focuses on life
Women in Physics: DNA posters highlight the role of women
Physics on Stage: Not just fair but better than ever
Physics on Stage: Food inspires teaching of physics
Physics on Stage: Powerful performances dispel the myth of boring physics
Physics Songs: Physics inspires some of our readers to sing
Physics on Stage: Awards recognize achievements of science teachers in Europe
Curriculum: Japan tests Advancing Physics
UK Assessment System: Assessment overhaul is overdue
Future Physicists: Ambassadors are bringing physics alive
Physics at work: Physics at work still going strong
Teaching Teachers: US coalition helps new teachers
Forthcoming Events
Probabilistic sharing solves the problem of costly punishment
NASA Astrophysics Data System (ADS)
Chen, Xiaojie; Szolnoki, Attila; Perc, Matjaž
2014-08-01
Cooperators that refuse to participate in sanctioning defectors create the second-order free-rider problem. Such cooperators will not be punished because they contribute to the public good, but they also eschew the costs associated with punishing defectors. Altruistic punishers—those that cooperate and punish—are at a disadvantage, and it is puzzling how such behaviour has evolved. We show that sharing the responsibility to sanction defectors rather than relying on certain individuals to do so permanently can solve the problem of costly punishment. Inspired by the fact that humans have strong but also emotional tendencies for fair play, we consider probabilistic sanctioning as the simplest way of distributing the duty. In well-mixed populations the public goods game is transformed into a coordination game with full cooperation and defection as the two stable equilibria, while in structured populations pattern formation supports additional counterintuitive solutions that are reminiscent of Parrondo's paradox.
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-08-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and the larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.
INSPIRE - Premission. [Interactive NASA Space Physics Ionosphere Radio Experiment
NASA Technical Reports Server (NTRS)
Taylor, William W. L.; Mideke, Michael; Pine, William E.; Ericson, James D.
1992-01-01
The Interactive NASA Space Physics Ionosphere Radio Experiment (INSPIRE) designed to assist in a Space Experiments with Particle Accelerators (SEPAC) project is discussed. INSPIRE is aimed at recording data from a large number of receivers on the ground to determine the exact propagation paths and absorption of radio waves at frequencies between 50 Hz and 7 kHz. It is indicated how to participate in the experiment that will involve high school classes, colleges, and amateur radio operators.
Practices of Waldorf-Inspired Schools. Research Brief
ERIC Educational Resources Information Center
Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda
2015-01-01
"Growing a Waldorf-Inspired Approach in a Public School District" documents the practices and outcomes of Alice Birney, a Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. The study also…
Automated liver segmentation using a normalized probabilistic atlas
NASA Astrophysics Data System (ADS)
Linguraru, Marius George; Li, Zhixi; Shah, Furhawn; Chin, See; Summers, Ronald M.
2009-02-01
Probabilistic atlases of anatomical organs, especially the brain and the heart, have become popular in medical image analysis. We propose the construction of probabilistic atlases which retain structural variability by using a size-preserving modified affine registration. The organ positions are modeled in the physical space by normalizing the physical organ locations to an anatomical landmark. In this paper, a liver probabilistic atlas is constructed and exploited to automatically segment liver volumes from abdominal CT data. The atlas is aligned with the patient data through a succession of affine and non-linear registrations. The overlap and correlation with manual segmentations are 0.91 (0.93 DICE coefficient) and 0.99 respectively. Little work has taken place on the integration of volumetric measures of liver abnormality to clinical evaluations, which rely on linear estimates of liver height. Our application measures the liver height at the mid-hepatic line (0.94 correlation with manual measurements) and indicates that its combination with volumetric estimates could assist the development of a noninvasive tool to assess hepatomegaly.
A Probabilistic Model of Social Working Memory for Information Retrieval in Social Interactions.
Li, Liyuan; Xu, Qianli; Gan, Tian; Tan, Cheston; Lim, Joo-Hwee
2018-05-01
Social working memory (SWM) plays an important role in navigating social interactions. Inspired by studies in psychology, neuroscience, cognitive science, and machine learning, we propose a probabilistic model of SWM to mimic human social intelligence for personal information retrieval (IR) in social interactions. First, we establish a semantic hierarchy as social long-term memory to encode personal information. Next, we propose a semantic Bayesian network as the SWM, which integrates the cognitive functions of accessibility and self-regulation. One subgraphical model implements the accessibility function to learn the social consensus about IR based on social information concept, clustering, social context, and similarity between persons. Beyond accessibility, one more layer is added to simulate the function of self-regulation, performing personal adaptation to the consensus based on human personality. Two learning algorithms are proposed to train the probabilistic SWM model on a raw dataset of high uncertainty and incompleteness: one is an efficient learning algorithm based on Newton's method, and the other is a genetic algorithm. Systematic evaluations show that the proposed SWM model is able to learn human social intelligence effectively and outperforms the baseline Bayesian cognitive model. Toward real-world applications, we implement our model on Google Glass as a wearable assistant for social interaction.
The James Webb Space Telescope: Inspiration and Context for Physics and Chemistry Teaching
ERIC Educational Resources Information Center
Hillier, Dan; Johnston, Tania; Davies, John
2012-01-01
This article describes the design, delivery, evaluation and impact of a CPD course for physics and chemistry teachers. A key aim of the course was to use the context of the James Webb Space Telescope project to inspire teachers and lead to enriched teaching of STEM subjects. (Contains 1 box and 3 figures.)
INSPIRE: A VLF Radio Project for High School Students
ERIC Educational Resources Information Center
Marshall, Jill A.; Pine, Bill; Taylor, William W. L.
2007-01-01
Since 1988 the Interactive NASA Space Physics Ionospheric Radio Experiment, or INSPIRE, has given students the opportunity to build research-quality VLF radio receivers and make observations of both natural and stimulated radio waves in the atmosphere. Any high school science class is eligible to join the INSPIRE volunteer observing network and…
Bayesian estimation of Karhunen–Loève expansions; A random subspace approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhary, Kenny; Najm, Habib N.
2016-04-13
One of the most widely-used statistical procedures for dimensionality reduction of high-dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., one which minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE. Furthermore, it is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite-dimensional stochastic process inspired by Brownian motion.
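The classical KLE/PCA step that the Bayesian procedure builds on can be sketched as follows; the Brownian-motion-like synthetic data and the retained mode count are illustrative, and the Bayesian (matrix Bingham) sampling itself is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic realizations of a Brownian-motion-like process (illustrative).
n_samples, n_points = 50, 200
paths = np.cumsum(rng.normal(size=(n_samples, n_points)), axis=1)

# Classical KLE/PCA: SVD of the centered data matrix gives the orthonormal
# basis functions; the sampling error in this step is what the Bayesian
# treatment in the paper accounts for.
centered = paths - paths.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
modes = vt                        # orthonormal KLE basis functions
variances = s**2 / (n_samples - 1)  # variance captured by each mode

# Low-dimensional surrogate: project onto the leading k modes.
k = 5
coeffs = centered @ modes[:k].T
surrogate = paths.mean(axis=0) + coeffs @ modes[:k]
```

With n_samples much smaller than n_points, the leading modes are noisy estimates, which motivates placing a probabilistic model over the subspace rather than treating it as exact.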
Guzauskas, Gregory F; Villa, Kathleen F; Vanhove, Geertrui F; Fisher, Vicki L; Veenstra, David L
2017-03-01
To estimate the risk-benefit trade-off of a pediatric-inspired regimen versus hyperfractionated cyclophosphamide, vincristine, doxorubicin, and dexamethasone (hyper-CVAD) for first-line treatment of adolescents/young adult (AYA; ages 16-39 years) patients with Philadelphia-negative acute lymphoblastic leukemia. Patient outcomes were simulated using a 6-state Markov model, including complete response (CR), no CR, first relapse, second CR, second relapse, and death. A Weibull distribution was fit to the progression-free survival curve of hyper-CVAD-treated AYA patients from a single-center study, and comparable patient data from a retrospective study of pediatric regimen-treated AYA patients were utilized to estimate a relative progression difference (hazard ratio = 0.51) and model survival differences. Health-state utilities were estimated based on treatment stage, with an assumption that the pediatric protocol had 0.10 disutility compared with hyper-CVAD before the maintenance phase of treatment. Total life-years and quality-adjusted life-years (QALYs) were compared between treatment protocols at 1, 5, and 10 years, with additional probabilistic sensitivity analyses. Treatment with the pediatric-inspired protocol was associated with a 0.04 increase in life-years, but a 0.01 decrease in QALYs at 1 year. By years 5 and 10, the pediatric-inspired protocol resulted in 0.18 and 0.24 increase in life-years and 0.25 and 0.32 increase in QALYs, respectively, relative to hyper-CVAD. The lower quality of life associated with the induction and intensification phases of pediatric treatment was offset by more favorable progression-free survival and overall survival relative to hyper-CVAD. Our exploratory analysis suggests that, compared with hyper-CVAD, pediatric-inspired protocols may increase life-years throughout treatment stages and QALYs in the long term.
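The structure of such a Markov cohort model can be sketched with a simplified version; the 3-state reduction, the transition probabilities, and the utility weights below are all hypothetical placeholders, not the study's 6-state model or its values:

```python
import numpy as np

# Hypothetical 3-state yearly Markov cohort (CR, relapse, death).
P = np.array([[0.85, 0.10, 0.05],   # from complete response (CR)
              [0.00, 0.70, 0.30],   # from relapse
              [0.00, 0.00, 1.00]])  # death is absorbing
utilities = np.array([0.9, 0.6, 0.0])  # QALY weight per state per year

state = np.array([1.0, 0.0, 0.0])  # whole cohort starts in CR
life_years = qalys = 0.0
for _ in range(10):                # 10 one-year cycles
    state = state @ P              # advance the cohort one cycle
    life_years += state[0] + state[1]  # fraction alive this cycle
    qalys += state @ utilities         # utility-weighted survival
print(life_years, qalys)
```

Comparing two regimens then amounts to running the same accumulation with two transition matrices (e.g., scaling the progression hazard) and differencing the life-year and QALY totals, as in the paper's 1-, 5-, and 10-year horizons.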
From cyclone tracks to the costs of European winter storms: A probabilistic loss assessment model
NASA Astrophysics Data System (ADS)
Renggli, Dominik; Corti, Thierry; Reese, Stefan; Wueest, Marc; Viktor, Elisabeth; Zimmerli, Peter
2014-05-01
The quantitative assessment of the potential losses of European winter storms is essential for the economic viability of a global reinsurance company. For this purpose, reinsurance companies generally use probabilistic loss assessment models. This work presents an innovative approach to develop physically meaningful probabilistic events for Swiss Re's new European winter storm loss model. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of properties of historical events (e.g. track, intensity). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints of the historical and probabilistic winter storm events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and risk-specific vulnerability functions and detailed market- or client-specific exposure information to compute (re-)insurance risk premiums.
Gravish, Nick; Lauder, George V
2018-03-29
For centuries, designers and engineers have looked to biology for inspiration. Biologically inspired robots are just one example of the application of knowledge of the natural world to engineering problems. However, recent work by biologists and interdisciplinary teams has flipped this approach, using robots and physical models to set the course for experiments on biological systems and to generate new hypotheses for biological research. We call this approach robotics-inspired biology; it involves performing experiments on robotic systems aimed at the discovery of new biological phenomena or the generation of new hypotheses about how organisms function that can then be tested on living organisms. This new and exciting direction has emerged from the extensive use of physical models by biologists and is already making significant advances in the areas of biomechanics, locomotion, neuromechanics and sensorimotor control. Here, we provide an introduction and overview of robotics-inspired biology, describe two case studies and suggest several directions for the future of this exciting new research area. © 2018. Published by The Company of Biologists Ltd.
ERIC Educational Resources Information Center
Bao, Lei; Redish, Edward F.
2002-01-01
Explains the critical role of probability in making sense of quantum physics and addresses the difficulties science and engineering undergraduates experience, with the aim of helping students build a model of how to think about probability in physical systems. (Contains 17 references.) (Author/YDS)
NASA Astrophysics Data System (ADS)
Hussin, Haydar; van Westen, Cees; Reichenbach, Paola
2013-04-01
Local and regional authorities in mountainous areas that deal with hydro-meteorological hazards like landslides and floods try to set aside budgets for emergencies and risk mitigation. However, future losses are often not calculated in a probabilistic manner when allocating budgets or determining how much risk is acceptable. The absence of probabilistic risk estimates can create a lack of preparedness for reconstruction and risk reduction costs and a deficiency in promoting risk mitigation and prevention in an effective way. The probabilistic risk of natural hazards at local scale is usually ignored altogether due to the difficulty in acknowledging, processing and incorporating uncertainties in the estimation of losses (e.g. physical damage, fatalities and monetary loss). This study attempts to set up a working framework for a probabilistic risk assessment (PRA) of landslides and floods at a municipal scale using the Fella river valley (Eastern Italian Alps) as a multi-hazard case study area. The emphasis is on the evaluation and determination of the uncertainty in the estimation of losses from multi-hazards. To carry out this framework some steps are needed: (1) by using physically based stochastic landslide and flood models we aim to calculate the probability of the physical impact on individual elements at risk, (2) this is then combined with a statistical analysis of the vulnerability and monetary value of the elements at risk in order to include their uncertainty in the risk assessment, (3) finally the uncertainty from each risk component is propagated into the loss estimation. The combined effect of landslides and floods on the direct risk to communities in narrow alpine valleys is also one of the important aspects that need to be studied.
A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning.
Chung, Michael Jae-Yoon; Friesen, Abram L; Fox, Dieter; Meltzoff, Andrew N; Rao, Rajesh P N
2015-01-01
A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i) learn probabilistic models of actions through self-discovery and experience, (ii) utilize these learned models for inferring the goals of human actions, and (iii) perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i) a simulated robot that learns human-like gaze following behavior, and (ii) a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration.
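The goal-inference step rests on Bayes' rule over learned action models. A minimal discrete sketch, in which the goals, actions, likelihood values, and the uncertainty threshold for requesting help are all hypothetical:

```python
import numpy as np

# Hypothetical setup: the robot has learned P(action | goal) from its own
# experience; observing human actions, it infers the goal by Bayes' rule.
goals = ["stack", "sort", "clear"]
prior = np.array([1 / 3, 1 / 3, 1 / 3])

# Learned action likelihoods P(a | g) for three discrete actions
# (rows: goals, columns: actions). Illustrative numbers.
likelihood = np.array([
    [0.7, 0.2, 0.1],   # stack
    [0.2, 0.6, 0.2],   # sort
    [0.1, 0.2, 0.7],   # clear
])

def infer_goal(observed_actions):
    """Posterior over goals after a sequence of observed action indices."""
    post = prior.copy()
    for a in observed_actions:
        post = post * likelihood[:, a]   # Bayes update per observation
        post = post / post.sum()
    return post

post = infer_goal([0, 0, 1])        # mostly 'stack'-type actions observed
uncertain = post.max() < 0.9        # trigger for asking a human for help
```

Goal-based imitation then plans the robot's own actions toward `goals[argmax(post)]`, and the `uncertain` flag is one simple way to model the help-seeking behavior the abstract describes.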
Quantum Mechanics predicts evolutionary biology.
Torday, J S
2018-07-01
Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the on-going effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology-namely negentropy, chemiosmosis and homeostasis-offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing for the first time the long-sought-after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hanson, David F.
2017-04-01
Bio-inspired intelligent robots are coming of age in both research and industry, propelling market growth for robots and A.I. However, conventional motors limit bio-inspired robotics. EAP actuators and sensors could improve the simplicity, compliance, physical scaling, and offer bio-inspired advantages in robotic locomotion, grasping and manipulation, and social expressions. For EAP actuators to realize their transformative potential, further innovations are needed: the actuators must be robust, fast, powerful, manufacturable, and affordable. This presentation surveys progress, opportunities, and challenges in the author's latest work in social robots and EAP actuators, and proposes a roadmap for EAP actuators in bio-inspired intelligent robotics.
NASA Technical Reports Server (NTRS)
Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo
2015-01-01
Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
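A common stand-in for this kind of probabilistic remaining-useful-life estimate is Monte Carlo propagation of uncertain crack-growth parameters through a cheap surrogate. The sketch below uses a closed-form Paris-law integration as the surrogate; the distributions and constants are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Paris-law crack growth da/dN = C * (dK)^m with dK = ds * sqrt(pi * a)
# (unit geometry factor). For m != 2 the cycle count from a0 to a_crit has
# a closed form, standing in here for an expensive finite element model.
def cycles_to_failure(C, m, ds, a0, a_crit):
    g = 1.0 - m / 2.0
    return (a_crit ** g - a0 ** g) / (C * (ds * np.sqrt(np.pi)) ** m * g)

# Posterior-like samples of the uncertain growth parameters (illustrative
# lognormal/normal choices, not calibrated values).
n = 10_000
C = rng.lognormal(mean=np.log(1e-10), sigma=0.2, size=n)
m = rng.normal(3.0, 0.05, size=n)
rul = cycles_to_failure(C, m, ds=100.0, a0=0.001, a_crit=0.02)

# Probabilistic remaining-useful-life prediction as percentiles.
p05, p50, p95 = np.percentile(rul, [5, 50, 95])
```

In the diagnosis-driven setting, the parameter samples would come from a posterior conditioned on structural health monitoring data rather than from fixed priors as here.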
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical physics energy minimization methods are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
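The flavor of energy-minimization constellation design can be illustrated with a toy projected-gradient scheme: constellation points repel under a Coulomb-like pairwise energy subject to a unit average-power constraint. This is a sketch of the general idea only, not the paper's EE algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# 16 points in 2D, renormalized to unit average power each step.
M, steps, lr = 16, 2000, 2e-4
pts = rng.normal(size=(M, 2))
pts /= np.sqrt((pts ** 2).sum(axis=1).mean())

def pairwise_energy(p):
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    return (1.0 / d[np.triu_indices(len(p), 1)]).sum()

e_start = pairwise_energy(pts)
for _ in range(steps):
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + np.eye(M)  # eye avoids 0-division
    force = (diff / dist[..., None] ** 3).sum(axis=1)  # repulsive step
    pts = pts + lr * force
    pts /= np.sqrt((pts ** 2).sum(axis=1).mean())      # re-impose power
e_end = pairwise_energy(pts)
```

Spreading points under a power constraint increases minimum distance, which is the intuition behind energy-efficient constellations; a real design would optimize a capacity- or BER-related objective instead of this Coulomb energy.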
Bruno, Galileo, Einstein: The Value of Myths in Physics
NASA Astrophysics Data System (ADS)
Martinez, Alberto
2015-03-01
Usually, historical myths are portrayed as something to be avoided in a physics classroom. Instead, I will discuss the positive function of myths and how they can be used to improve physics education. First, on the basis of historical research from primary sources and significant new findings about the Catholic Inquisition, I will discuss how to use the inspirational story of Giordano Bruno when discussing cosmology. Next, I will discuss the recurring story about Galileo and the Leaning Tower of Pisa. Finally, I will discuss how neglected stories about the young Albert Einstein can help to inspire students.
NASA Astrophysics Data System (ADS)
Gueddana, Amor; Attia, Moez; Chatta, Rihab
2015-03-01
In this work, we study the error sources standing behind the non-perfect linear optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.
NASA Astrophysics Data System (ADS)
2014-05-01
News items in this issue: UK public libraries offer walk-in access to research; Atoms for Peace? The Atomic Weapons Establishment and UK universities; Students present their research to academics: CERN@school; Science in a suitcase: Marvin and Milo visit Ethiopia; Inspiring telescopes; A day for everyone teaching physics 2014; Forthcoming Events.
Stochastic Human Exposure and Dose Simulation Model for Pesticides
SHEDS-Pesticides (Stochastic Human Exposure and Dose Simulation Model for Pesticides) is a physically-based stochastic model developed to quantify exposure and dose of humans to multimedia, multipathway pollutants. Probabilistic inputs are combined in physical/mechanistic algorit...
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
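The core probabilistic computation can be sketched as Monte Carlo evaluation of the chloride-ingress limit state (the erf solution of Fick's second law with an age-dependent diffusivity). All distributions and parameter values below are illustrative assumptions, not the paper's calibrated inputs, and the paper itself uses FORM with a Nataf transformation rather than plain Monte Carlo:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(3)
n = 20_000

# Illustrative random inputs.
Cs    = rng.lognormal(np.log(3.0), 0.3, n)     # surface chloride, kg/m^3
D28   = rng.lognormal(np.log(1e-11), 0.4, n)   # 28-day diffusion coeff., m^2/s
age   = rng.normal(0.4, 0.08, n)               # age factor
cover = rng.normal(0.05, 0.008, n)             # cover depth, m
Ccr   = rng.lognormal(np.log(0.9), 0.2, n)     # critical content, kg/m^3

def failure_probability(t_years):
    """P(chloride at the rebar exceeds the critical content at time t)."""
    t = t_years * 365.25 * 24 * 3600.0
    t28 = 28 * 24 * 3600.0
    D = D28 * (t28 / t) ** age                 # time-dependent diffusivity
    conc = Cs * (1.0 - np.vectorize(erf)(cover / (2.0 * np.sqrt(D * t))))
    return float(np.mean(conc > Ccr))

p10, p50 = failure_probability(10.0), failure_probability(50.0)
```

Because each sample's chloride content grows with time, the estimated probability of corrosion initiation is monotone in the exposure period, which is the time-dependent reliability curve the framework produces.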
Harnessing graphical structure in Markov chain Monte Carlo learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stolorz, P.E.; Chew, P.C.
1996-12-31
The Monte Carlo method is recognized as a useful tool in learning and probabilistic inference methods common to many data-mining problems. Generalized Hidden Markov Models and Bayes nets are especially popular applications. However, the presence of multiple modes in many relevant integrands and summands often renders the method slow and cumbersome. Recent mean field alternatives designed to speed things up have been inspired by experience gleaned from physics. The current work adopts an approach very similar to this in spirit, but focusses instead upon dynamic programming notions as a basis for producing systematic Monte Carlo improvements. The idea is to approximate a given model by a dynamic programming-style decomposition, which then forms a scaffold upon which to build successively more accurate Monte Carlo approximations. Dynamic programming ideas alone fail to account for non-local structure, while standard Monte Carlo methods essentially ignore all structure. However, suitably-crafted hybrids can successfully exploit the strengths of each method, resulting in algorithms that combine speed with accuracy. The approach relies on the presence of significant "local" information in the problem at hand. This turns out to be a plausible assumption for many important applications. Example calculations are presented, and the overall strengths and weaknesses of the approach are discussed.
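The building block such hybrids exploit, local conditional updates driven by graphical structure, is easiest to see in a two-variable Gibbs sampler, where each full conditional of a correlated Gaussian is sampled in turn:

```python
import numpy as np

rng = np.random.default_rng(4)

# Gibbs sampling for a bivariate Gaussian with correlation rho: each full
# conditional is itself Gaussian, so only local information is needed per step.
rho, n_iter, burn = 0.8, 50_000, 1_000
s = np.sqrt(1.0 - rho ** 2)
x = y = 0.0
samples = np.empty((n_iter, 2))
for i in range(n_iter):
    x = rng.normal(rho * y, s)    # x | y ~ N(rho * y, 1 - rho^2)
    y = rng.normal(rho * x, s)    # y | x ~ N(rho * x, 1 - rho^2)
    samples[i] = x, y

est_rho = np.corrcoef(samples[burn:].T)[0, 1]
```

Multimodal targets are exactly where such local moves mix slowly, which is the failure mode the dynamic-programming scaffold in the abstract is meant to address.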
NASA Astrophysics Data System (ADS)
Appleby, D. M.
2007-02-01
Einstein initially objected to the probabilistic aspect of quantum mechanics—the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein's initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein's belief, that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished, positivistic view of physical theory. It appears to us, however, that the truth is just the opposite. The Einsteinian vision is much less attractive than it seems at first sight. In particular, it is closely connected with philosophical reductionism.
Probabilistic Climate Scenario Information for Risk Assessment
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Takayabu, I.
2014-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments with global atmosphere-ocean coupled models (CMIP3) under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in surface air temperature across the physics ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility for the present climate can show different future climate changes; we could not find clear relationships between the present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and also to downscaling to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
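The Krishnamurti-style weighting amounts to regressing observations on member forecasts over a training period. A toy sketch with synthetic stand-in data (the biases and noise levels are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# "Observed" temperatures and three synthetic model forecasts with
# different biases and noise levels.
n_train = 200
truth = rng.normal(15.0, 3.0, n_train)
models = np.stack([
    truth + rng.normal(1.0, 1.0, n_train),    # warm-biased model
    truth + rng.normal(-2.0, 1.5, n_train),   # cold-biased model
    truth + rng.normal(0.0, 2.5, n_train),    # unbiased but noisy model
], axis=1)

# Least-squares superensemble weights with an intercept column.
X = np.column_stack([np.ones(n_train), models])
w, *_ = np.linalg.lstsq(X, truth, rcond=None)

weighted = X @ w
simple_mean = models.mean(axis=1)
rmse_weighted = np.sqrt(np.mean((weighted - truth) ** 2))
rmse_mean = np.sqrt(np.mean((simple_mean - truth) ** 2))
```

Because the unweighted mean is itself one admissible linear combination, the regression fit can only match or beat it in the training sample; spread among the reweighted members is one route to the probabilistic scenario information the abstract describes.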
ERIC Educational Resources Information Center
Gracia, Enrique; Herrero, Juan
2006-01-01
Objective: This study aimed to explore the relationship between perceived neighborhood social disorder and attitudes toward reporting child physical abuse. Method: Data from a national probabilistic sample (N = 9,759) were used. Responses about the perception of neighborhood social disorder, perceived frequency of child physical abuse in Spanish…
Great Quotes To Inspire Great Teachers.
ERIC Educational Resources Information Center
benShea, Noah
This book provides a collection of quotes designed to offer support and inspiration to teachers as they face the daily emotional, spiritual, intellectual, and physical challenges of their professional and personal lives. The book's 26 sections focus on: the art of teaching; adversity; behavior; character; children; collaboration and teamwork;…
Growing a Waldorf-Inspired Approach in a Public School District
ERIC Educational Resources Information Center
Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda
2015-01-01
This report documents the practices and outcomes of Alice Birney, a public K-8 Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. Birney students outperform similar students in SCUSD on several…
NASA Astrophysics Data System (ADS)
Haven, Emmanuel; Khrennikov, Andrei
2013-01-01
Preface; Part I. Physics Concepts in Social Science? A Discussion: 1. Classical, statistical and quantum mechanics: all in one; 2. Econophysics: statistical physics and social science; 3. Quantum social science: a non-mathematical motivation; Part II. Mathematics and Physics Preliminaries: 4. Vector calculus and other mathematical preliminaries; 5. Basic elements of quantum mechanics; 6. Basic elements of Bohmian mechanics; Part III. Quantum Probabilistic Effects in Psychology: Basic Questions and Answers: 7. A brief overview; 8. Interference effects in psychology - an introduction; 9. A quantum-like model of decision making; Part IV. Other Quantum Probabilistic Effects in Economics, Finance and Brain Sciences: 10. Financial/economic theory in crisis; 11. Bohmian mechanics in finance and economics; 12. The Bohm-Vigier Model and path simulation; 13. Other applications to economic/financial theory; 14. The neurophysiological sources of quantum-like processing in the brain; Conclusion; Glossary; Index.
A geometric theory for Lévy distributions
NASA Astrophysics Data System (ADS)
Eliazar, Iddo
2014-08-01
Lévy distributions are of prime importance in the physical sciences, and their universal emergence is commonly explained by the Generalized Central Limit Theorem (CLT). However, the Generalized CLT is a geometry-less probabilistic result, whereas physical processes usually take place in an embedding space whose spatial geometry is often of substantial significance. In this paper we introduce a model of random effects in random environments which, on the one hand, retains the underlying probabilistic structure of the Generalized CLT and, on the other hand, adds a general and versatile underlying geometric structure. Based on this model we obtain geometry-based counterparts of the Generalized CLT, thus establishing a geometric theory for Lévy distributions. The theory explains the universal emergence of Lévy distributions in physical settings which are well beyond the realm of the Generalized CLT.
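Lévy (α-stable) variates themselves are easy to generate with the standard Chambers-Mallows-Stuck construction, which is handy for experimenting with the universality the paper discusses; α = 2 recovers a Gaussian (with variance 2) and α = 1 a Cauchy:

```python
import numpy as np

rng = np.random.default_rng(6)

def symmetric_stable(alpha, size):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

gauss_like = symmetric_stable(2.0, 200_000)    # alpha = 2: N(0, 2)
cauchy_like = symmetric_stable(1.0, 200_000)   # alpha = 1: standard Cauchy
```

For 0 < α < 2 the samples have power-law tails P(|X| > x) ~ x^(-α), the heavy-tail behavior outside the classical CLT's reach that motivates the Generalized CLT.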
NASA Technical Reports Server (NTRS)
Canfield, R. C.; Ricchiazzi, P. J.
1980-01-01
An approximate probabilistic radiative transfer equation and the statistical equilibrium equations are simultaneously solved for a model hydrogen atom consisting of three bound levels and an ionization continuum. The transfer equation for L-alpha, L-beta, H-alpha, and the Lyman continuum is explicitly solved assuming complete redistribution. The accuracy of this approach is tested by comparing source functions and radiative loss rates to values obtained with a method that solves the exact transfer equation. Two recent model solar-flare chromospheres are used for this test. It is shown that for the test atmospheres the probabilistic method gives values of the radiative loss rate that are characteristically good to a factor of 2. The advantage of this probabilistic approach is that it retains a description of the dominant physical processes of radiative transfer in the complete redistribution case, yet it achieves a major reduction in computational requirements.
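The computational saving of a probabilistic (escape-probability) treatment is visible in the simplest two-level case with complete redistribution: closing the mean intensity as J ≈ (1 − P_esc)·S turns the transfer equation into algebra, S = εB / (ε + (1 − ε)P_esc). The slab-type escape probability used below is a common illustrative choice, not the paper's exact formulation:

```python
import numpy as np

# Two-level-atom source function under a first-order escape-probability
# closure. eps is the collisional destruction probability, B the Planck
# function (set to 1 here), and tau the line optical depth.
def p_escape(tau):
    """Illustrative slab escape probability (1 - exp(-tau)) / tau."""
    tau = np.asarray(tau, dtype=float)
    return np.where(tau < 1e-8, 1.0, (1.0 - np.exp(-tau)) / np.maximum(tau, 1e-8))

def source_function(tau, eps, B=1.0):
    pe = p_escape(tau)
    return eps * B / (eps + (1.0 - eps) * pe)

tau = np.logspace(-2, 4, 7)
S = source_function(tau, eps=1e-3)   # rises from ~eps*B toward B with depth
```

The expected limits hold by construction: photons escape freely at small τ (S ≈ εB) while the source function thermalizes toward B at large τ, all without iterating a transfer solver.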
Meng, Jingxin; Liu, Hongliang; Liu, Xueli; Yang, Gao; Zhang, Pengchao; Wang, Shutao; Jiang, Lei
2014-09-24
By mimicking certain biochemical and physical attributes of biological cells, bio-inspired particles have attracted great attention for potential biomedical applications based on cell-like biological functions. Inspired by leukocytes, hierarchical biointerfaces are designed and prepared based on specific molecules-modified leukocyte-inspired particles. These biointerfaces can efficiently recognize cancer cells from whole blood samples through the synergistic effect of molecular recognition and topographical interaction. Compared to flat, mono-micro or nano-biointerfaces, these micro/nano hierarchical biointerfaces are better able to promote specific recognition interactions, resulting in an enhanced cell-capture efficiency. It is anticipated that this study may provide promising guidance to develop new bio-inspired hierarchical biointerfaces for biomedical applications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Stochastic fundamental diagram for probabilistic traffic flow modeling.
DOT National Transportation Integrated Search
2011-09-01
Flowing water in river, transported gas or oil in pipe, electric current in wire, moving : goods on conveyor, molecular motors in living cell, and driving vehicles on a highway are : various kinds of flow from physical or non-physical systems, yet ea...
Sustaining Physics Teacher Education Coalition Programs in Physics Teacher Education
ERIC Educational Resources Information Center
Scherr, Rachel E.; Plisch, Monica; Goertzen, Renee Michelle
2017-01-01
Understanding the mechanisms of increasing the number of physics teachers educated per year at institutions with thriving physics teacher preparation programs may inspire and support other institutions in building thriving programs of their own. The Physics Teacher Education Coalition (PhysTEC), led by the American Physical Society (APS) and the…
Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
2008-06-01
The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: a possibility to realize such a family on the basis of a single probability measure (to construct a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and "death of reality," which are typically linked to Bell-type inequalities in the physical literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we found that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.
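The quantum correlations that violate such compatibility constraints are easy to exhibit numerically: for the singlet state, E(a, b) = −cos(a − b), and the CHSH combination reaches Tsirelson's bound 2√2 at the standard angle choices, exceeding the classical bound of 2:

```python
import numpy as np

# Singlet-state correlation between measurements at analyzer angles a and b.
def E(a, b):
    return -np.cos(a - b)

# Standard CHSH angle choices.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

# CHSH combination; any single Kolmogorov space would force |S| <= 2.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
```

In the paper's reading, S > 2 signals that the four correlations come from four distinct experimental contexts and need not be realizable on one probability space, rather than forcing a nonlocality conclusion.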
GoDisco: Selective Gossip Based Dissemination of Information in Social Community Based Overlays
NASA Astrophysics Data System (ADS)
Datta, Anwitaman; Sharma, Rajesh
We propose and investigate a gossip-based decentralized mechanism (GoDisco), inspired by social principles and behavior, to disseminate information in online social community networks, using exclusively social links and exploiting semantic context to keep the dissemination process selective to relevant nodes. Such a dissemination scheme using gossiping over an egocentric social network is unique and is arguably a concept whose time has arrived: it emulates word-of-mouth behavior and can have interesting applications like probabilistic publish/subscribe, decentralized recommendation and contextual advertisement systems, to name a few. Simulation-based experiments show that despite using only local knowledge and contacts, the system has good global coverage and behavior.
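The mechanism can be caricatured in a few lines: nodes gossip a tagged message only to sampled neighbors whose interests match the topic, so dissemination stays selective. The graph, interest tags, and parameters below are invented for illustration and are not GoDisco's actual protocol:

```python
import random

random.seed(7)

# Random directed social graph with per-node interest tags.
n, fanout, p_relay = 200, 3, 0.9
topics = ["sports", "music", "tech"]
interest = {v: random.choice(topics) for v in range(n)}
neighbors = {v: random.sample([u for u in range(n) if u != v], 8)
             for v in range(n)}

def disseminate(seed_node, topic):
    """Selective gossip: forward only to topic-matching sampled neighbors."""
    informed, frontier = {seed_node}, [seed_node]
    while frontier:
        nxt = []
        for v in frontier:
            for u in random.sample(neighbors[v], fanout):
                relevant = interest[u] == topic
                if u not in informed and relevant and random.random() < p_relay:
                    informed.add(u)
                    nxt.append(u)
        frontier = nxt
    return informed

reached = disseminate(0, "tech")
interested = {v for v in range(n) if interest[v] == "tech"}
coverage = len(reached & interested) / len(interested)
```

The trade-off the paper studies appears directly: filtering on relevance keeps uninterested nodes out of the spread, at the cost of possibly fragmenting the interested subgraph and lowering `coverage`.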
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-10-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still builds the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme. Nevertheless, a couple of additional variables have been introduced in APOLLO_NG. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles; it is instead expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned, depending on the purpose, from reliably identifying clear pixels to reliably identifying definitely cloudy pixels. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-real-time use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach that was used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without the necessity of sensor-specific tuning. Moreover, it allows for online calculation of the radiative transfer (i.e., within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy on which it is based. Furthermore, example results from NOAA-18 are presented.
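The shift from a binary cloud test to a per-pixel probability can be illustrated with a minimal sketch. Everything here is an assumption for illustration (the logistic link, the scale parameter, and the clear-sky brightness temperature are invented), not the APOLLO_NG formulation:

```python
import math

def cloud_probability(obs_bt, clear_bt, scale):
    """Toy probabilistic reinterpretation of a binary cloud test:
    instead of flagging obs_bt < clear_bt as cloudy, map the departure
    from the clear-sky expectation through a logistic link."""
    z = (clear_bt - obs_bt) / scale       # positive when colder than clear sky
    return 1.0 / (1.0 + math.exp(-z))

# Tuning for purpose: a conservative clear-sky mask might keep pixels
# with p < 0.1, a conservative cloud mask pixels with p > 0.9.
clear_bt = 290.0                          # assumed clear-sky value [K]
for bt in (289.0, 280.0, 260.0):
    p = cloud_probability(bt, clear_bt, scale=5.0)
    print(f"BT = {bt:.0f} K -> cloud probability {p:.2f}")
```

The tunable threshold on p is what replaces the single hard-coded yes/no cutoff of the original scheme.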
The pdf approach to turbulent polydispersed two-phase flows
NASA Astrophysics Data System (ADS)
Minier, Jean-Pierre; Peirano, Eric
2001-10-01
The purpose of this paper is to develop a probabilistic approach to turbulent polydispersed two-phase flows. The two-phase flows considered are composed of a continuous phase, which is a turbulent fluid, and a dispersed phase, which represents an ensemble of discrete particles (solid particles, droplets or bubbles). Gathering the difficulties of turbulent flows and of particle motion, the challenge is to work out a general modelling approach that meets three requirements: to treat accurately the physically relevant phenomena, to provide enough information to address issues of complex physics (combustion, polydispersed particle flows, …) and to remain tractable for general non-homogeneous flows. The present probabilistic approach models the statistical dynamics of the system and consists in simulating the joint probability density function (pdf) of a number of fluid and discrete particle properties. A new point is that both the fluid and the particles are included in the pdf description. The derivation of the joint pdf model for the fluid and for the discrete particles is worked out in several steps. The mathematical properties of stochastic processes are first recalled. The various hierarchies of pdf descriptions are detailed and the physical principles that are used in the construction of the models are explained. The Lagrangian one-particle probabilistic description is developed first for the fluid alone, then for the discrete particles and finally for the joint fluid and particle turbulent systems. In the case of the probabilistic description for the fluid alone or for the discrete particles alone, numerical computations are presented and discussed to illustrate how the method works in practice and the kind of information that can be extracted from it. Comments on the current modelling state and propositions for future investigations which try to link the present work with other ideas in physics are made at the end of the paper.
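The Lagrangian one-particle pdf descriptions discussed above are in practice simulated as stochastic differential equations for particle properties. The sketch below integrates a generic Ornstein-Uhlenbeck (Langevin-type) model for one velocity component with the Euler-Maruyama scheme; the specific equation and coefficients are a standard textbook form chosen for illustration, not the paper's full fluid-particle model:

```python
import math
import random

def simulate_langevin(T_L, sigma, dt, n_steps, n_particles, rng):
    """Euler-Maruyama integration of a simplified Langevin model for one
    velocity component seen along a particle path:
        dU = -(U / T_L) dt + sigma * sqrt(2 / T_L) dW
    whose stationary variance is sigma^2."""
    U = [0.0] * n_particles
    for _ in range(n_steps):
        for i in range(n_particles):
            dW = rng.gauss(0.0, math.sqrt(dt))   # Wiener increment
            U[i] += -(U[i] / T_L) * dt + sigma * math.sqrt(2.0 / T_L) * dW
    return U

rng = random.Random(42)
U = simulate_langevin(T_L=1.0, sigma=1.0, dt=0.01, n_steps=1000,
                      n_particles=400, rng=rng)
var = sum(u * u for u in U) / len(U)
print(f"stationary velocity variance ~ {var:.2f} (theory: sigma^2 = 1)")
```

Statistics of the ensemble (here the velocity variance) are exactly the kind of information that is "extracted" from the pdf simulation in the numerical examples the paper describes.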
3D Traffic Scene Understanding From Movable Platforms.
Geiger, Andreas; Lauer, Martin; Wojek, Christian; Stiller, Christoph; Urtasun, Raquel
2014-05-01
In this paper, we present a novel probabilistic generative model for multi-object traffic scene understanding from movable platforms which reasons jointly about the 3D scene layout as well as the location and orientation of objects in the scene. In particular, the scene topology, geometry, and traffic activities are inferred from short video sequences. Inspired by the impressive driving capabilities of humans, our model does not rely on GPS, lidar, or map knowledge. Instead, it takes advantage of a diverse set of visual cues in the form of vehicle tracklets, vanishing points, semantic scene labels, scene flow, and occupancy grids. For each of these cues, we propose likelihood functions that are integrated into a probabilistic generative model. We learn all model parameters from training data using contrastive divergence. Experiments conducted on videos of 113 representative intersections show that our approach successfully infers the correct layout in a variety of very challenging scenarios. To evaluate the importance of each feature cue, experiments using different feature combinations are conducted. Furthermore, we show how by employing context derived from the proposed method we are able to improve over the state-of-the-art in terms of object detection and object orientation estimation in challenging and cluttered urban environments.
A baker's dozen of new particle flows for nonlinear filters, Bayesian decisions and transport
NASA Astrophysics Data System (ADS)
Daum, Fred; Huang, Jim
2015-05-01
We describe a baker's dozen of new particle flows to compute Bayes' rule for nonlinear filters, Bayesian decisions and learning as well as transport. Several of these new flows were inspired by transport theory, but others were inspired by physics or statistics or Markov chain Monte Carlo methods.
CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quigg, Chris
The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.
Lee, Insuk; Li, Zhihua; Marcotte, Edward M.
2007-01-01
Background Probabilistic functional gene networks are powerful theoretical frameworks for integrating heterogeneous functional genomics and proteomics data into objective models of cellular systems. Such networks provide syntheses of millions of discrete experimental observations, spanning DNA microarray experiments, physical protein interactions, genetic interactions, and comparative genomics; the resulting networks can then be easily applied to generate testable hypotheses regarding specific gene functions and associations. Methodology/Principal Findings We report a significantly improved version (v. 2) of a probabilistic functional gene network [1] of the baker's yeast, Saccharomyces cerevisiae. We describe our optimization methods and illustrate their effects in three major areas: the reduction of functional bias in network training reference sets, the application of a probabilistic model for calculating confidences in pair-wise protein physical or genetic interactions, and the introduction of simple thresholds that eliminate many false positive mRNA co-expression relationships. Using the network, we predict and experimentally verify the function of the yeast RNA binding protein Puf6 in 60S ribosomal subunit biogenesis. Conclusions/Significance YeastNet v. 2, constructed using these optimizations together with additional data, shows significant reduction in bias and improvements in precision and recall, in total covering 102,803 linkages among 5,483 yeast proteins (95% of the validated proteome). YeastNet is available from http://www.yeastnet.org. PMID:17912365
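The integration step behind such probabilistic functional networks is often a naive-Bayes-style sum of log-likelihood ratios across heterogeneous evidence types. The sketch below shows that scoring scheme in general form; the evidence sources and the likelihood-ratio numbers are purely illustrative, not YeastNet's trained values:

```python
import math

def log_likelihood_score(evidence, ratios):
    """Naive-Bayes-style evidence integration: each data set contributes
    log[ P(D | linked) / P(D | not linked) ]; summing the terms assumes
    (approximate) independence between data sets."""
    return sum(math.log(ratios[source][outcome]) for source, outcome in evidence)

# Hypothetical likelihood ratios per evidence source (illustrative only).
ratios = {
    "coexpression":     {"high": 4.0, "low": 0.5},
    "two_hybrid":       {"hit": 8.0, "miss": 0.9},
    "synthetic_lethal": {"hit": 6.0, "miss": 0.95},
}
pair_evidence = [("coexpression", "high"),
                 ("two_hybrid", "hit"),
                 ("synthetic_lethal", "miss")]
score = log_likelihood_score(pair_evidence, ratios)
print(f"integrated log-likelihood score for the gene pair: {score:.2f}")
```

A positive summed score favors a functional linkage; ranking all gene pairs by this score yields the probabilistic network, and thresholding it trades precision against recall as described above.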
Probabilistic In Situ Stress Estimation and Forecasting using Sequential Data Assimilation
NASA Astrophysics Data System (ADS)
Fichtner, A.; van Dinther, Y.; Kuensch, H. R.
2017-12-01
Our physical understanding and forecasting ability of earthquakes, and other solid Earth dynamic processes, is significantly hampered by limited indications of the evolving state of stress and strength on faults. Integrating observations and physics-based numerical modeling to quantitatively estimate this evolution of a fault's state is crucial. However, systematic attempts are limited and tenuous, especially in light of the scarcity and uncertainty of natural data and the difficulty of modeling the physics governing earthquakes. We adopt the statistical framework of sequential data assimilation - extensively developed for weather forecasting - to efficiently integrate observations and prior knowledge in a forward model, while acknowledging errors in both. To prove this concept, we perform a perfect model test in a simplified subduction zone setup, where we assimilate synthetic noisy data on velocities and stresses from a single location. Using an Ensemble Kalman Filter, these data and their errors are assimilated to update 150 ensemble members from a Partial Differential Equation-driven seismic cycle model. Probabilistic estimates of fault stress and dynamic strength evolution capture the truth exceptionally well. This is possible because the sampled error covariance matrix contains prior information from the physics that relates velocities, stresses and pressure at the surface to those at the fault. During the analysis step, stress and strength distributions are thus reconstructed such that fault coupling can be updated to either inhibit or trigger events. In the subsequent forecast step the physical equations are solved to propagate the updated states forward in time and thus provide probabilistic information on the occurrence of the next event. At subsequent assimilation steps, the system's forecasting ability turns out to be significantly better than that of a periodic recurrence model (requiring an alarm 17% vs. 68% of the time).
This thus provides distinct added value with respect to using observations or numerical models separately. Although several challenges for applications to a natural setting remain, these first results indicate the large potential of data assimilation techniques for probabilistic seismic hazard assessment and other challenges in dynamic solid earth systems.
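The analysis step of an Ensemble Kalman Filter can be written compactly: the ensemble itself supplies the covariance that lets a surface observation update an unobserved fault-state variable. The sketch below is the textbook perturbed-observation EnKF on a toy two-variable state (observed "surface velocity" and unobserved "fault stress"); the state model and numbers are invented for illustration, not the paper's seismic cycle model:

```python
import random

def enkf_update(ensemble, obs, obs_value, obs_var, rng):
    """Textbook EnKF analysis step. `ensemble` is a list of state vectors;
    `obs(x)` extracts the observed scalar from a state vector."""
    n, m = len(ensemble), len(ensemble[0])
    xbar = [sum(x[j] for x in ensemble) / n for j in range(m)]
    hx = [obs(x) for x in ensemble]
    hbar = sum(hx) / n
    # Cross-covariance state/observation and observation-space variance,
    # both estimated from the ensemble itself.
    cov_xy = [sum((x[j] - xbar[j]) * (h - hbar)
                  for x, h in zip(ensemble, hx)) / (n - 1) for j in range(m)]
    var_y = sum((h - hbar) ** 2 for h in hx) / (n - 1)
    gain = [c / (var_y + obs_var) for c in cov_xy]        # Kalman gain
    for x, h in zip(ensemble, hx):                        # perturbed obs
        d = obs_value + rng.gauss(0.0, obs_var ** 0.5) - h
        for j in range(m):
            x[j] += gain[j] * d
    return ensemble

# Toy prior: stress (x[1]) is physically correlated with velocity (x[0]),
# so assimilating a velocity observation also updates the stress.
rng = random.Random(1)
ens = []
for _ in range(150):                                      # 150 members, as in the study
    v = rng.gauss(0.0, 1.0)
    ens.append([v, 2.0 * v + rng.gauss(0.0, 0.2)])
enkf_update(ens, lambda x: x[0], obs_value=1.0, obs_var=0.1, rng=rng)
mean_stress = sum(x[1] for x in ens) / len(ens)
print(f"posterior mean of unobserved fault stress: {mean_stress:.2f}")
```

Because the prior correlation between the two variables is built into the ensemble, the unobserved variable is pulled toward the value consistent with the observation, which is precisely the mechanism the abstract credits for reconstructing fault stress from surface data.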
INSPIRE and SPIRES Log File Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Cole; /Wheaton Coll. /SLAC
2012-08-31
SPIRES, an aging high-energy physics publication database, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts was developed to collect and interpret the data contained in the log files. Common search patterns and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought, as well as the expected trend of user transition to INSPIRE.
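A minimal version of such log-file analysis is a script that extracts search queries from access-log lines and counts repeats, since an immediately repeated identical query is one crude signal of user frustration. The log format, sample lines, and regex below are hypothetical Apache-style stand-ins, not the actual SPIRES/INSPIRE log layout:

```python
import re
from collections import Counter

# Hypothetical Apache-style access-log excerpt (illustrative only).
LOG = """\
10.0.0.1 - - [10/Aug/2012:12:00:01] "GET /search?p=find+a+witten HTTP/1.1" 200
10.0.0.2 - - [10/Aug/2012:12:00:05] "GET /search?p=title+higgs HTTP/1.1" 200
10.0.0.1 - - [10/Aug/2012:12:00:09] "GET /search?p=find+a+witten HTTP/1.1" 200
"""

query_re = re.compile(r'"GET /search\?p=([^ ]+) HTTP')

def search_pattern_counts(log_text):
    """Count how often each search query appears in the log excerpt."""
    return Counter(m.group(1) for m in query_re.finditer(log_text))

counts = search_pattern_counts(LOG)
print(counts.most_common(1))  # [('find+a+witten', 2)]
```

Aggregating such counts per site and per time window is what allows the SPIRES/INSPIRE usage split and its trend to be compared.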
NASA Astrophysics Data System (ADS)
Li, Chen; Fearing, Ronald; Full, Robert
Most animals move in nature in a variety of locomotor modes. For example, to traverse obstacles like dense vegetation, cockroaches can climb over, push across, reorient their bodies to maneuver through slits, or even transition among these modes forming diverse locomotor pathways; if flipped over, they can also self-right using wings or legs to generate body pitch or roll. By contrast, most locomotion studies have focused on a single mode such as running, walking, or jumping, and robots are still far from capable of life-like, robust, multi-modal locomotion in the real world. Here, we present two recent studies using bio-inspired robots, together with new locomotion energy landscapes derived from locomotor-environment interaction physics, to begin to understand the physics of multi-modal locomotion. (1) Our experiment of a cockroach-inspired legged robot traversing grass-like beam obstacles reveals that, with a terradynamically "streamlined" rounded body like that of the insect, robot traversal becomes more probable by accessing locomotor pathways that overcome lower potential energy barriers. (2) Our experiment of a cockroach-inspired self-righting robot further suggests that body vibrations are crucial for exploring locomotion energy landscapes and reaching lower barrier pathways. Finally, we posit that our new framework of locomotion energy landscapes holds promise to better understand and predict multi-modal biological and robotic movement.
Limited-scope probabilistic safety analysis for the Los Alamos Meson Physics Facility (LAMPF)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharirli, M.; Rand, J.L.; Sasser, M.K.
1992-01-01
The reliability of instrumentation and safety systems is a major issue in the operation of accelerator facilities. A probabilistic safety analysis was performed for the key safety and instrumentation systems at the Los Alamos Meson Physics Facility (LAMPF). In Phase I of this unique study, the Personnel Safety System (PSS) and the Current Limiters (XLs) were analyzed through the use of fault tree analysis, failure modes and effects analysis, and criticality analysis. Phase II of the program was conducted to update and reevaluate the safety systems after the Phase I recommendations were implemented. This paper provides a brief review of the studies involved in Phases I and II of the program.
Fast, Nonlinear, Fully Probabilistic Inversion of Large Geophysical Problems
NASA Astrophysics Data System (ADS)
Curtis, A.; Shahraeeni, M.; Trampert, J.; Meier, U.; Cho, G.
2010-12-01
Almost all geophysical inverse problems are in reality nonlinear. Fully nonlinear inversion, including non-approximated physics and solving for the probability distribution functions (pdfs) that describe the solution uncertainty, generally requires sampling-based Monte Carlo methods that are computationally intractable in most large problems. In order to solve such problems, physical relationships are usually linearized, leading to efficiently solved (possibly iterated) linear inverse problems. However, it is well known that linearization can lead to erroneous solutions, and in particular to overly optimistic uncertainty estimates. What is needed across many geophysical disciplines is a method to invert large inverse problems (or potentially tens of thousands of small inverse problems) fully probabilistically and without linearization. This talk shows how very large nonlinear inverse problems can be solved fully probabilistically and incorporating any available prior information using mixture density networks (driven by neural network banks), provided the problem can be decomposed into many small inverse problems. In this talk I will explain the methodology, compare multi-dimensional pdf inversion results to full Monte Carlo solutions, and illustrate the method with two applications: first, inverting surface wave group and phase velocities for a fully probabilistic global tomography model of the Earth's crust and mantle, and second, inverting industrial 3D seismic data for petrophysical properties throughout and around a subsurface hydrocarbon reservoir. The latter problem is typically decomposed into 10^4 to 10^5 individual inverse problems, each solved fully probabilistically and without linearization. The results in both cases are sufficiently close to the Monte Carlo solution to exhibit realistic uncertainty, multimodality and bias. This provides far greater confidence in the results, and in decisions made on their basis.
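A mixture density network represents each small inverse problem's posterior as a Gaussian mixture, so the "solution" per cell is just a set of weights, means, and standard deviations. The sketch below evaluates such a mixture pdf; the bimodal numbers are invented to show the multimodality a linearized inversion would collapse to a single peak:

```python
import math

def mixture_pdf(x, weights, means, sigmas):
    """Posterior pdf as a Gaussian mixture -- the output form a mixture
    density network produces for each small inverse problem."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2)
               / (s * math.sqrt(2.0 * math.pi))
               for w, m, s in zip(weights, means, sigmas))

# Hypothetical MDN output for one model cell: a bimodal posterior.
w, mu, sg = [0.6, 0.4], [2.0, 5.0], [0.3, 0.5]
assert abs(sum(w) - 1.0) < 1e-12           # mixture weights must sum to 1
for x in (2.0, 3.5, 5.0):
    print(f"p({x}) = {mixture_pdf(x, w, mu, sg):.3f}")
```

Evaluating this closed-form pdf per cell is cheap, which is how 10^4 to 10^5 individual problems can each be solved "fully probabilistically" without per-cell Monte Carlo sampling.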
Nacre-inspired integrated strong and tough reduced graphene oxide-poly(acrylic acid) nanocomposites
NASA Astrophysics Data System (ADS)
Wan, Sijie; Hu, Han; Peng, Jingsong; Li, Yuchen; Fan, Yuzun; Jiang, Lei; Cheng, Qunfeng
2016-03-01
Inspired by the relationship between interface interactions and the high-performance mechanical properties of nacre, a strong and tough nacre-inspired nanocomposite based on graphene oxide (GO) and poly(acrylic acid) (PAA) was demonstrated, prepared via a vacuum-assisted filtration self-assembly process. The abundant hydrogen bonding between GO and PAA results in both high strength and toughness of the bioinspired nanocomposites, which are 2 and 3.3 times higher, respectively, than those of pure reduced GO film. In addition, the effect of environmental relative humidity on the mechanical properties of the bioinspired nanocomposites is also investigated, and is consistent with previous theoretical predictions. Moreover, this nacre-inspired nanocomposite also displays a high electrical conductivity of 108.9 S cm-1. These excellent physical properties allow this type of nacre-inspired nanocomposite to be used in many applications, such as flexible electrodes, aerospace components, and artificial muscles. This nacre-inspired strategy also opens an avenue for constructing integrated high-performance graphene-based nanocomposites in the near future.
Playing Funny: An Introduction to "Commedia dell' Arte."
ERIC Educational Resources Information Center
Grantham, Barry
2001-01-01
Discusses the use of "Commedia," a way of performing inspired by the historical "Commedia dell' Arte." Notes that it has proved a fertile source of inspiration for all types of physical and stylized theatre and a useful training tool for performers in many fields. Presents a series of exercises designed to introduce the student to Commedia…
[Physical activity in a probabilistic sample in the city of Rio de Janeiro].
Gomes, V B; Siqueira, K S; Sichieri, R
2001-01-01
This study evaluated physical activity in a probabilistic sample of 4,331 individuals 12 years of age and older residing in the city of Rio de Janeiro, who participated in a household survey in 1996. Occupation and leisure activity were grouped according to categories of energy expenditure. The study also evaluated number of hours watching TV, using the computer, or playing video-games. Only 3.6% of males and 0.3% of females reported heavy occupational work. A full 59.8% of males and 77.8% of females reported never performing recreational physical activity, and there was an increase in this prevalence with age, especially for men. Women's leisure activities involved less energy expenditure and had a lower median duration than those of men. Mean daily TV/video/computer time was greater for women than for men. The greater the level of schooling, the higher the frequency of physical activity for both sexes. Analyzed jointly, these data show the low energy expenditure through physical activity by the population of the city of Rio de Janeiro. Women, the middle-aged, the elderly, and low-income individuals were at greatest risk of not performing recreational physical activity.
CERN@school: demonstrating physics with the Timepix detector
NASA Astrophysics Data System (ADS)
Whyntie, T.; Bithray, H.; Cook, J.; Coupe, A.; Eddy, D.; Fickling, R. L.; McKenna, J.; Parker, B.; Paul, A.; Shearer, N.
2015-10-01
This article shows how the Timepix hybrid silicon pixel detector, developed by the Medipix2 Collaboration, can be used by students and teachers alike to demonstrate some key aspects of any well-rounded physics curriculum with CERN@school. After an overview of the programme, the detector's capabilities for measuring and visualising ionising radiation are examined. The classification of clusters - groups of adjacent pixels - is discussed with respect to identifying the different types of particles. Three demonstration experiments - background radiation measurements, radiation profiles and the attenuation of radiation - are described; these can be used as part of lessons or as inspiration for independent research projects. Results for exemplar data sets are presented for reference, as well as details of ongoing research projects inspired by these experiments. Interested readers are encouraged to join the CERN@school Collaboration and so contribute to achieving the programme's aim of inspiring the next generation of scientists and engineers.
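Cluster classification starts from grouping adjacent hit pixels and then applying shape heuristics (a single dot for a gamma or low-energy electron, a dense blob for an alpha, an extended track for a beta). The sketch below uses 8-connectivity flood fill and a crude pixel-density criterion; the thresholds and class names are simplified illustrations, not the CERN@school analysis code:

```python
def clusters(hits):
    """Group adjacent hit pixels (8-connectivity) into clusters."""
    hits = set(hits)
    out = []
    while hits:
        stack = [hits.pop()]
        cluster = set(stack)
        while stack:
            x, y = stack.pop()
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in hits:
                        hits.remove(n)
                        cluster.add(n)
                        stack.append(n)
        out.append(cluster)
    return out

def classify(cluster):
    """Crude shape classes inspired by typical Timepix cluster types."""
    n = len(cluster)
    if n <= 2:
        return "small (gamma / low-energy electron)"
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    box = (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)
    # Dense, compact clusters suggest heavy ionisation (alpha-like);
    # sparse, elongated ones suggest a track (beta-like).
    return "heavy blob (alpha-like)" if n / box > 0.5 else "track (beta-like)"

frame = [(0, 0),                                    # single-pixel dot
         (5, 5), (6, 5), (5, 6), (6, 6), (7, 5),    # compact blob
         (20, 0), (21, 1), (22, 2), (23, 3)]        # diagonal track
for cl in clusters(frame):
    print(len(cl), classify(cl))
```

Real analyses refine this with energy-per-pixel information and more careful morphology cuts, but the grouping-then-shape pipeline is the same.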
Buchsbaum, Daphna; Seiver, Elizabeth; Bridgers, Sophie; Gopnik, Alison
2012-01-01
A major challenge children face is uncovering the causal structure of the world around them. Previous research on children's causal inference has demonstrated their ability to learn about causal relationships in the physical environment using probabilistic evidence. However, children must also learn about causal relationships in the social environment, including discovering the causes of other people's behavior, and understanding the causal relationships between others' goal-directed actions and the outcomes of those actions. In this chapter, we argue that social reasoning and causal reasoning are deeply linked, both in the real world and in children's minds. Children use both types of information together and in fact reason about both physical and social causation in fundamentally similar ways. We suggest that children jointly construct and update causal theories about their social and physical environment and that this process is best captured by probabilistic models of cognition. We first present studies showing that adults are able to jointly infer causal structure and human action structure from videos of unsegmented human motion. Next, we describe how children use social information to make inferences about physical causes. We show that the pedagogical nature of a demonstrator influences children's choices of which actions to imitate from within a causal sequence and that this social information interacts with statistical causal evidence. We then discuss how children combine evidence from an informant's testimony and expressed confidence with evidence from their own causal observations to infer the efficacy of different potential causes. We also discuss how children use these same causal observations to make inferences about the knowledge state of the social informant. Finally, we suggest that psychological causation and attribution are part of the same causal system as physical causation. 
We present evidence that just as children use covariation between physical causes and their effects to learn physical causal relationships, they also use covariation between people's actions and the environment to make inferences about the causes of human behavior.
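The probabilistic-models framing above can be made concrete with a small Bayesian update that combines an informant's testimony with the child's own causal observation. The hypothesis names, priors, and likelihoods below are invented for illustration and are not fitted to the chapter's experimental data:

```python
def update_cause_belief(prior, likelihoods, evidence):
    """Bayesian sketch of joint social + causal inference: multiply the
    prior by the likelihood of each piece of evidence, then normalize."""
    posterior = {}
    for cause, p in prior.items():
        for e in evidence:
            p *= likelihoods[cause][e]
        posterior[cause] = p
    z = sum(posterior.values())
    return {c: p / z for c, p in posterior.items()}

# Which block activates the toy? Testimony and observation both bear on it.
prior = {"block_A_works": 0.5, "block_B_works": 0.5}
likelihoods = {
    "block_A_works": {"informant_says_A": 0.8, "A_activates_toy": 0.9},
    "block_B_works": {"informant_says_A": 0.2, "A_activates_toy": 0.1},
}
post = update_cause_belief(prior, likelihoods,
                           ["informant_says_A", "A_activates_toy"])
print(post)  # belief in block A rises well above 0.9
```

Weakening the testimony likelihood (an unreliable or unconfident informant) lowers the posterior accordingly, which mirrors the chapter's point that children weight testimony against their own causal observations.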
Information Theoretic Characterization of Physical Theories with Projective State Space
NASA Astrophysics Data System (ADS)
Zaopo, Marco
2015-08-01
Probabilistic theories are a natural framework in which to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities, and state spaces are convex cones. In this picture the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over the complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every state of a system that is not completely mixed is perfectly distinguishable from some other state in a single-shot measurement; and the information capacity of physical systems is conserved under taking mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more mixed a state of the system is, the less information can be stored in the system using that state as a logical value. We show that in all theories satisfying the above assumptions, the cone of states has the shape of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory in which the superposition principle holds with coefficients pertaining to a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result shows that the superposition principle is quite common among probabilistic theories, while its absence indicates either classical theory or an implausible theory.
Physical limits on ground motion at Yucca Mountain
Andrews, D.J.; Hanks, T.C.; Whitney, J.W.
2007-01-01
Physical limits on possible maximum ground motion at Yucca Mountain, Nevada, the designated site of a high-level radioactive waste repository, are set by the shear stress available in the seismogenic depth of the crust and by limits on stress change that can propagate through the medium. We find in dynamic deterministic 2D calculations that maximum possible horizontal peak ground velocity (PGV) at the underground repository site is 3.6 m/sec, which is smaller than the mean PGV predicted by the probabilistic seismic hazard analysis (PSHA) at annual exceedance probabilities less than 10^-6 per year. The physical limit on vertical PGV, 5.7 m/sec, arises from supershear rupture and is larger than that from the PSHA down to 10^-8 per year. In addition to these physical limits, we also calculate the maximum ground motion subject to the constraint of known fault slip at the surface, as inferred from paleoseismic studies. Using a published probabilistic fault displacement hazard curve, these calculations provide a probabilistic hazard curve for horizontal PGV that is lower than that from the PSHA. In all cases the maximum ground motion at the repository site is found by maximizing constructive interference of signals from the rupture front, for physically realizable rupture velocity, from all parts of the fault. Vertical PGV is maximized for ruptures propagating near the P-wave speed, and horizontal PGV is maximized for ruptures propagating near the Rayleigh-wave speed. Yielding in shear with a Mohr-Coulomb yield condition reduces ground motion only a modest amount in events with supershear rupture velocity, because ground motion consists primarily of P waves in that case. The possibility of compaction of the porous unsaturated tuffs at the higher ground-motion levels is another attenuating mechanism that needs to be investigated.
A Tony Thomas-Inspired Guide to INSPIRE
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Connell, Heath B.; /Fermilab
2010-04-01
The SPIRES database was created in the late 1960s to catalogue the high-energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.
Bio-Inspired Self-Cleaning Surfaces
NASA Astrophysics Data System (ADS)
Liu, Kesong; Jiang, Lei
2012-08-01
Self-cleaning surfaces have drawn a lot of interest for both fundamental research and practical applications. This review focuses on the recent progress in mechanism, preparation, and application of self-cleaning surfaces. To date, self-cleaning has been demonstrated by the following four conceptual approaches: (a) TiO2-based superhydrophilic self-cleaning, (b) lotus effect self-cleaning (superhydrophobicity with a small sliding angle), (c) gecko setae-inspired self-cleaning, and (d) underwater organisms-inspired antifouling self-cleaning. Although a number of self-cleaning products have been commercialized, the remaining challenges and future outlook of self-cleaning surfaces are also briefly addressed. Through evolution, nature, which has long been a source of inspiration for scientists and engineers, has arrived at what is optimal. We hope this review will stimulate interdisciplinary collaboration among material science, chemistry, biology, physics, nanoscience, engineering, etc., which is essential for the rational design and reproducible construction of bio-inspired multifunctional self-cleaning surfaces in practical applications.
Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure
NASA Astrophysics Data System (ADS)
Tsai, C.; Yeh, J. J. J.
2017-12-01
A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and the major sources of uncertainty should be taken into account by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation maps. Various numbers of uncertain variables can be considered in these models, and the variability of their outputs can be quantified. Probabilistic flood inundation maps for dam-break-induced floods can thus be developed, with the output variability represented, using the widely used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared. These maps are expected to provide new physical insight in support of evaluating areas at risk of reservoir-induced flooding.
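The abstract does not spell out the perturbance moment method. As a rough illustration of how roughness-coefficient uncertainty propagates into inundation depth, here is a hypothetical Monte Carlo sketch using Manning's equation for a wide rectangular channel as a stand-in for a full HEC-RAS run; all parameter values and distributions are assumed, not taken from the study:

```python
import random

def flow_depth(n, discharge=500.0, width=80.0, slope=0.001):
    """Normal depth from Manning's equation for a wide rectangular
    channel: Q = (1/n) * W * h^(5/3) * sqrt(S), solved for h."""
    return (discharge * n / (width * slope ** 0.5)) ** 0.6

def depth_exceedance(threshold_m, n_low=0.025, n_high=0.045, trials=10_000):
    """P(depth > threshold) when Manning's n is uniformly uncertain."""
    rng = random.Random(0)
    hits = sum(
        flow_depth(rng.uniform(n_low, n_high)) > threshold_m
        for _ in range(trials)
    )
    return hits / trials
```

A probabilistic inundation map repeats this exceedance calculation cell by cell over the floodplain grid, with the dam-break hydrograph uncertainty added as a second random input.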
Başar, Erol; Güntekin, Bahar
2007-04-01
The Cartesian System is a fundamental conceptual and analytical framework related to and interwoven with the concept and applications of Newtonian dynamics. In order to analyze quantum processes, physicists moved to a probabilistic Cartesian System in which the causality principle became a probabilistic one. This means the trajectories of particles (obeying quantum rules) can be described only with the concept of cloudy wave packets. The approach to the brain-body-mind problem requires more than the prerequisites of modern physics and quantum dynamics. In the analysis of the brain-body-mind construct we have to include uncertain causalities, and consequently multiple uncertain causalities. These multiple causalities originate from (1) nonlinear properties of the vegetative system (e.g. irregularities in biochemical transmitters, cardiac output, turbulences in the vascular system, respiratory apnea, nonlinear oscillatory interactions in peristalsis); (2) nonlinear behavior of neuronal electricity (e.g. chaotic behavior measured by EEG); (3) genetic modulations; and (4), in addition to these physiological entities, nonlinear properties of physical processes in the body. The brain shows deterministic chaos with a correlation dimension of approximately D(2)=6, the smooth muscles approximately D(2)=3. Accordingly, we propose a hyper-probabilistic approach, or a hyper-probabilistic Cartesian System, to describe and analyze the processes in the brain-body-mind system. If we add such aspects as our sentiments, emotions and creativity to this already hyper-probabilistic construct, the resulting "New Cartesian System" is more than hyper-probabilistic: it is a nebulous system, and we can predict the future only in a nebulous way. Despite this chain of reasoning, we can still provide predictions on brain-body-mind incorporations.
We tentatively assume that the processes or mechanisms of the brain-body-mind system can be analyzed and predicted in the manner of the metaphor of "finding the walking path on a cloudy or foggy day". This is what we mean by "The Nebulous Cartesian System" (NCS). Descartes, in undertaking his step of genius, did not possess the knowledge of today's physiology and modern physics; we think the time has come to consider such a New Cartesian System. To this end, we propose the utilization of the Heisenberg S-Matrix and a modified version of the Feynman diagrams, which we call "Brain Feynman Diagrams". Another metaphor to consider within the oscillatory approach of the NCS is string theory. We also emphasize that fundamental steps should be undertaken to create a dynamical framework proper to the brain-body-mind incorporation; suggestions and metaphors from physics and mathematics are useful, but the grammar of the brain's intrinsic language must be understood with the help of a new, biologically founded, adaptive-probabilistic Cartesian system. This new Cartesian System will undergo mutations and transcend to the philosophy of Henri Bergson, in parallel to the evolution theory of Charles Darwin, to open gateways for approaching the brain-body-mind problem.
Utility of the Physical Examination in Detecting Pulmonary Hypertension. A Mixed Methods Study
Colman, Rebecca; Whittingham, Heather; Tomlinson, George; Granton, John
2014-01-01
Introduction Patients with pulmonary hypertension (PH) often present with a variety of physical findings reflecting a volume or pressure overloaded right ventricle (RV). However, there is no consensus regarding the diagnostic utility of the physical examination in PH. Methods We conducted a systematic review of publications that evaluated the clinical examination and diagnosis of PH using MEDLINE (1946–2013) and EMBASE (1947–2013). We also prospectively evaluated the diagnostic utility of the physical examination findings. Patients who underwent right cardiac catheterization for any reason were recruited. After informed consent, participants were examined by 6 physicians (3 “specialists” and 3 “generalists”) who were unaware of the results of the patient's hemodynamics. Each examiner independently assessed patients for the presence of a RV lift, loud P2, jugular venous distension (JVD), tricuspid insufficiency murmur and right-sided 4th heart sound at rest and during a slow inspiration. A global rating (scale of 1–5) of the likelihood that the patient had pulmonary hypertension was provided by each examiner. Results 31 articles that assessed the physical examination in PH were included in the final analysis. There was heterogeneity amongst the studies and many did not include control data. The sign most associated with PH in the literature was a loud pulmonic component of the second heart sound (P2). In our prospective study physical examination was performed on 52 subjects (25 met criteria for PH; mPAP ≥25 mmHg). The physical sign with the highest likelihood ratio (LR) was a loud P2 on inspiration with a LR +ve 1.9, 95% CrI [1.2, 3.1] when data from all examiners was analyzed together. Results from the specialist examiners had higher diagnostic utility; a loud P2 on inspiration was associated with a positive LR of 3.2, 95% CrI [1.5, 6.2] and a right sided S4 on inspiration had a LR +ve 4.7, 95% CI [1.0, 15.6]. 
No aspect of the physical exam could consistently rule out PH (negative LRs 0.7–1.3). Conclusions The presence of a loud P2 or an audible right-sided 4th heart sound is associated with PH. However, the physical examination is unreliable for determining the presence of PH. PMID:25343585
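The study's likelihood ratios come from the standard 2×2 diagnostic table. The sketch below shows the arithmetic with hypothetical cell counts (the paper's raw counts are not reported in the abstract, so the numbers here are purely illustrative):

```python
def likelihood_ratios(tp, fn, fp, tn):
    """Positive and negative likelihood ratios from a 2x2 table:
    LR+ = sensitivity / (1 - specificity)
    LR- = (1 - sensitivity) / specificity"""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity / (1 - specificity), (1 - sensitivity) / specificity

# Hypothetical counts for a sign such as "loud P2 on inspiration"
# in a 52-patient cohort (25 with PH):
lr_pos, lr_neg = likelihood_ratios(tp=15, fn=10, fp=8, tn=19)
```

An LR+ near 2, as found for most examiners here, shifts the post-test odds only modestly, which is consistent with the paper's conclusion that no single sign rules PH in or out.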
Postprocessing for Air Quality Predictions
NASA Astrophysics Data System (ADS)
Delle Monache, L.
2017-12-01
In recent years, air quality (AQ) forecasting has made significant progress towards better predictions, with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., with the development of new algorithms to account for fires in near real time). Nevertheless, AQ predictions are still affected at times by significant biases, which stem from limitations in both weather and chemistry transport models. These are the result of numerical approximations and of the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has been significantly improved, they remain one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced by postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions and to improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
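A Kalman-filter-inspired correction of the kind mentioned above can be sketched as a recursive estimate of the forecast bias, updated whenever a new observation arrives. This is a minimal scalar sketch; the noise-variance ratio is an assumed tuning parameter, not a value from the literature:

```python
class KalmanBias:
    """Recursive bias estimator for a forecast time series
    (Kalman-filter-inspired postprocessing; `ratio` controls memory)."""

    def __init__(self, ratio=0.1):
        self.bias = 0.0      # current bias estimate
        self.p = 1.0         # variance of the bias estimate
        self.ratio = ratio   # process/observation noise variance ratio

    def update(self, forecast, observation):
        # Predict step: bias assumed persistent, uncertainty grows.
        self.p += self.ratio
        # Update step: blend previous bias with today's forecast error.
        k = self.p / (self.p + 1.0)          # Kalman gain
        self.bias += k * ((forecast - observation) - self.bias)
        self.p *= (1.0 - k)
        return forecast - self.bias          # bias-corrected forecast
```

In operations the corrected value would be issued for the next forecast cycle; a small `ratio` forgets old errors slowly, a large one adapts quickly but is noisier.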
Random mechanics: Nonlinear vibrations, turbulences, seisms, swells, fatigue
NASA Astrophysics Data System (ADS)
Kree, P.; Soize, C.
The random modeling of physical phenomena, together with probabilistic methods for the numerical calculation of random mechanical forces, are analytically explored. Attention is given to theoretical examinations such as probabilistic concepts, linear filtering techniques, and trajectory statistics. Applications of the methods to structures experiencing atmospheric turbulence, the quantification of turbulence, and the dynamic responses of the structures are considered. A probabilistic approach is taken to study the effects of earthquakes on structures and to the forces exerted by ocean waves on marine structures. Theoretical analyses by means of vector spaces and stochastic modeling are reviewed, as are Markovian formulations of Gaussian processes and the definition of stochastic differential equations. Finally, random vibrations with a variable number of links and linear oscillators undergoing the square of Gaussian processes are investigated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai
2018-03-01
We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.
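A scalar caricature of this idea, combining a numerical forecast with historical statistics under Gaussian assumptions, is the precision-weighted blend below. The full model is multivariate and space-time correlated; the function name and values here are illustrative only:

```python
def blend_forecast(nwp_mean, nwp_var, clim_mean, clim_var):
    """Precision-weighted Gaussian blend of an NWP forecast with
    historical (climatological) statistics. Returns the mean and
    variance of the combined predictive distribution."""
    w = (1 / nwp_var) / (1 / nwp_var + 1 / clim_var)
    mean = w * nwp_mean + (1 - w) * clim_mean
    var = 1 / (1 / nwp_var + 1 / clim_var)
    return mean, var
```

The blended variance is always smaller than either input variance, which is the sense in which combining sources sharpens the probabilistic forecast.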
NASA Technical Reports Server (NTRS)
Taylor, Bill; Pine, Bill
2003-01-01
INSPIRE (Interactive NASA Space Physics Ionosphere Radio Experiment - http://image.gsfc.nasa.gov/poetry/inspire) is a non-profit scientific, educational organization whose objective is to bring the excitement of observing natural and manmade radio waves in the audio region to high school students and others. The project consists of building an audio frequency radio receiver kit, making observations of natural and manmade radio waves and analyzing the data. Students also learn about NASA and our natural environment through the study of lightning, the source of many of the audio frequency waves, the atmosphere, the ionosphere, and the magnetosphere where the waves travel.
Waismeyer, Anna; Meltzoff, Andrew N
2017-10-01
Infants learn about cause and effect through hands-on experience; however, they also can learn about causality simply from observation. Such observational causal learning is a central mechanism by which infants learn from and about other people. Across three experiments, we tested infants' observational causal learning of both social and physical causal events. Experiment 1 assessed infants' learning of a physical event in the absence of visible spatial contact between the causes and effects. Experiment 2 developed a novel paradigm to assess whether infants could learn about a social causal event from third-party observation of a social interaction between two people. Experiment 3 compared learning of physical and social events when the outcomes occurred probabilistically (happening some, but not all, of the time). Infants demonstrated significant learning in all three experiments, although learning about probabilistic cause-effect relations was most difficult. These findings about infant observational causal learning have implications for children's rapid nonverbal learning about people, things, and their causal relations.
A Physics-Inspired Introduction to Political Science
ERIC Educational Resources Information Center
Taagepera, Rein
1976-01-01
This paper analyzes what is involved in patterning part of an introduction to politics along the lines of physical sciences, and it presents contents and results of a course in which the author did this. (Author/ND)
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of the various factors driving migration and making predictions that could be useful for conservation.
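The paper's decision rules are not reproduced in the abstract. A minimal statistical-mechanics-style rule of the kind described would weight neighboring cells by a Boltzmann factor of the environmental potential; the grid, the potential function, and the inverse-temperature parameter `beta` below are all assumptions for illustration:

```python
import math
import random

def next_cell(position, potential, neighbors, beta=1.0, rng=random):
    """Pick the next cell with probability proportional to
    exp(-beta * potential(cell)) over the current cell's neighbors."""
    cells = neighbors(position)
    weights = [math.exp(-beta * potential(c)) for c in cells]
    r = rng.random() * sum(weights)
    for cell, w in zip(cells, weights):
        r -= w
        if r <= 0:
            return cell
    return cells[-1]
```

On a one-dimensional track with potential |x - goal|, a large `beta` makes the simulated bird drift deterministically toward the potential minimum, while a small `beta` approaches an undirected random walk; the two-parameter trade-off in the paper plays an analogous role.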
Superstring-inspired SO(10) GUT model with intermediate scale
NASA Astrophysics Data System (ADS)
Sasaki, Ken
1987-12-01
A new mechanism is proposed for the mixing of Weinberg-Salam Higgs fields in superstring-inspired SO(10) models with no SO(10) singlet fields. The higher-dimensional terms in the superpotential can generate both Higgs field mixing and a small mass for the physical neutrino. I would like to thank Professor C. Iso for hospitality extended to me at the Tokyo Institute of Technology.
A Transferrable Belief Model Representation for Physical Security of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gerts
This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision-making theory for nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
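For readers unfamiliar with the formalism, the core operation behind the TBM is Dempster's rule of combination, which fuses two belief-mass assignments and renormalizes away their conflict. The sketch below uses hypothetical sensor masses over a two-hypothesis frame; none of the numbers come from the report:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination; masses are dicts keyed by
    frozenset focal elements. Conflicting (disjoint) products are
    discarded and the remainder is renormalized."""
    combined, conflict = {}, 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Hypothetical: a sensor and a human observer, each assigning mass to
# "intrusion" and to the full frame {intrusion, nominal} (ignorance).
sensor = {frozenset({"intrusion"}): 0.6, frozenset({"intrusion", "nominal"}): 0.4}
human = {frozenset({"intrusion"}): 0.7, frozenset({"intrusion", "nominal"}): 0.3}
fused = dempster_combine(sensor, human)
```

Note how mass assigned to the whole frame models ignorance rather than disbelief, which is the feature that distinguishes belief functions from a Bayesian prior.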
NASA Astrophysics Data System (ADS)
Ronde, Christian De
In classical physics, probabilistic or statistical knowledge has always been related to ignorance or inaccurate subjective knowledge about an actual state of affairs. This idea has been extended to quantum mechanics through a completely incoherent interpretation of the Fermi-Dirac and Bose-Einstein statistics in terms of "strange" quantum particles. This interpretation, naturalized through a widespread "way of speaking" in the physics community, contradicts Born's physical account of Ψ as a "probability wave" which provides statistical information about outcomes that, in fact, cannot be interpreted in terms of `ignorance about an actual state of affairs'. In the present paper we discuss how the metaphysics of actuality has played an essential role in limiting the possibilities of understanding things differently. We propose instead a metaphysical scheme in terms of immanent powers with definite potentia which allows us to consider quantum probability in a new light, namely, as providing objective knowledge about a potential state of affairs.
Beyond quantum probability: another formalism shared by quantum physics and psychology.
Dzhafarov, Ehtibar N; Kujala, Janne V
2013-06-01
There is another meeting place for quantum physics and psychology, both within and outside of cognitive modeling. In physics it is known as the issue of classical (probabilistic) determinism, and in psychology it is known as the issue of selective influences. The formalisms independently developed in the two areas for dealing with these issues turn out to be identical, opening ways for mutually beneficial interactions.
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated with the study and forecast of PDCs. Consequently, the assessment of their hazard is better described by probabilistic approaches than by deterministic ones. In practice, probabilistic hazard from PDCs is assessed by coupling deterministic simulators with statistical techniques that can supply probabilities and quantify the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method lies in the balance between its computational cost, the physical reliability of the simulator, and the target of the hazard analysis (type of PDCs considered, time scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators.
In addition, forthcoming improvements could focus on three main multidisciplinary directions: 1) validating the frequently used simulators (through comparison with PDC deposits and with other simulators), 2) decreasing simulator runtimes (whether by improving knowledge of the physical processes or through more efficient programming, parallelization, ...), and 3) improving uncertainty quantification techniques.
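As a toy version of Monte Carlo sampling applied to the Energy Cone model: sample the collapse height H and the H/L mobility ratio, compute the runout L = H / (H/L), and count how often it reaches a given distance. All ranges and distributions below are assumed for illustration, not taken from any hazard study:

```python
import random

def energy_cone_runout(height_km, hl_ratio):
    """Energy-cone runout distance: L = H / (H/L)."""
    return height_km / hl_ratio

def exceedance_probability(distance_km, trials=10_000):
    """P(runout >= distance) under assumed uniform uncertainty in the
    collapse height and the H/L mobility ratio."""
    rng = random.Random(42)
    hits = 0
    for _ in range(trials):
        h = rng.uniform(0.5, 2.0)    # collapse height, km (assumed range)
        hl = rng.uniform(0.2, 0.4)   # H/L mobility ratio (assumed range)
        if energy_cone_runout(h, hl) >= distance_km:
            hits += 1
    return hits / trials
```

PCQ and BLE aim at the same exceedance curve but with far fewer simulator evaluations, which matters when each run is a TITAN2D simulation rather than a one-line formula.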
NASA Technical Reports Server (NTRS)
Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2004-01-01
This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
Young children do not succeed in choice tasks that imply evaluating chances.
Girotto, Vittorio; Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Blaye, Agnès
2016-07-01
Preverbal infants manifest probabilistic intuitions in their reactions to the outcomes of simple physical processes and in their choices. This ability conflicts with the evidence that, before the age of about 5 years, children's verbal judgments do not reveal probability understanding. To reconcile these conflicting results, three studies tested 3- to 5-year-olds on choice tasks on which infants perform successfully. The results showed that children of all age groups made optimal choices in tasks that did not require forming probabilistic expectations. In probabilistic tasks, however, only 5-year-olds made optimal choices. Younger children performed at random and/or were guided by superficial heuristics. These results suggest caution in interpreting infants' ability to evaluate chance, and indicate that the development of this ability may not follow a linear trajectory.
Semi-volatile pesticides, such as chlorpyrifos, can move about within a home environment after an application due to physical/chemical processes, resulting in concentration loadings in and on objects and surfaces. Children can be particularly susceptible to the effects of pest...
Physical Education as "Means without Ends:" Towards a New Concept of Physical Education
ERIC Educational Resources Information Center
Vlieghe, Joris
2013-01-01
This article is concerned with the educational value of raising the human body at school. Drawing inspiration from the work of Giorgio Agamben, I develop a new perspective that explores the possibility of taking the concept of physical education in a literal sense. This is to say that the specific educational content of physical education (in…
The Physics of "Copenhagen" for Students and the General Public.
ERIC Educational Resources Information Center
Bergstrom, L.; Johansson, K. E.; Nilsson, Ch.
2001-01-01
The play Copenhagen has attracted the attention of a large audience in several countries. The hypothetical discussion between two of the giants in physics, Niels Bohr and Werner Heisenberg, has inspired us to start a theoretical and experimental exploration of quantum physics. This theme has been used in Stockholm Science Laboratory for audiences…
No Space for Girliness in Physics: Understanding and Overcoming the Masculinity of Physics
ERIC Educational Resources Information Center
Götschel, Helene
2014-01-01
Allison Gonsalves' article on "women doctoral students' positioning around discourses of gender and competence in physics" explores narratives of Canadian women physicists concerning their strategies to gain recognition as physicists. In my response to her rewarding and inspiring analysis I will reflect on her findings and arguments and…
The Material Co-Construction of Hard Science Fiction and Physics
ERIC Educational Resources Information Center
Hasse, Cathrine
2015-01-01
This article explores the relationship between hard science fiction and physics and a gendered culture of science. Empirical studies indicate that science fiction references might spur some students' interest in physics and help develop this interest throughout school, into a university education and even further later inspire the practice of…
Sundew-Inspired Adhesive Hydrogels Combined with Adipose-Derived Stem Cells for Wound Healing
Sun, Leming; Huang, Yujian; Bian, Zehua; Petrosino, Jennifer; Fan, Zhen; Wang, Yongzhong; Park, Ki Ho; Yue, Tao; Schmidt, Michael; Galster, Scott; Ma, Jianjie; Zhu, Hua; Zhang, Mingjun
2016-01-01
The potential to harness the unique physical, chemical, and biological properties of the sundew (Drosera) plant’s adhesive hydrogels has long intrigued researchers searching for novel wound-healing applications. However, the ability to collect sufficient quantities of the sundew plant’s adhesive hydrogels is problematic and has eclipsed their therapeutic promise. Inspired by these natural hydrogels, we asked if sundew-inspired adhesive hydrogels could overcome the drawbacks associated with natural sundew hydrogels and be used in combination with stem-cell-based therapy to enhance wound-healing therapeutics. Using a bioinspired approach, we synthesized adhesive hydrogels comprised of sodium alginate, gum arabic, and calcium ions to mimic the properties of the natural sundew-derived adhesive hydrogels. We then characterized and showed that these sundew-inspired hydrogels promote wound healing through their superior adhesive strength, nanostructure, and resistance to shearing when compared to other hydrogels in vitro. In vivo, sundew-inspired hydrogels promoted a “suturing” effect to wound sites, which was demonstrated by enhanced wound closure following topical application of the hydrogels. In combination with mouse adipose-derived stem cells (ADSCs) and compared to other therapeutic biomaterials, the sundew-inspired hydrogels demonstrated superior wound-healing capabilities. Collectively, our studies show that sundew-inspired hydrogels contain ideal properties that promote wound healing and suggest that sundew-inspired-ADSCs combination therapy is an efficacious approach for treating wounds without eliciting noticeable toxicity or inflammation. PMID:26731614
Development of probabilistic regional climate scenario in East Asia
NASA Astrophysics Data System (ADS)
Dairaku, K.; Ueno, G.; Ishizaki, N. N.
2015-12-01
Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in East Asia (CORDEX-EA and Japan), the probability distribution of 2 m air temperature was estimated using a regression model developed for this purpose. The method can easily be applied to other regions and other physical quantities, and can also be used to downscale to finer scales, depending on the availability of observational datasets. Probabilistic climate information for the present (1969-1998) and future (2069-2098) climate was developed using 21 CMIP3 SRES A1b scenario models and observational data (CRU_TS3.22 and University of Delaware for CORDEX-EA; NIAES AMeDAS mesh data for Japan). The prototype of probabilistic information for CORDEX-EA and Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Appropriate combinations of statistical methods and optimization of climate ensemble experiments using multiple General Circulation Models (GCMs) and multiple regional climate models (RCMs) in ensemble downscaling experiments are investigated.
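The regression step described above can be sketched as follows. This is a minimal, single-predictor illustration (the abstract does not publish its regression model, so the function name, the use of the multi-model mean as predictor, and the Gaussian predictive form are all assumptions): fit observations against the multi-model ensemble mean over the historical period, and use the residual spread as the predictive uncertainty for the future period.

```python
import numpy as np

def probabilistic_projection(model_hist, obs_hist, model_future):
    """Fit obs ~ a + b * (multi-model mean) over the historical period,
    then form a Gaussian predictive distribution for the future period.
    model_hist, model_future: (n_models, n_years); obs_hist: (n_years,)."""
    x = model_hist.mean(axis=0)            # historical multi-model mean
    b, a = np.polyfit(x, obs_hist, 1)      # least-squares slope and intercept
    resid = obs_hist - (a + b * x)
    sigma = resid.std(ddof=2)              # residual spread -> predictive std
    mu = a + b * model_future.mean(axis=0).mean()
    return mu, sigma                       # predictive mean and std
```

A real system would add predictor selection and cross-validation; this sketch only shows how a probability distribution, rather than a point value, falls out of the regression residuals.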
A survey of snake-inspired robot designs.
Hopkins, James K; Spranklin, Brent W; Gupta, Satyandra K
2009-06-01
Body undulation used by snakes and the physical architecture of a snake body may offer significant benefits over typical legged or wheeled locomotion designs in certain types of scenarios. A large number of research groups have developed snake-inspired robots to exploit these benefits. The purpose of this review is to report different types of snake-inspired robot designs and categorize them based on their main characteristics. For each category, we discuss their relative advantages and disadvantages. This review will assist in familiarizing a newcomer to the field with the existing designs and their distinguishing features. We hope that by studying existing robots, future designers will be able to create new designs by adopting features from successful robots. The review also summarizes the design challenges associated with the further advancement of the field and deploying snake-inspired robots in practice.
A global empirical system for probabilistic seasonal climate prediction
NASA Astrophysics Data System (ADS)
Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.
2015-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
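The hindcasts above are scored with the continuous ranked probability score (CRPS). For a Gaussian forecast N(mu, sigma^2) the CRPS has a standard closed form; the sketch below implements that textbook formula (it is general background, not code from the system described):

```python
import math

def crps_gaussian(y, mu, sigma):
    """Continuous ranked probability score of a Gaussian forecast
    N(mu, sigma^2) against a scalar observation y (lower is better)."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)        # phi(z)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))                 # Phi(z)
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))
```

Averaging this score over all hindcast seasons, and comparing against a climatological reference forecast, yields the probabilistic skill scores mentioned in the abstract.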
An empirical system for probabilistic seasonal climate prediction
NASA Astrophysics Data System (ADS)
Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma
2016-04-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
An intelligent interactive simulator of clinical reasoning in general surgery.
Wang, S.; el Ayeb, B.; Echavé, V.; Preiss, B.
1993-01-01
We introduce an interactive computer environment for teaching in general surgery and for diagnostic assistance. The environment consists of a knowledge-based system coupled with an intelligent interface that allows users to acquire conceptual knowledge and clinical reasoning techniques. Knowledge is represented internally within a probabilistic framework and externally through an interface inspired by Concept Graphics (CGs). Given a set of symptoms, the internal knowledge framework computes the most probable set of diseases as well as the best alternatives. The interface displays CGs illustrating the results and prompting essential facts of a medical situation or a process. The system is then ready to receive additional information or to suggest further investigation. Based on the new information, the system will narrow the solutions with increased belief coefficients. PMID:8130508
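The abstract does not specify its probabilistic framework beyond posterior "belief coefficients". One common way to compute a ranked set of probable diseases from symptoms is a naive-Bayes posterior; the sketch below is hypothetical (the function names, the conditional-independence assumption, and the floor probability are mine, not the paper's):

```python
def rank_diseases(symptoms, priors, likelihoods):
    """Rank candidate diseases by posterior probability given observed
    symptoms, assuming conditional independence (naive Bayes).
    priors: {disease: P(d)}; likelihoods: {disease: {symptom: P(s|d)}}."""
    scores = {}
    for d, p in priors.items():
        for s in symptoms:
            p *= likelihoods[d].get(s, 0.01)   # small floor for unlisted symptoms
        scores[d] = p
    total = sum(scores.values())
    # Normalize and sort: best explanation first, alternatives after it.
    return sorted(((d, p / total) for d, p in scores.items()),
                  key=lambda kv: -kv[1])
```

Feeding additional findings into `symptoms` and re-ranking mirrors the described behavior of narrowing the solution set with increased belief.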
Flipping the Classroom Revisited
NASA Astrophysics Data System (ADS)
Riendeau, Diane
2013-02-01
I received many emails following the first column on flipping the classroom. Many of my local colleagues also approached me at our physics alliance, Physics Northwest. Teachers are very interested in this new pedagogy. As a result, I wanted to share some more videos to inspire you.
Allothetic and idiothetic sensor fusion in rat-inspired robot localization
NASA Astrophysics Data System (ADS)
Weitzenfeld, Alfredo; Fellous, Jean-Marc; Barrera, Alejandra; Tejera, Gonzalo
2012-06-01
We describe a spatial cognition model based on the rat's brain neurophysiology as a basis for new robotic navigation architectures. The model integrates allothetic (external visual landmarks) and idiothetic (internal kinesthetic information) cues to train either rat or robot to learn a path enabling it to reach a goal from multiple starting positions. It stands in contrast to most robotic architectures based on SLAM, where a map of the environment is built to provide probabilistic localization information computed from robot odometry and landmark perception. Allothetic cues suffer in general from perceptual ambiguity when trying to distinguish between places with equivalent visual patterns, while idiothetic cues suffer from imprecise motions and limited memory recalls. We experiment with both types of cues in different maze configurations by training rats and robots to find the goal starting from a fixed location, and then testing them to reach the same target from new starting locations. We show that the robot, after having pre-explored a maze, can find a goal with improved efficiency, and is able to (1) learn the correct route to reach the goal, (2) recognize places already visited, and (3) exploit allothetic and idiothetic cues to improve on its performance. We finally contrast our biologically-inspired approach to more traditional robotic approaches and discuss current work in progress.
On the unreasonable effectiveness of the post-Newtonian approximation in gravitational physics
Will, Clifford M.
2011-01-01
The post-Newtonian approximation is a method for solving Einstein’s field equations for physical systems in which motions are slow compared to the speed of light and where gravitational fields are weak. Yet it has proven to be remarkably effective in describing certain strong-field, fast-motion systems, including binary pulsars containing dense neutron stars and binary black hole systems inspiraling toward a final merger. The reasons for this effectiveness are largely unknown. When carried to high orders in the post-Newtonian sequence, predictions for the gravitational-wave signal from inspiraling compact binaries will play a key role in gravitational-wave detection by laser-interferometric observatories. PMID:21447714
The Schrödinger Sessions: Science for Science Fiction
NASA Astrophysics Data System (ADS)
Orzel, Chad; Edwards, Emily; Rolston, Steven
In July 2015, we held a workshop for 17 science fiction writers working in a variety of media at the Joint Quantum Institute at the University of Maryland, College Park. "The Schrödinger Sessions," funded by an outreach grant from APS, provided a three-day "crash course" on quantum physics and technology, including lectures from JQI scientists and tours of JQI labs. The goal was to better inform and inspire stories making use of quantum physics, as a means of outreach to inspire a broad audience of future scientists. We will report on the contents of the workshop, reactions from the attendees and presenters, and future plans. Funded by an Outreach Mini-Grant from the APS.
A review on robotic fish enabled by ionic polymer-metal composite artificial muscles.
Chen, Zheng
2017-01-01
A novel actuating material, which is lightweight, soft, and capable of generating large flapping motion under electrical stimuli, is highly desirable for building energy-efficient and maneuverable bio-inspired underwater robots. Ionic polymer-metal composites (IPMCs) are an important category of electroactive polymers, since they can generate large bending motions under low actuation voltages. IPMCs are ideal artificial muscles for small-scale and bio-inspired robots. This paper takes a system perspective to review recent work on IPMC-enabled underwater robots from modeling, fabrication, and bio-inspired design perspectives. First, a physics-based and control-oriented model of the IPMC actuator is reviewed. Second, a bio-inspired robotic fish propelled by an IPMC caudal fin is presented and a steady-state speed model of the fish is demonstrated. Third, a novel fabrication process for 3D actuating membranes is introduced and a bio-inspired robotic manta ray propelled by two IPMC pectoral fins is demonstrated. Fourth, a 2D maneuverable robotic fish propelled by multiple IPMC fins is presented. Last, advantages and challenges of using IPMC artificial muscles in bio-inspired robots are summarized.
Biologically inspired LED lens from cuticular nanostructures of firefly lantern
Kim, Jae-Jun; Lee, Youngseop; Kim, Ha Gon; Choi, Ki-Ju; Kweon, Hee-Seok; Park, Seongchong; Jeong, Ki-Hun
2012-01-01
Cuticular nanostructures found in insects effectively manage light for polarization, structural color, or optical index matching within an ultrathin natural scale. These nanostructures are mainly dedicated to managing incoming light and have recently inspired many imaging and display applications. A bioluminescent organ, such as a firefly lantern, helps to out-couple light from the body in a highly efficient fashion for delivering strong optical signals in sexual communication. However, the cuticular nanostructures, apart from the light-producing reactions, have not been well investigated for their physical principles or for engineering biomimetics. Here we report a unique observation of high-transmission nanostructures on a firefly lantern and its biological inspiration for highly efficient LED illumination. Both numerical and experimental results clearly reveal high transmission through the nanostructures inspired by the lantern cuticle. The nanostructures on an LED lens surface were fabricated by large-area nanotemplating and reconfigurable nanomolding with heat-induced shear thinning. The biologically inspired LED lens, distinct from a smooth-surface lens, substantially increases light transmission over the visible range, comparable to a conventional antireflection coating. This biological inspiration can offer new opportunities for increasing the light-extraction efficiency of high-power LED packages. PMID:23112185
Tie Goes to the Runner: The Physics and Psychology of a Close Play
ERIC Educational Resources Information Center
Starling, David J.; Starling, Sarah J.
2017-01-01
Since physics is often a service course for college students, it is important to incorporate everyday examples in the curriculum that inspire students of diverse backgrounds and interests. In this regard, baseball has been a workhorse for the physics classroom for a long time, taking the form of demonstrations and example problems. In this…
Towards a high-speed quantum random number generator
NASA Astrophysics Data System (ADS)
Stucki, Damien; Burri, Samuel; Charbon, Edoardo; Chunnilall, Christopher; Meneghetti, Alessio; Regazzoni, Francesco
2013-10-01
Randomness is of fundamental importance in various fields, such as cryptography, numerical simulations, or the gaming industry. Quantum physics, which is fundamentally probabilistic, is the best option for a physical random number generator. In this article, we present the work carried out in various projects in the context of the development of a commercial and certified high-speed random number generator.
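The article above does not describe its post-processing, but physical random number generators typically debias their raw bits classically before certification. A standard example of such a step is von Neumann extraction, shown here as general background rather than as this project's method:

```python
def von_neumann_extract(bits):
    """Remove bias from a raw bit stream: read non-overlapping pairs,
    map (0,1) -> 0 and (1,0) -> 1, and discard (0,0) and (1,1).
    The output is unbiased if the source bits are independent with a
    constant (possibly unknown) bias."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)      # first bit of an unequal pair is unbiased
    return out
```

The cost is throughput: at best one output bit per four input bits, which is one reason high-speed generators favor more efficient randomness extractors.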
INTO THE LAIR: GRAVITATIONAL-WAVE SIGNATURES OF DARK MATTER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macedo, Caio F. B.; Cardoso, Vitor; Crispino, Luis C. B.
The nature and properties of dark matter (DM) are both outstanding issues in physics. Besides clustering in halos, the universal character of gravity implies that self-gravitating compact DM configurations, predicted by various models, might be spread throughout the universe. Their astrophysical signature can be used to probe fundamental particle physics, or to test alternative descriptions of compact objects in active galactic nuclei. Here, we discuss the most promising dissection tool of such configurations: the inspiral of a compact stellar-size object and the consequent gravitational-wave (GW) emission. The inward motion of this "test probe" encodes unique information about the nature of the supermassive configuration. When the probe travels through some compact region we show, within a Newtonian approximation, that the quasi-adiabatic inspiral is mainly driven by DM accretion and by dynamical friction, rather than by radiation reaction. When accretion dominates, the frequency and amplitude of the GW signal produced during the latest stages of the inspiral are nearly constant. In the exterior region we study a model in which the inspiral is driven by GW and scalar-wave emission, described at a fully relativistic level. Resonances in the energy flux appear whenever the orbital frequency matches the effective mass of the DM particle, corresponding to the excitation of the central object's quasinormal frequencies. Unexpectedly, these resonances can lead to large dephasing with respect to standard inspiral templates, to such an extent as to prevent detection with matched-filtering techniques. We discuss some observational consequences of these effects for GW detection.
Exploration decisions and firms in the mineral industries
Attanasi, E.D.
1981-01-01
The purpose of this paper is to demonstrate how physical characteristics of deposits and results of past exploration enter future exploration decisions. A proposed decision model is presented that is consistent with a set of primitive probabilistic assumptions associated with deposit size distributions and discoverability. Analysis of optimal field exploration strategy showed the likely firm responses to alternative exploration taxes and effects on the distribution of future discoveries. Examination of the probabilistic elements of the decision model indicates that changes in firm expectations associated with the distribution of deposits cannot be totally offset by changes in economic variables. © 1981.
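To illustrate the kind of "primitive probabilistic assumption" on deposit sizes involved, suppose sizes follow a lognormal distribution (a common assumption in the resource-economics literature, not one stated in the abstract); the probability that a discovery exceeds an economic cutoff is then a one-line computation:

```python
import math

def prob_economic_discovery(cutoff, mu, sigma):
    """P(deposit size > cutoff) under a lognormal size distribution
    with log-mean mu and log-std sigma (all parameters illustrative)."""
    z = (math.log(cutoff) - mu) / sigma
    return 0.5 * (1 - math.erf(z / math.sqrt(2)))   # 1 - Phi(z)
```

Quantities like this feed an expected-value comparison between drilling another prospect and stopping, which is the shape of the decision model the paper analyzes.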
NASA Applications and Lessons Learned in Reliability Engineering
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Fuller, Raymond P.
2011-01-01
Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies are the reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.
NASA Astrophysics Data System (ADS)
Tsilanizara, A.; Gilardi, N.; Huynh, T. D.; Jouanne, C.; Lahaye, S.; Martinez, J. M.; Diop, C. M.
2014-06-01
The knowledge of the decay heat quantity and the associated uncertainties is an important issue for the safety of nuclear facilities. Many codes are available to estimate the decay heat; ORIGEN, FISPACT, and DARWIN/PEPIN2 are among them. MENDEL is a new depletion code developed at CEA, with a new software architecture, devoted to the calculation of physical quantities related to fuel cycle studies, in particular decay heat. The purpose of this paper is to present a probabilistic approach to assess the decay heat uncertainty due to the decay data uncertainties from nuclear data evaluations such as JEFF-3.1.1 or ENDF/B-VII.1. This probabilistic approach is based on both the MENDEL code and the URANIE software, a CEA uncertainty analysis platform. As preliminary applications, single thermal fissions of uranium-235 and plutonium-239 and a PWR UOx spent fuel cell are investigated.
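The probabilistic approach described, sampling decay data within their evaluated uncertainties and propagating to decay heat, can be sketched for a single nuclide as a toy Monte Carlo (MENDEL/URANIE handle full decay chains and covariance data; everything here, including the Gaussian sampling of the decay constant, is an illustrative simplification):

```python
import math
import random

def decay_heat_uncertainty(lam, lam_rel_unc, n0, e_per_decay, t,
                           n=20000, seed=1):
    """Monte Carlo propagation of a decay-constant uncertainty to the
    decay heat P(t) = lambda * N0 * exp(-lambda * t) * E of one nuclide.
    Returns (mean, std) of P(t) over the sampled ensemble."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        l = rng.gauss(lam, lam_rel_unc * lam)   # perturbed decay constant
        samples.append(l * n0 * math.exp(-l * t) * e_per_decay)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var ** 0.5
```

The empirical spread of the sampled decay heat plays the role of the uncertainty band reported by the full calculation.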
PEOPLE IN PHYSICS: Interview with Peter Barham
NASA Astrophysics Data System (ADS)
Membrey, Conducted by Jill
2000-03-01
Dr Peter Barham, Reader in Physics at the University of Bristol, is one of the first winners of an Institute of Physics Public Awareness of Physics award. These are intended for individuals or groups who have demonstrated excellence, inspiration and innovation in bringing physics to the public. In Dr Barham's case, he has been recognized for his very successful lecture demonstrations on the physics of food for a range of audiences, as well as for supporting and encouraging others to promote physics to the general public.
NASA Celebrates the World Year of Physics
NASA Technical Reports Server (NTRS)
Adams, M. L.
2005-01-01
Celebrating the World Year of Physics presents NASA with an opportunity to inform educators of the importance of physics in our everyday lives. Indeed, almost all NASA programs take advantage of physical concepts in some fashion. Special programs throughout the year, affiliated with the World Year of Physics, are identified to inform and inspire educators, students, and the general public. We will discuss these programs in detail and outline how educators may become more involved.
SHEDS-Multimedia is EPA's premier physically based, probabilistic model that can simulate cumulative or aggregate exposures for a population across a variety of multimedia, multipathway environmental chemicals.
Virulo is a probabilistic model for predicting virus attenuation. Monte Carlo methods are used to generate ensemble simulations of virus attenuation due to physical, biological, and chemical factors. The model generates a probability of failure to achieve a chosen degree o...
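The ensemble logic behind a "probability of failure" estimate like Virulo's can be sketched as follows. The Gaussian attenuation model and all parameter names are illustrative only; the actual model draws on detailed physical, biological, and chemical factors:

```python
import random

def prob_failure(target_log_reduction, mean_lr, sd_lr, n=50000, seed=7):
    """Monte Carlo estimate of the probability that simulated virus
    attenuation (log10 reduction, here drawn from a Gaussian) fails
    to reach the chosen target."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if rng.gauss(mean_lr, sd_lr) < target_log_reduction)
    return fails / n   # fraction of ensemble members that fail
```

Each ensemble member stands in for one simulated realization of attenuation; the failure fraction is the reported probability.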
NASA Astrophysics Data System (ADS)
Geramita, Matthew
2008-04-01
In 1924, Katherine Chamberlain became the first woman to receive a doctorate in physics from the University of Michigan. As one of the first women in the world to earn a doctorate in physics, Katherine reached a level of prominence in the scientific community that few women had achieved. As a scientist, Katherine studied the outer energy levels of various elements using x-ray spectroscopy at the University of Michigan. In her thesis, she showed the potential for x-rays to reduce highly oxidized compounds, and in 1925 she won the Ellen Richards Prize for the world's best scientific paper by a woman. As an educator, she taught an introduction-to-photography course for thirty-five years in the hope of creating new ways to inspire a love for physics in her students. As a community leader, she worked with The United World Federalists and The Michigan Memorial Phoenix Project to find peaceful uses for nuclear energy. Looking at these aspects of Chamberlain's life offers a unique perspective on the physics community of the 1920s, physics education, and the nuclear panic that followed WWII.
Niveaux d'étude du cerveau, et sagesse physique (Levels of study of the brain, and physical wisdom)
NASA Astrophysics Data System (ADS)
Toulouse, Gérard
1993-02-01
The brain is a complex spatio-temporal affair. Several brain theories propose the definition of three superposed levels of study. But physics, through the experience of condensed matter physics, suggests that it is unwise to enforce onto brain theories a unified hierarchical scheme, the inspiration for which seems to come from the realm of sub-molecular physics.
Neuromorphic computing enabled by physics of electron spins: Prospects and perspectives
NASA Astrophysics Data System (ADS)
Sengupta, Abhronil; Roy, Kaushik
2018-03-01
“Spintronics” refers to the understanding of the physics of electron spin-related phenomena. While most of the significant advancements in this field have been driven primarily by memory applications, recent research has demonstrated that various facets of the underlying physics of spin transport and manipulation can directly mimic the functionalities of the computational primitives in neuromorphic computation, i.e., the neurons and synapses. Given the potential of these spintronic devices to implement bio-mimetic computations at very low terminal voltages, several spin-device structures have been proposed as the core building blocks of neuromorphic circuits and systems to implement brain-inspired computing. Such an approach is expected to play a key role in circumventing the problems of ever-increasing power dissipation and hardware requirements for implementing neuro-inspired algorithms in conventional digital CMOS technology. Perspectives on spin-enabled neuromorphic computing, its status, challenges, and future prospects are outlined in this review article.
NASA Astrophysics Data System (ADS)
Aleardi, Mattia
2018-01-01
We apply a two-step probabilistic seismic-petrophysical inversion to the characterization of a clastic, gas-saturated reservoir located in the offshore Nile Delta. In particular, we discuss and compare the results obtained when two different rock-physics models (RPMs) are employed in the inversion. The first RPM is an empirical, linear model directly derived from the available well-log data by means of an optimization procedure. The second RPM is a theoretical, non-linear model based on the Hertz-Mindlin contact theory. The first step of the inversion procedure is a Bayesian linearized amplitude-versus-angle (AVA) inversion in which the elastic properties, and the associated uncertainties, are inferred from pre-stack seismic data. The estimated elastic properties constitute the input to the second step, a probabilistic petrophysical inversion in which we account for the noise contaminating the recorded seismic data and the uncertainties affecting both the derived rock-physics models and the estimated elastic parameters. In particular, a Gaussian mixture a priori distribution is used to properly take into account the facies-dependent behavior of petrophysical properties, related to the different fluid and rock properties of the different litho-fluid classes. In the synthetic and field data tests, the very minor differences between the results obtained by employing the two RPMs, and the good match between the estimated properties and well-log information, confirm the applicability of the inversion approach and the suitability of the two RPMs for reservoir characterization in the investigated area.
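The first step, Bayesian linearized inversion, has a closed-form Gaussian solution. Below is a sketch under a single-Gaussian prior (the paper uses a Gaussian mixture prior to capture facies dependence, which generalizes this; the function name and interface are illustrative):

```python
import numpy as np

def bayesian_linear_inversion(G, d, m_prior, C_m, C_d):
    """Posterior mean and covariance for a linear forward model
    d = G m + e, with Gaussian prior N(m_prior, C_m) on the model
    and Gaussian noise N(0, C_d) on the data."""
    K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_d)   # gain matrix
    m_post = m_prior + K @ (d - G @ m_prior)             # posterior mean
    C_post = C_m - K @ G @ C_m                           # posterior covariance
    return m_post, C_post
```

In the two-step scheme, `m_post` and `C_post` from the elastic (AVA) step become the uncertain input to the petrophysical step, so estimation uncertainty is carried through rather than discarded.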
ERIC Educational Resources Information Center
Mylott, Elliot; Kutschera, Ellynne; Dunlap, Justin C.; Christensen, Warren; Widenhorn, Ralf
2016-01-01
We will describe a one-quarter pilot algebra-based introductory physics course for pre-health and life science majors. The course features videos with biomedical experts and cogent biomedically inspired physics content. The materials were used in a flipped classroom as well as an all-online environment where students interacted with multimedia…
A Useful Demonstration of Calculus in a Physics High School Laboratory
ERIC Educational Resources Information Center
Alvarez, Gustavo; Schulte, Jurgen; Stockton, Geoffrey; Wheeler, David
2018-01-01
The real power of calculus is revealed when it is applied to actual physical problems. In this paper, we present a calculus inspired physics experiment suitable for high school and undergraduate programs. A model for the theory of the terminal velocity of a falling body subject to a resistive force is developed and its validity tested in an…
Martin Gardner: 100 Years of the Magic of Physics
ERIC Educational Resources Information Center
Gillaspy, John D.
2014-01-01
2014 marks the 100th anniversary of the birth of Martin Gardner, a man whose writings helped inspire generations of young students to pursue a career in physics and mathematics. From his first to his last, and many in between, Gardner's publications often combined magic and science. A recurring theme was the clever use of physical principles…
Moving from idea to action: promoting physical activity by empowering adolescents.
Lindqvist, Anna-Karin; Mikaelsson, Katarina; Westerberg, Mats; Gard, Gunvor; Kostenius, Catrine
2014-11-01
Physical activity provides fundamental health benefits for children and youth. The aim of the study was to explore the possibility of conducting an empowerment-inspired intervention and examine the impact of the intervention in promoting moderate and vigorous physical activity (MVPA) among adolescents. A nonrandomized trial with a concurrent control group was carried out. Physical activity data were collected before and after the intervention with daily questions by short message service. Self-efficacy, social support, and attitude were also measured before and after the intervention since they were possible mediators. The intervention was created by the students, the researchers, and the teachers using an empowerment-based approach. Students in the intervention group (n = 21) increased their MVPA on average by 4.9 (SD = 28.9) minutes per day, and students in the control group (n = 25) reduced their MVPA on average by 25.4 (SD = 23.0) minutes per day (p < .001). The intervention might have contributed to a promotion of physical activity among students in the intervention group. The most valuable contribution this study provides is the knowledge that it is possible to develop and conduct an empowerment-inspired intervention to promote adolescent physical activity. © 2014 Society for Public Health Education.
Quo vadimus? - Much hard work is still needed
NASA Astrophysics Data System (ADS)
Toffoli, Tommaso
1998-09-01
Physical aspects of computation that just a few years ago appeared tentative and tenuous, such as energy recycling in computation and quantum computation, have now grown into full-fledged scientific businesses. Conversely, concepts born within physics, such as entropy and phase transitions, are now fully at home in computational contexts quite unrelated to physics. Countless symposia cannot exhaust the wealth of research that is turning up in these areas. The “Physics of Computation” workshops cannot and should not try to be an exhaustive forum for these more mature areas. I think it would be to everyone's advantage if the workshops tried to play a more specialized and more critical role; namely, to venture into uncharted territories and to do so with a sense of purpose and of direction. Here I briefly suggest a few possibilities; among these, the need to construct a general, model-independent concept of “amount of computation”, much as we already have one for “amount of information”. I suspect that, much as the inspiration and prototype for the latter was found in physical entropy, so the inspiration and prototype for the former will be found in physical action.
NEXT GENERATION MULTIMEDIA/MULTIPATHWAY EXPOSURE MODELING
The Stochastic Human Exposure and Dose Simulation model for pesticides (SHEDS-Pesticides) supports the efforts of EPA to better understand human exposures and doses to multimedia, multipathway pollutants. It is a physically-based, probabilistic computer model that predicts, for u...
Toward an Integrated Design, Inspection and Redundancy Research Program.
1984-01-01
William Creelman, National Marine Service, St. Louis, Missouri; William H. Silcox, Standard Oil Company of California, San Francisco, California. ... develop physical models and generic tools for analyzing the effects of redundancy, reserve strength, and residual strength on the system behavior of marine ... probabilistic analyses to be applicable to real-world problems, this program needs to provide the deterministic physical models and generic tools upon
High-Resolution Underwater Mapping Using Side-Scan Sonar
2016-01-01
The goal of this study is to generate high-resolution sea floor maps using a Side-Scan Sonar (SSS). This is achieved by explicitly taking into account the SSS operation as follows. First, the raw sensor data is corrected by means of a physics-based SSS model. Second, the data is projected onto the sea floor; the errors involved in this projection are thoroughly analysed. Third, a probabilistic SSS model is defined and used to estimate the probability of each sea-floor region being observed. This probabilistic information is then used to weight the contribution of each SSS measurement to the map. Thanks to these models, arbitrary map resolutions can be achieved, even beyond the sensor resolution. Finally, a geometric map building method is presented and combined with the probabilistic approach. The resulting map is composed of two layers. The echo intensity layer holds the most likely echo intensities at each point on the sea floor. The probabilistic layer contains information about how confident the user or higher control layers can be about the echo intensity layer data. Experiments were conducted in a large subsea region. PMID:26821379
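The probability-weighted fusion described in this abstract can be sketched in a few lines. The echo intensities, per-measurement observation probabilities, and the independence assumption between passes below are illustrative choices, not values or formulas taken from the paper.

```python
import numpy as np

# Probability-weighted fusion of side-scan echo intensities into one map cell:
# each measurement contributes in proportion to the estimated probability that
# its beam actually observed the cell (all numbers are illustrative).
echoes = np.array([0.82, 0.40, 0.78])        # echo intensities from 3 passes
p_observed = np.array([0.9, 0.1, 0.8])       # per-measurement observation prob.

# Echo-intensity layer: probability-weighted mean of the measurements.
intensity = np.average(echoes, weights=p_observed)

# Confidence layer: one simple aggregate, the probability that at least one
# measurement observed the cell (assuming independence between passes).
confidence = 1.0 - np.prod(1.0 - p_observed)

print(f"fused intensity = {intensity:.3f}, confidence = {confidence:.3f}")
```

Note how the low-probability middle measurement barely moves the fused value, which is the intended effect of the probabilistic weighting.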
Assessing Expertise in Introductory Physics Using Categorization Task
ERIC Educational Resources Information Center
Mason, Andrew; Singh, Chandralekha
2011-01-01
The ability to categorize problems based upon underlying principles, rather than surface features or contexts, is considered one of several proxy predictors of expertise in problem solving. With inspiration from the classic study by Chi, Feltovich, and Glaser, we assess the distribution of expertise among introductory physics students by asking…
Problematizing as a Scientific Endeavor
ERIC Educational Resources Information Center
Phillips, Anna McLean; Watkins, Jessica; Hammer, David
2017-01-01
The work of physics learners at all levels revolves around problems. Physics education research has inspired attention to the forms of these problems, whether conceptual or algorithmic, closed or open response, well or ill structured. Meanwhile, it has been the work of curriculum developers and instructors to develop these problems. Physics…
filltex: Automatic queries to ADS and INSPIRE databases to fill LaTeX bibliography
NASA Astrophysics Data System (ADS)
Gerosa, Davide; Vallisneri, Michele
2017-05-01
filltex is a simple tool to fill LaTeX reference lists with records from the ADS and INSPIRE databases. ADS and INSPIRE are the most common databases used among the theoretical physics and astronomy scientific communities, respectively. filltex automatically looks for all citation labels present in a tex document and, by means of web-scraping, downloads all the required citation records from either of the two databases. filltex significantly speeds up the LaTeX scientific writing workflow, as all required actions (compile the tex file, fill the bibliography, compile the bibliography, compile the tex file again) are automated in a single command. We also provide an integration of filltex for the macOS LaTeX editor TexShop.
The physics of Copenhagen for students and the general public
NASA Astrophysics Data System (ADS)
Bergström, L.; Johansson, K. E.; Nilsson, Ch
2001-09-01
The play Copenhagen has attracted the attention of a large audience in several countries. The hypothetical discussion in Copenhagen between two of the giants in physics, Niels Bohr and Werner Heisenberg, has inspired us to start a theoretical and experimental exploration of quantum physics. This theme has been used in Stockholm Science Laboratory for audiences of both students and the general public.
Tie Goes to the Runner: The Physics and Psychology of a Close Play
NASA Astrophysics Data System (ADS)
Starling, David J.; Starling, Sarah J.
2017-04-01
Since physics is often a service course for college students, it is important to incorporate everyday examples in the curriculum that inspire students of diverse backgrounds and interests. In this regard, baseball has been a workhorse for the physics classroom for a long time, taking the form of demonstrations and example problems. Here, we discuss how baseball can help bridge the physical and social sciences in an introductory physics course by analyzing a close play at first base.
NASA Astrophysics Data System (ADS)
Sotolongo-Costa, O.; Gaggero-Sager, L. M.; Becker, J. T.; Maestu, F.; Sotolongo-Grau, O.
2017-04-01
Aging-associated brain decline often results in some kind of dementia. Even though this is a complex brain disorder, a physical model can be used to describe its general behavior. A probabilistic model for the development of dementia is obtained and fitted to experimental data from the Alzheimer's Disease Neuroimaging Initiative. It is explained how dementia appears as a consequence of aging and why it is irreversible.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling a set of parameter values. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e., to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Probabilistic Multi-Hazard Assessment of Dry Cask Structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bencturk, Bora; Padgett, Jamie; Uddin, Rizwan
In dry cask storage systems, the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made events (blast incidents). Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. As evidenced by deteriorating concrete bridges, visible degradation mechanisms of dry storage systems have been reported, especially in highly corrosive maritime environments. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage need to include the effect of chloride penetration, alkali-aggregate reaction, as well as corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. S. Schroeder; R. W. Youngblood
The Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability Program is developing simulation-based methods and tools for analyzing safety margin from a modern perspective. [1] There are multiple definitions of 'margin.' One class of definitions defines margin in terms of the distance between a point estimate of a given performance parameter (such as peak clad temperature) and a point-value acceptance criterion defined for that parameter (such as 2200 F). The present perspective on margin is that it relates to the probability of failure, and not just the distance between a nominal operating point and a criterion. In this work, margin is characterized through a probabilistic analysis of the 'loads' imposed on systems, structures, and components, and their 'capacity' to resist those loads without failing. Given the probabilistic load and capacity spectra, one can assess the probability that load exceeds capacity, leading to component failure. Within the project, we refer to a plot of these probabilistic spectra as 'the logo.' Refer to Figure 1 for a notional illustration. The implications of referring to 'the logo' are (1) RISMC is focused on being able to analyze load and capacity spectra probabilistically, and (2) calling it 'the logo' tacitly acknowledges that it is a highly simplified picture: meaningful analysis of a given component failure mode may require development of probabilistic spectra for multiple physical parameters, and in many practical cases, 'load' and 'capacity' will not vary independently.
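The load-versus-capacity picture can be illustrated with a minimal Monte Carlo sketch: sample both probabilistic spectra and estimate the probability that load exceeds capacity. The normal distributions and their parameters below are illustrative assumptions, not RISMC values, and the independence of load and capacity is itself a simplification the abstract warns about.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative probabilistic spectra: peak clad temperature "load" vs.
# component "capacity" (both in deg F); the parameters are assumptions.
load = rng.normal(loc=1800.0, scale=150.0, size=n)
capacity = rng.normal(loc=2200.0, scale=100.0, size=n)

# Margin as a failure probability, P(load > capacity), rather than the
# point distance between a nominal operating point and a criterion.
p_fail = np.mean(load > capacity)
print(f"estimated P(load > capacity) = {p_fail:.4f}")
```

Even though the nominal distance (2200 - 1800 = 400 F) looks comfortable, the overlap of the two spectra yields a nonzero failure probability, which is the point of the probabilistic view of margin.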
Physicists Get INSPIREd: INSPIRE Project and Grid Applications
NASA Astrophysics Data System (ADS)
Klem, Jukka; Iwaszkiewicz, Jan
2011-12-01
INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.
Teaching the Physics of Energy while Traveling by Train
ERIC Educational Resources Information Center
Hay, Katrina
2013-01-01
Pacific Lutheran University (Tacoma, WA) is renowned for the number of its courses that offer international and study-away opportunities. Inspired by the theme of sustainability, and my growing concern about the environmental impact of conventional fuels, I offered a course, Physics of Energy, for the first time during PLU's January 2011 term (a…
Federal outdoor recreation trends: Effects on economic opportunities
Eric M. White; Michael Bowker; Ashley E. Askew; Linda L. Langner; J. Ross Arnold; Don English
2015-01-01
Outdoor recreation plays a significant role in American lives. It provides physical challenges and well-being, helps develop lifelong skills, provokes interest and inquiry, inspires wonder and awe of the natural world, and often provides an alternative to daily routines. Recreation contributes greatly to the physical, mental, and spiritual health of individuals, bonds...
Global Warming: Lessons from Ozone Depletion
ERIC Educational Resources Information Center
Hobson, Art
2010-01-01
My teaching and textbook have always covered many physics-related social issues, including stratospheric ozone depletion and global warming. The ozone saga is an inspiring good-news story that's instructive for solving the similar but bigger problem of global warming. Thus, as soon as students in my physics literacy course at the University of…
Astronomy-inspired Atomic and Molecular Physics
NASA Astrophysics Data System (ADS)
Rau, A. R. P.
2002-02-01
Aimed at senior undergraduate and first-year graduate students in departments of physics and astronomy, this textbook gives a systematic treatment of atomic and molecular structure and spectra, together with the effect of weak and strong external electromagnetic fields. Topics chosen are those of interest in astronomy and indeed many were inspired by specific astronomical contexts. Examples include the negative ion of hydrogen and the effects of strong magnetic fields such as those occurring on certain white dwarfs and neutron stars. Adiabatic and non-adiabatic handling of electron correlations and application to processes such as dielectronic recombination are included. Astronomical examples are provided throughout, as well as end-of-chapter problems and exercises. Over seventy illustrative diagrams complete this unique and comprehensive volume. Link: http://www.wkap.nl/prod/b/1-4020-0467-2
Self-assembled hierarchically structured organic-inorganic composite systems.
Tritschler, Ulrich; Cölfen, Helmut
2016-05-13
Designing bio-inspired, multifunctional organic-inorganic composite materials is one of the most popular current research objectives. Due to the high complexity of biocomposite structures found in nacre and bone, for example, a one-pot scalable and versatile synthesis approach addressing structural key features of biominerals and affording bio-inspired, multifunctional organic-inorganic composites with advanced physical properties is highly challenging. This article reviews recent progress in synthesizing organic-inorganic composite materials via various self-assembly techniques and in this context highlights a recently developed bio-inspired synthesis concept for the fabrication of hierarchically structured, organic-inorganic composite materials. This one-step self-organization concept based on simultaneous liquid crystal formation of anisotropic inorganic nanoparticles and a functional liquid crystalline polymer turned out to be simple, fast, scalable and versatile, leading to various (multi-)functional composite materials, which exhibit hierarchical structuring over several length scales. Consequently, this synthesis approach is relevant for further progress and scientific breakthrough in the research field of bio-inspired and biomimetic materials.
Tohyama, Satsuki; Usuki, Fusako
2015-01-01
We report a case of a patient with severe ataxia and visual disturbance due to vitamin E deficiency, whose self-efficacy was inspired by intervention with an appropriate occupational therapy activity. Before the handloom intervention, her severe neurological deficits decreased her activities of daily living (ADL) ability, which made her feel pessimistic and depressed. The use of a handloom, however, inspired her sense of accomplishment because she could perform the weft movement by using her residual physical function, thereby relieving her pessimistic attitude. This perception of capability motivated her to participate in further rehabilitation. Finally, her eager practice enhanced her ADL ability and quality of life (QOL). The result suggests that it is important to provide an appropriate occupational therapy activity that can inspire self-efficacy in patients with chronic refractory neurological disorders because the perception of capability can enhance the motivation to improve performance in general activities, ADL ability and QOL. PMID:25666249
NASA Astrophysics Data System (ADS)
Gromek, Katherine Emily
A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e., PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate, or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
NASA Astrophysics Data System (ADS)
Ghasemi, A.; Borhani, S.; Viparelli, E.; Hill, K. M.
2017-12-01
The Exner equation provides a formal mathematical link between sediment transport and bed morphology. It is typically represented in a discrete formulation in which there is a sharp geometric interface between the bedload layer and the bed, below which no particles are entrained. For highly temporally and spatially resolved models this is strictly correct, but it is typically applied in such a way that spatial and temporal fluctuations in the bed surface (bedforms and otherwise) are not captured. This limits the extent to which the exchange between particles in transport and the sediment bed is properly represented, which is particularly problematic for mixed grain size distributions that exhibit segregation. Nearly two decades ago, Parker (2000) provided a framework for a solution to this dilemma in the form of a probabilistic Exner equation, partially validated experimentally by Wong et al. (2007). We present a computational study designed to develop a physics-based framework for understanding the interplay between physical parameters of the bed and flow and parameters in the Parker (2000) probabilistic formulation. To do so, we use Discrete Element Method simulations to relate local time-varying parameters to long-term macroscopic parameters. These include relating local grain size distribution and particle entrainment and deposition rates to long-term average bed shear stress and the standard deviation of bed height variations. While relatively simple, these simulations reproduce long-accepted, empirically determined transport behaviors such as the Meyer-Peter and Muller (1948) relationship. We also find that these simulations reproduce statistical relationships proposed by Wong et al. (2007), such as a Gaussian distribution of bed heights whose standard deviation increases with increasing bed shear stress. We demonstrate how the ensuing probabilistic formulations provide insight into the transport and deposition of both narrow and wide grain size distributions.
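For reference, the Meyer-Peter and Muller (1948) relationship that the simulations reproduce is commonly written in dimensionless form as q* = 8 (tau* - tau*_c)^(3/2), with critical Shields stress tau*_c = 0.047. A minimal sketch, where the Shields-stress values fed in are illustrative:

```python
import numpy as np

# Classic Meyer-Peter and Muller (1948) fit: dimensionless bedload rate q*
# as a function of Shields stress tau*, zero below the critical value.
TAU_CRIT = 0.047

def mpm_transport(tau_star: np.ndarray) -> np.ndarray:
    """Dimensionless (Einstein) bedload number q* from Shields stress tau*."""
    excess = np.clip(tau_star - TAU_CRIT, 0.0, None)
    return 8.0 * excess ** 1.5

tau = np.array([0.03, 0.06, 0.10, 0.20])   # illustrative Shields stresses
for t, q in zip(tau, mpm_transport(tau)):
    print(f"tau* = {t:.2f} -> q* = {q:.4f}")
```

The threshold behavior (no transport below tau*_c) is exactly the sharp on/off interface that the probabilistic Exner formulation relaxes.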
Random matrix approach to group correlations in development country financial market
NASA Astrophysics Data System (ADS)
Qohar, Ulin Nuha Abdul; Lim, Kyuseong; Kim, Soo Yong; Liong, The Houw; Purqon, Acep
2015-12-01
The financial market is a borderless economic activity; everyone in the world has the right to participate in stock transactions. The movement of stocks is interesting to discuss across various sciences: researchers ranging from economists to mathematicians try to explain and predict stock movement. Econophysics is a discipline that studies economic behavior using, among others, methods from particle physics to explain stock movement. Stocks tend to be unpredictable and can be regarded as probabilistic particles. Random Matrix Theory is one method used to analyze such probabilistic particles; here it is applied to the correlation matrix of stock movements in developing-country stock markets in order to obtain the characteristics of those markets, using the characteristics of developed-country stock markets as a benchmark for comparison. The results show that the market-wide effect does not appear in the Philippine market and is weak in the Indonesian market. In contrast, a developed-country market (US) exhibits a strong market-wide effect.
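The random-matrix procedure behind this kind of analysis, comparing correlation-matrix eigenvalues against the Marchenko-Pastur bound to detect a market-wide mode, can be illustrated on synthetic one-factor returns. The data, factor loading, and dimensions below are assumptions for the sketch, not the market series studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_stocks, n_days = 50, 1000

# Synthetic returns with one common "market" factor (illustrative data).
market = rng.normal(size=n_days)
returns = 0.4 * market + rng.normal(size=(n_stocks, n_days))

# Normalize each series, build the correlation matrix, get its eigenvalues.
z = (returns - returns.mean(axis=1, keepdims=True)) / returns.std(axis=1, keepdims=True)
corr = z @ z.T / n_days
eigvals = np.linalg.eigvalsh(corr)          # ascending order

# Marchenko-Pastur upper bound for purely random correlations (Q = T/N).
q = n_days / n_stocks
lambda_max = (1 + np.sqrt(1 / q)) ** 2

# Eigenvalues above the bound signal genuine group/market-wide correlations.
n_informative = int(np.sum(eigvals > lambda_max))
print(f"largest eigenvalue = {eigvals[-1]:.2f}, MP bound = {lambda_max:.2f}")
print(f"eigenvalues above random-matrix bound: {n_informative}")
```

A strong market-wide effect shows up as one eigenvalue far above the Marchenko-Pastur bulk; a weak or absent effect leaves the spectrum close to the random-matrix prediction.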
Robust Control Design for Systems With Probabilistic Uncertainty
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.
2005-01-01
This paper presents a reliability- and robustness-based formulation for robust control synthesis for systems with probabilistic uncertainty. In a reliability-based formulation, the probability of violating design requirements prescribed by inequality constraints is minimized. In a robustness-based formulation, a metric which measures the tendency of a random variable/process to cluster close to a target scalar/function is minimized. A multi-objective optimization procedure, which combines stability and performance requirements in the time and frequency domains, is used to search for robustly optimal compensators. Some of the fundamental differences between the proposed strategy and conventional robust control methods are: (i) unnecessary conservatism is eliminated since there is no need for convex supports, (ii) the most likely plants are favored during synthesis, allowing for probabilistic robust optimality, (iii) the tradeoff between robust stability and robust performance can be explored numerically, (iv) the uncertainty set is closely related to parameters with clear physical meaning, and (v) compensators with improved robust characteristics for a given control structure can be synthesized.
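The reliability-based idea, minimizing the probability of violating an inequality requirement over the distribution of uncertain plant parameters, can be sketched for a hypothetical first-order plant. The plant, the parameter distribution, and the pole-placement requirement below are illustrative assumptions, not the systems treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Uncertain first-order plant x' = -a*x + u under proportional control
# u = -k*x. The closed-loop pole is -(a + k); the illustrative requirement
# is that the pole lie left of -2, i.e. a + k > 2.
a_samples = rng.normal(loc=1.0, scale=0.5, size=50_000)

def violation_probability(k: float) -> float:
    """Reliability metric: estimated P(requirement violated) = P(a + k < 2)."""
    return float(np.mean(a_samples + k < 2.0))

# A reliability-based design prefers the gain with the smaller violation
# probability, weighting the most likely plants rather than the worst case.
for k in (0.5, 1.5, 3.0):
    print(f"k = {k}: P(violation) = {violation_probability(k):.4f}")
```

In a full synthesis this violation probability would be one objective in the multi-objective search over compensator parameters.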
Effective Practices for Training and Inspiring High School Physics Teachers
NASA Astrophysics Data System (ADS)
Magee-Sauer, Karen
It is well-documented that there is a nationwide shortage of highly qualified high school physics teachers. Not surprising, institutions of higher education report that the most common number of physics teacher graduates is zero, with the majority of institutions graduating less than two physics teachers per year. With these statistics in mind, it is critical that institutions take a careful look at how they recruit, train, and contribute to the retention of high school physics teachers. PhysTEC is a partnership between the APS and AAPT that is dedicated to improving and promoting the education of high school physics teachers. Primarily funded by the NSF and its partnering organizations, PhysTEC has identified key components that are common to successful physics teacher preparation programs. While creating a successful training program in physics, it is also important that students have the opportunity for a "do-able" path to certification that does not add further financial debt. This talk will present an overview of "what works" in creating a path for physics majors to a high school physics teaching career, actions and activities that help train and inspire pre-service physics teachers, and frameworks that provide the support for in-service teachers. Obstacles to certification and the importance of a strong partnership with colleges of education will be discussed. Several examples of successful physics high school teacher preparation programs will be presented. This material is part of the Physics Teacher Education Coalition project, which is based upon work supported by the National Science Foundation under Grant Nos. 0808790, 0108787, and 0833210.
MODELING HUMAN EXPOSURES AND DOSE USING A 2-DIMENSIONAL MONTE-CARLO MODEL (SHEDS)
Since 1998, US EPA's National Exposure Research Laboratory (NERL) has been developing the Stochastic Human Exposure and Dose Simulation (SHEDS) model for various classes of pollutants. SHEDS is a physically-based probabilistic model intended for improving estimates of human ex...
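The two-dimensional Monte Carlo structure underlying SHEDS-type models, an outer loop over *uncertainty* in model inputs and an inner loop over inter-individual *variability*, can be sketched as follows. All distributions and parameter values here are illustrative assumptions for the sketch, not SHEDS inputs.

```python
import numpy as np

rng = np.random.default_rng(4)

# Outer loop: sample uncertainty in a model input (here, the log-scale mean
# of an exposure factor). Inner loop: sample variability across a simulated
# population of individuals. Both distributions are illustrative.
n_uncertainty, n_population = 200, 1000

percentiles_95 = np.empty(n_uncertainty)
for i in range(n_uncertainty):
    mu = rng.normal(loc=1.0, scale=0.2)                      # uncertain mean
    exposures = rng.lognormal(mean=mu, sigma=0.5, size=n_population)
    percentiles_95[i] = np.percentile(exposures, 95)

# Report uncertainty bounds on the population 95th-percentile exposure.
lo, hi = np.percentile(percentiles_95, [5, 95])
print(f"95th-percentile exposure: 90% uncertainty interval = [{lo:.2f}, {hi:.2f}]")
```

Separating the two loops lets the analyst state not just a high-end population exposure but also how uncertain that estimate is, which is the point of the 2-D design.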
A PROBABILISTIC METHOD FOR ESTIMATING MONITORING POINT DENSITY FOR CONTAINMENT SYSTEM LEAK DETECTION
The use of physical and hydraulic containment systems for the isolation of contaminated ground water and aquifer materials associated with hazardous waste sites has increased during the last decade. The existing methodologies for monitoring and evaluating leakage from hazardous w...
MODELING AGGREGATE CHLORPYRIFOS EXPOSURE AND DOSE TO CHILDREN
To help address the aggregate exposure assessment needs of the Food Quality Protection Act, a physically-based probabilistic model (SHEDS-Pesticides, version 3) has been applied to estimate aggregate chlorpyrifos exposure and dose to children. Two age groups (0-4, 5-9 years) a...
USEPA SHEDS MODEL: METHODOLOGY FOR EXPOSURE ASSESSMENT FOR WOOD PRESERVATIVES
A physically-based, Monte Carlo probabilistic model (SHEDS-Wood: Stochastic Human Exposure and Dose Simulation model for wood preservatives) has been applied to assess the exposure and dose of children to arsenic (As) and chromium (Cr) from contact with chromated copper arsenat...
Resilient Grid Operational Strategies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqualini, Donatella
Extreme weather-related disturbances, such as hurricanes, are historically a leading cause of grid outages. Although physical asset hardening is perhaps the most common way to mitigate the impacts of severe weather, operational strategies may be deployed to limit the extent of societal and economic losses associated with weather-related physical damage. The purpose of this study is to examine bulk power-system operational strategies that can be deployed to mitigate the impact of severe weather disruptions caused by hurricanes, thereby increasing grid resilience to maintain continuity of critical infrastructure during extreme weather. To estimate the impacts of resilient grid operational strategies, Los Alamos National Laboratory (LANL) developed a framework for hurricane probabilistic risk analysis (PRA). The probabilistic nature of this framework allows us to estimate the probability distribution of likely impacts, as opposed to the worst-case impacts. The project scope does not include strategies that are not operations related, such as transmission system hardening (e.g., undergrounding, transmission tower reinforcement, and substation flood protection) and solutions in the distribution network.
Data Assimilation - Advances and Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Brian J.
2014-07-30
This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
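The ensemble Kalman filter update mentioned in the abstract can be sketched in a few lines. This is a generic textbook-style analysis step for a scalar observation, not code from the presentation; the function name `enkf_update` and the toy prior/observation values below are illustrative assumptions:

```python
import numpy as np

def enkf_update(X, y, r, h, rng):
    """One EnKF analysis step with a scalar observation.

    X: (n_members, d) forecast ensemble; y: observed value;
    r: observation error variance; h: (d,) linear observation operator.
    Each member is nudged toward a perturbed copy of the observation,
    weighted by a Kalman gain estimated from ensemble statistics.
    """
    Hx = X @ h                                   # predicted observations, (n_members,)
    A, a = X - X.mean(axis=0), Hx - Hx.mean()    # state and observation anomalies
    P_xy = A.T @ a / (len(X) - 1)                # state-observation cross-covariance, (d,)
    P_yy = a @ a / (len(X) - 1) + r              # innovation variance
    K = P_xy / P_yy                              # Kalman gain, (d,)
    y_pert = y + rng.normal(0.0, np.sqrt(r), size=len(X))  # perturbed observations
    return X + np.outer(y_pert - Hx, K)          # analysis ensemble

# Toy example: standard-normal prior on a 1-D state, observe the state
# itself (h = [1]) with unit error variance.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(5000, 1))
Xa = enkf_update(X, 1.0, 1.0, np.array([1.0]), rng)
```

With equal prior and observation variances, the analysis mean moves roughly halfway toward the observation and the ensemble variance shrinks toward half its prior value, as the Kalman update prescribes.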
Comello, Maria Leonora G; Qian, Xiaokun; Deal, Allison M; Ribisl, Kurt M; Linnan, Laura A; Tate, Deborah F
2016-01-01
Background Online interventions providing individual health behavior assessment should deliver feedback in a way that is both understandable and engaging. This study focused on the potential for infographics inspired by the aesthetics of game design to contribute to these goals. Objective We conducted formative research to test game-inspired infographics against more traditional displays (eg, text-only, column chart) for conveying a behavioral goal and an individual’s behavior relative to the goal. We explored the extent to which the display type would influence levels of engagement and information processing. Methods Between-participants experiments compared game-inspired infographics with traditional formats in terms of outcomes related to information processing (eg, comprehension, cognitive load) and engagement (eg, attitudes toward the information, emotional tone). We randomly assigned participants (N=1162) to an experiment in 1 of 6 modules (tobacco use, alcohol use, vegetable consumption, fruit consumption, physical activity, and weight management). Results In the tobacco module, a game-inspired format (scorecard) was compared with text-only; there were no differences in attitudes and emotional tone, but the scorecard outperformed text-only on comprehension (P=.004) and decreased cognitive load (P=.006). For the other behaviors, we tested 2 game-inspired formats (scorecard, progress bar) and a traditional column chart; there were no differences in comprehension, but the progress bar outperformed the other formats on attitudes and emotional tone (P<.001 for all contrasts). Conclusions Across modules, a game-inspired infographic showed potential to outperform a traditional format for some study outcomes while not underperforming on other outcomes. Overall, findings support the use of game-inspired infographics in behavioral assessment feedback to enhance comprehension and engagement, which may lead to greater behavior change. PMID:27658469
NASA Technical Reports Server (NTRS)
Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.
2010-01-01
Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
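The Monte Carlo step described above (treat each driving parameter as a random variable, evaluate the physics-based model, count failures) can be sketched as follows. This is a minimal generic illustration, not the NASA PDA model; the function names and the toy capacity/demand limit state are invented for demonstration:

```python
import random

def failure_probability(limit_state, sample_inputs, n_trials=100_000, seed=0):
    """Estimate P(failure) by Monte Carlo: draw random model inputs,
    evaluate the limit-state function, and return the fraction of
    trials with negative margin (i.e., failure)."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n_trials) if limit_state(sample_inputs(rng)) < 0
    )
    return failures / n_trials

# Toy limit state: capacity ~ N(10, 1) minus demand ~ N(7, 2);
# failure occurs when demand exceeds capacity (margin < 0).
p = failure_probability(
    limit_state=lambda x: x[0] - x[1],
    sample_inputs=lambda rng: (rng.gauss(10, 1), rng.gauss(7, 2)),
)
```

Sensitivity analysis in this setting amounts to repeating the estimate while varying one input distribution at a time and observing the change in the estimated failure probability.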
Acoustic Wave Guiding by Reconfigurable Tessellated Arrays
NASA Astrophysics Data System (ADS)
Zou, Chengzhe; Lynd, Danielle T.; Harne, Ryan L.
2018-01-01
The reconfiguration of origami tessellations is a prime vehicle to harness for adapting system properties governed by a structural form. While the knowledge of mechanical property changes associated with origami tessellation folding has been extensively built up, the opportunities to integrate other physics into a framework of tessellated, adaptive structures remain to be fully exploited. Acoustics appears to be a prime domain to marry with origami science. Specifically, deep technical analogies are revealed between wave-guiding properties achieved via digital methods that virtually reposition array elements and the actual repositioning of facets by folding origami-inspired tessellations. Here we capitalize on this analogy to investigate acoustic arrays established upon facet layouts of origami-inspired tessellations. We show that a concept of reconfigurable tessellated arrays may guide waves more effectively than traditional digitally phased arrays using fewer transducer elements. Moreover, we show that the refinement of tessellated arrays trends to the ideal case of classical wave radiators or receivers grounded in principles of geometrical acoustics. By linear wave physics shared among myriad scientific disciplines and across orders of magnitude in length scale, these discoveries may cultivate numerous opportunities for wave-guiding adaptive structures inspired by low-dimensional origami tessellations.
Generalized Probabilistic Description of Noninteracting Identical Particles
NASA Astrophysics Data System (ADS)
Karczewski, Marcin; Markiewicz, Marcin; Kaszlikowski, Dagomir; Kurzyński, Paweł
2018-02-01
We investigate an operational description of identical noninteracting particles in multiports. In particular, we look for physically motivated restrictions that explain their bunching probabilities. We focus on a symmetric 3-port in which a triple of superquantum particles admitted by our generalized probabilistic framework would bunch with a probability of 3/4. The bosonic bound of 2/3 can then be restored by imposing the additional requirement of product evolution of certain input states. These states are characterized by the fact that, much like product states, their entropy equals the sum of entropies of their one-particle substates. This principle is, however, not enough to exclude the possibility of superquantum particles in higher-order multiports.
The Graphics Tablet--A Valuable Tool for the Digital STEM Teacher
ERIC Educational Resources Information Center
Stephens, Jeff
2018-01-01
I am inspired to write this article after coming across some publications in "The Physics Teacher" that all hit on topics of personal interest and experience. Similarly to Christensen, my goal in writing this is to encourage other physics educators to take advantage of modern technology in delivering content to students and to feel…
ERIC Educational Resources Information Center
Mason, Andrew; Singh, Chandralekha
2016-01-01
The ability to categorize problems based upon underlying principles, rather than contexts, is considered a hallmark of expertise in physics problem solving. With inspiration from a classic study by Chi, Feltovich, and Glaser, we compared the categorization of 25 introductory mechanics problems based upon similarity of solution by students in large…
The Space-Time Topography of English Speakers
ERIC Educational Resources Information Center
Duman, Steve
2016-01-01
English speakers talk and think about Time in terms of physical space. The past is behind us, and the future is in front of us. In this way, we "map" space onto Time. This dissertation addresses the specificity of this physical space, or its topography. Inspired by languages like Yupno (Nunez, et al., 2012) and Bamileke-Dschang (Hyman,…
3 CFR 8662 - Proclamation 8662 of April 29, 2011. National Physical Fitness and Sports Month, 2011
Code of Federal Regulations, 2012 CFR
2012-01-01
... and costly diseases like heart disease, diabetes, and obesity. For more information on the President’s.... The health of our sons and daughters is key to our Nation’s future. Unfortunately, childhood obesity... solving the epidemic of childhood obesity within a generation by inspiring children to be physically...
ERIC Educational Resources Information Center
Hockicko, Peter; Krišták, Luboš; Nemec, Miroslav
2015-01-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think…
The Implications of Assessment for Learning in Physical Education and Health
ERIC Educational Resources Information Center
Tolgfors, Bjorn; Öhman, Marie
2016-01-01
This article deals with the implications of assessment for learning (AfL) in upper secondary physical education and health (PEH). Inspired by the research field that emanates from the concept of governmentality, the study is concerned with how AfL guides teachers' and students' actions in certain directions. Based on teachers' descriptions of how…
ERIC Educational Resources Information Center
Thorpe, Holly
2014-01-01
In this paper I call for "new forms of thinking and new ways of theorizing" the complex relations between the biological and social in sport and physical culture. I illustrate the inseparability of our biological and social bodies in sport and physical culture via the case of exercise and female reproductive hormones. Inspired by…
Learning Physical Examination Skills outside Timetabled Training Sessions: What Happens and Why?
ERIC Educational Resources Information Center
Duvivier, Robbert J.; van Geel, Koos; van Dalen, Jan; Scherpbier, Albert J. J. A.; van der Vleuten, Cees P. M.
2012-01-01
Lack of published studies on students' practice behaviour of physical examination skills outside timetabled training sessions inspired this study into what activities medical students undertake to improve their skills and factors influencing this. Six focus groups of a total of 52 students from Years 1-3 using a pre-established interview guide.…
Towards a mathematical theory of meaningful communication
NASA Astrophysics Data System (ADS)
Corominas-Murtra, Bernat; Fortuny, Jordi; Solé, Ricard V.
2014-04-01
Meaning has been left outside most theoretical approaches to information in biology. Functional responses based on an appropriate interpretation of signals have been replaced by a probabilistic description of correlations between emitted and received symbols. This assumption leads to potential paradoxes, such as the presence of a maximum of information associated with a channel that creates completely wrong interpretations of the signals. Game-theoretic models of language evolution and other studies considering embodied communicating agents show that the correct (meaningful) match resulting from agent-agent exchanges is always achieved and natural systems obviously solve the problem correctly. Inspired by the concept of the duality of the communicative sign stated by the Swiss linguist Ferdinand de Saussure, here we present a complete description of the minimal system necessary to measure the amount of information that is consistently decoded. Several consequences of our developments are investigated, such as the uselessness of a certain amount of information properly transmitted for communication among autonomous agents.
Uncertainty Propagation Methods for High-Dimensional Complex Systems
NASA Astrophysics Data System (ADS)
Mukherjee, Arpan
Researchers are developing ever smaller aircraft called Micro Aerial Vehicles (MAVs). The Space Robotics Group has joined the field by developing a dragonfly-inspired MAV. This thesis presents two contributions to this project. The first is the development of a dynamical model of the internal MAV components to be used for tuning design parameters and as a future plant model. This model is derived using the Lagrangian method and differs from others because it accounts for the internal dynamics of the system. The second contribution of this thesis is an estimation algorithm that can be used to determine prototype performance and verify the dynamical model from the first part. Based on the Gauss-Newton Batch Estimator, this algorithm uses a single camera and known points of interest on the wing to estimate the wing kinematic angles. Unlike other single-camera methods, this method is probabilistically based rather than being geometric.
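The Gauss-Newton batch estimator named above is a standard nonlinear least-squares routine. A generic sketch (not the thesis's camera-based wing-angle estimator; the exponential-fit example and function names are invented) looks like this:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, n_iter=20):
    """Generic Gauss-Newton batch least squares: repeatedly linearize
    the residual vector about the current estimate and solve the
    linearized least-squares problem for the update step."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)          # residual vector at current estimate
        J = jacobian(x)          # Jacobian of residuals, shape (m, n)
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + dx
    return x

# Toy batch problem: recover the rate a in y = exp(a*t) from samples.
t = np.linspace(0.0, 2.0, 10)
y = np.exp(0.5 * t)
a_hat = gauss_newton(
    residual=lambda p: np.exp(p[0] * t) - y,
    jacobian=lambda p: (t * np.exp(p[0] * t)).reshape(-1, 1),
    x0=[0.1],
)
```

In the thesis's setting the residual would compare observed image points of the wing markers against those predicted from candidate kinematic angles, but the iteration itself is the same.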
Simulated annealing with probabilistic analysis for solving traveling salesman problems
NASA Astrophysics Data System (ADS)
Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Simulated Annealing (SA) is a widely used meta-heuristic that was inspired from the annealing process of recrystallization of metals. Therefore, the efficiency of SA is highly affected by the annealing schedule. As a result, in this paper, we presented an empirical work to provide a comparable annealing schedule to solve symmetric traveling salesman problems (TSP). Randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA and thus, we propose the best found annealing schedule based on the Post Hoc test. SA was tested on seven selected benchmarked problems of symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically alongside with benchmark solutions and simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides a good quality of solution.
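For concreteness, a minimal SA implementation for symmetric TSP with 2-opt moves and a geometric cooling schedule might look like the following. The schedule parameters here (`t0`, `cooling`) are placeholders, not the annealing schedule proposed in the paper:

```python
import math
import random

def tsp_simulated_annealing(dist, t0=10.0, cooling=0.995, n_iter=20_000, seed=0):
    """Simulated annealing for symmetric TSP with a 2-opt neighborhood.
    dist[i][j] is the distance between cities i and j."""
    rng = random.Random(seed)
    n = len(dist)

    def length(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    tour = list(range(n))
    cur = best = length(tour)
    best_tour = tour[:]
    temp = t0
    for _ in range(n_iter):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        delta = length(cand) - cur
        # Accept improvements always; worse moves with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            tour, cur = cand, cur + delta
            if cur < best:
                best, best_tour = cur, tour[:]
        temp *= cooling
    return best_tour, best
```

On small instances (e.g., a handful of cities placed on a circle) this sketch recovers the optimal tour; on benchmark instances the schedule would need exactly the kind of empirical tuning the paper investigates.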
Brownian Motion and its Conditional Descendants
NASA Astrophysics Data System (ADS)
Garbaczewski, Piotr
It happened before [1] that I have concluded my publication with a special dedication to John R. Klauder. Then, the reason was John's PhD thesis [2] and the questions (perhaps outdated in the eyes of the band-wagon jumpers, albeit still retaining their full vitality [3]): (i) What are the uses of the classical (c-number, non-Grassmann) spinor fields, especially nonlinear ones; what are they for at all? (ii) What are, if any, the classical partners for Fermi models and fields in particular? The present dedication, even if not as conspicuously motivated as the previous one by John's research, nevertheless pertains to investigations pursued by John through the years and devoted to the analysis of random noise. Sometimes, re-reading old papers and re-analysing old, frequently forgotten ideas might prove more rewarding than racing the fashions. Following this attitude, let us take as the departure point Schrödinger's original suggestion [4] of the existence of a special class of random processes, which have their origin in the Einstein-Smoluchowski theory of the Brownian motion and its codification by Wiener. The original analysis due to Schrödinger of the probabilistic significance of the heat equation, and of its time adjoint in parallel, remained unnoticed by the physics community and has since been forgotten. It reappeared, however, in the mathematical literature as an inspiration to generalise the concept of Markovian diffusions to the case of Bernstein stochastic processes. But it stayed without consequences for a deeper understanding of the possible physical phenomena which might underlie the corresponding abstract formalism.
Schrödinger's objective was to initiate investigations of possible links between quantum theory and the theory of Brownian motion, an attempt which culminated later in the so-called Nelson's stochastic mechanics [8] and its encompassing formalism [7] in which the issue of the Brownian implementation of quantum dynamics is placed in the framework of Markov-Bernstein diffusions…
Emergent Structural Mechanisms for High-Density Collective Motion Inspired by Human Crowds
NASA Astrophysics Data System (ADS)
Bottinelli, Arianna; Sumpter, David T. J.; Silverberg, Jesse L.
2016-11-01
Collective motion of large human crowds often depends on their density. In extreme cases like heavy metal concerts and Black Friday sales events, motion is dominated by physical interactions instead of conventional social norms. Here, we study an active matter model inspired by situations when large groups of people gather at a point of common interest. Our analysis takes an approach developed for jammed granular media and identifies Goldstone modes, soft spots, and stochastic resonance as structurally driven mechanisms for potentially dangerous emergent collective motion.
Numerical and Probabilistic Analysis of Asteroid and Comet Impact Hazard Mitigation
2010-09-01
object on Jupiter are reminders and warning signals that we should take seriously. The extinction of the dinosaurs has been attributed to the impact of a...experimentally determined absorption patterns. These energy deposition processes are independent, so a piecemeal approach is physically reasonable . We
75 FR 32077 - Great Outdoors Month, 2010
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-07
... effort to help our children eat more nutritious foods, lead healthier lives, and increase their physical activity. Exploring beyond the walls of their homes and schools will help inspire our children to move, run...
New course in bioengineering and bioinspired design.
Erickson, Jonathan C
2012-01-01
For the past two years, a new interdisciplinary course has been offered at Washington and Lee University (Lexington, VA, USA), which seeks to surmount barriers that have traditionally existed between the physical and life sciences. The course explores the physiology underlying the physical mechanisms and engineering principles that endow animal systems with their astonishing navigation abilities and sensory mechanisms. The course also emphasizes how biological systems are inspiring novel engineering designs. Two (among many) examples are how the adhesion of the gecko foot inspired a new class of adhesives based on van der Waals forces, and how the iridophore protein plates found in mimic octopus and squid act as tunable quarter-wave stacks, thus inspiring the engineering of optically tunable block copolymer gels for sensing temperature, pressure, or chemical gradients. A major component of this course is the integration of a 6-8 week long research project. To date, projects have included engineering: a soft-body robot whose motion mimics the inchworm; an electrical circuit to sense minute electric fields in aqueous environments, based on the shark electrosensory system; and cyborg grasshoppers whose jump motion is controlled via an electronic-neural interface. Initial feedback has indicated that this course has served to increase student interaction and cross-pollination of ideas between the physical and life sciences. Student feedback also indicated a marked increase in desire and confidence to continue to pursue problems at the boundary of biology and engineering: bioengineering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Yun; Cui, Wan-Zhao; Wang, Hong-Guang
2015-05-15
Effects of the secondary electron emission (SEE) phenomenon of metal surfaces on the multipactor analysis of microwave components are investigated numerically and experimentally in this paper. Both secondary electron yield (SEY) and emitted energy spectrum measurements are performed on silver-plated samples for accurate description of the SEE phenomenon. A phenomenological probabilistic model based on SEE physics is utilized and fitted accurately to the measured SEY and emitted energy spectrum of the conditioned surface material of microwave components. Specifically, the phenomenological probabilistic model is extended mathematically to primary energies lower than 20 eV, since no accurate measurement data can be obtained there. Embedding the phenomenological probabilistic model into the Electromagnetic Particle-In-Cell (EM-PIC) method, the electronic resonant multipacting in microwave components can be tracked and hence the multipactor threshold can be predicted. The threshold prediction error for the transformer and the coaxial filter is 0.12 dB and 1.5 dB, respectively. Simulation results demonstrate that the discharge threshold is strongly dependent on the SEYs and the energy spectrum in the low-energy end (lower than 50 eV). Multipacting simulation results agree quite well with experiments in practical components, while the phenomenological probabilistic model fits both the SEY and the emission energy spectrum better than the traditionally used model and distribution. The EM-PIC simulation method with the phenomenological probabilistic model for the surface collision simulation has been demonstrated for predicting the multipactor threshold in metal components for space application.
Urns and Chameleons: two metaphors for two different types of measurements
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2013-09-01
The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. In fact, while the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long time interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena and negatively interfered with each other because of the absence (for many decades) of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many different attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. However, these attempts have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity.
Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that "read pre-existent properties" (the urn metaphor) and measurements that read "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.
No space for girliness in physics: understanding and overcoming the masculinity of physics
NASA Astrophysics Data System (ADS)
Götschel, Helene
2014-06-01
Allison Gonsalves' article on "women doctoral students' positioning around discourses of gender and competence in physics" explores narratives of Canadian women physicists concerning their strategies to gain recognition as physicists. In my response to her rewarding and inspiring analysis I will reflect on her findings and arguments and put them into a broader context of research in gender and physics. In addition to her promising strategies to make physics attractive and welcoming to all genders I want to stress two more aspects of the tricky problem: diversity and contextuality of physics.
A DATA-DRIVEN MODEL FOR SPECTRA: FINDING DOUBLE REDSHIFTS IN THE SLOAN DIGITAL SKY SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsalmantza, P.; Hogg, David W., E-mail: vivitsal@mpia.de
2012-07-10
We present a data-driven method, heteroscedastic matrix factorization (a kind of probabilistic factor analysis), for modeling or performing dimensionality reduction on observed spectra or other high-dimensional data with known but non-uniform observational uncertainties. The method uses an iterative inverse-variance-weighted least-squares minimization procedure to generate a best set of basis functions. The method is similar to principal components analysis (PCA), but with the substantial advantage that it uses measurement uncertainties in a responsible way and accounts naturally for poorly measured and missing data; it models the variance in the noise-deconvolved data space. A regularization can be applied, in the form of a smoothness prior (inspired by Gaussian processes) or a non-negative constraint, without making the method prohibitively slow. Because the method optimizes a justified scalar (related to the likelihood), the basis provides a better fit to the data in a probabilistic sense than any PCA basis. We test the method on Sloan Digital Sky Survey (SDSS) spectra, concentrating on spectra known to contain two redshift components: these are spectra of gravitational lens candidates and massive black hole binaries. We apply a hypothesis test to compare one-redshift and two-redshift models for these spectra, utilizing the data-driven model trained on a random subset of all SDSS spectra. This test confirms 129 of the 131 lens candidates in our sample and all of the known binary candidates, and turns up very few false positives.
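The flavor of the iterative inverse-variance-weighted minimization can be conveyed with a toy rank-1 alternating least-squares factorization. This is a schematic sketch of the general idea, not the authors' HMF code; the function name and the rank-1 restriction are simplifications:

```python
import numpy as np

def weighted_rank1_factorization(Y, W, n_iter=50, eps=1e-12):
    """Fit Y ≈ outer(a, g) by alternating inverse-variance-weighted
    least squares. W holds per-entry inverse variances, so missing or
    poorly measured entries are handled by giving them zero/low weight."""
    a = np.ones(Y.shape[0])
    g = Y.mean(axis=0)                                      # initial basis "spectrum"
    for _ in range(n_iter):
        a = (W * Y) @ g / ((W * g**2).sum(axis=1) + eps)    # per-row coefficients
        g = a @ (W * Y) / ((W.T * a**2).sum(axis=1) + eps)  # shared basis vector
    return a, g
```

Each update is the exact weighted least-squares solution with the other factor held fixed, which is what lets the uncertainties enter "responsibly": high-variance entries simply pull less on the basis.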
Intuitive Physics: Current Research and Controversies.
Kubricht, James R; Holyoak, Keith J; Lu, Hongjing
2017-10-01
Early research in the field of intuitive physics provided extensive evidence that humans succumb to common misconceptions and biases when predicting, judging, and explaining activity in the physical world. Recent work has demonstrated that, across a diverse range of situations, some biases can be explained by the application of normative physical principles to noisy perceptual inputs. However, it remains unclear how knowledge of physical principles is learned, represented, and applied to novel situations. In this review we discuss theoretical advances from heuristic models to knowledge-based, probabilistic simulation models, as well as recent deep-learning models. We also consider how recent work may be reconciled with earlier findings that favored heuristic models.
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew D.; Grabaskas, David; Brunett, Acacia J.
2016-01-01
Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Centering on an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive reactor cavity cooling system following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. While this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability for the reactor cavity cooling system (and the reactor system in general) to the postulated transient event.
From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model
NASA Astrophysics Data System (ADS)
Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.
2014-12-01
European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview on natural catastrophe risk assessment modeling in the reinsurance industry, and the development of a new innovative approach for modeling the risk associated with European winter storms. This new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. The knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.
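The final step described above, combining a probabilistic event set with a vulnerability function to compute annual expected losses, can be sketched in a few lines. All numbers, the gust distribution, and the sigmoid vulnerability curve below are illustrative assumptions, not the model's actual data:

```python
import numpy as np

# Illustrative sketch: annual expected loss from a simulated event set.
# The event count, gust distribution, and vulnerability curve are assumed.
rng = np.random.default_rng(3)
n_events = 5000                                   # simulated storm events
years = 10000                                     # years the event set represents
gust = rng.gumbel(30.0, 6.0, n_events)            # peak gust (m/s) at one site

def vulnerability(v, midpoint=40.0, width=5.0):
    """Mean damage ratio as a smooth function of gust speed (assumed form)."""
    return 1.0 / (1.0 + np.exp(-(v - midpoint) / width))

exposure = 2.0e8                                  # insured value at the site
event_loss = exposure * vulnerability(gust)       # loss per simulated event
annual_expected_loss = event_loss.sum() / years   # average loss per year
```

In practice the per-event losses would also be aggregated across sites and portfolios before averaging, but the structure (event hazard, then vulnerability, then exposure) is the same.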
Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event
Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; ...
2017-01-24
We report that many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Lastly, although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.
Physical Simulation for Probabilistic Motion Tracking
2008-01-01
learn a low-dimensional embedding of the high-dimensional kinematic data and then attempt to solve the problem in this more manageable low...rotations and foot skate). Such artifacts can be attributed to the general lack of physically plausible priors [2] (that can account for static and/or...temporal priors of the form p(x_{f+1} | x_f) = N(x_f + γ_f, Σ) (where γ_f is scaled velocity, learned or inferred), have also been proposed [13] and shown to
NASA Astrophysics Data System (ADS)
Radencic, S.; McNeal, K. S.; Pierce, D.; Hare, D.
2010-12-01
The INSPIRE program at Mississippi State University (MSU), funded by the NSF Graduate STEM Fellows in K-12 Education (GK12) program, focuses on Earth and Space science education and has partnered ten graduate students from MSU with five teachers from local, rural school districts. For the next five years the project will serve to increase inquiry and technology experiences in science and math while enhancing graduate students' communication skills. Graduate students from the disciplines of Geosciences, Physics, and Engineering are partnered with Chemistry, Physical Science, Physics, Geometry and Middle school science classrooms and will create engaging inquiry activities that incorporate elements of their research and integrate various forms of technology. The generated lesson plans that are implemented in the classroom are published on the INSPIRE home page (www.gk12.msstate.edu) so that other classroom instructors can utilize this free resource. Local 7th-12th grade students will attend GIS day later this fall at MSU to increase their understanding and interest in Earth and Space sciences. Selected graduate students and teachers will visit one of four international university partners located in Poland, Australia, England, or The Bahamas to engage in research abroad. Upon return, they will incorporate their global experiences into their local classrooms. Planning for the project included many factors important to the success of the partnerships. The need for the program was evident in Mississippi K-12 schools based on low performance on high stakes assessments and lack of curriculum in the Earth and Space sciences. Meeting with administrators to determine what needs they would like addressed by the project, and recognizing the individual differences among the schools, were integral components to tailoring project goals to meet the unique needs of each school partner.
Time for training and team building of INSPIRE teachers and graduate students before the school year aided in fostering a community atmosphere to ensure successful classroom experiences. Including stakeholders in the progress of lesson plan product development during a workshop luncheon was another key part to building a community of support for INSPIRE. These planning components are essential to the success of the project and are recommended to similar projects. The INSPIRE project external evaluation includes: (i) interviews of participants and K-12 students involved in INSPIRE, (ii) pre-post technology and teaching attitude surveys of graduate students and teachers, (iii) thematic analysis of daily feedback forms from the workshop, (iv) summary of end of workshop evaluations, and (v) constant surveying of program progress towards meeting its goals. Internal evaluation includes: (i) classroom observations of graduate student interactions with students (ii) bi-weekly journal entries from both teachers and graduate students, and (iii) weekly feedback from graduate students. Preliminary evaluation of the workshop daily feedback forms indicate a high level of approval for the technology and inquiry activities modeled. Journal entries indicate that the majority of Fellow-teacher teams experience positive interactions in the classroom.
Exposure to contaminants originating in the domestic water supply is influenced by a number of factors, including human activities, water use behavior, and physical and chemical processes. The key role of human activities is very apparent in exposure related to volatile water-...
Teaching Measurement and Uncertainty the GUM Way
ERIC Educational Resources Information Center
Buffler, Andy; Allie, Saalih; Lubben, Fred
2008-01-01
This paper describes a course aimed at developing understanding of measurement and uncertainty in the introductory physics laboratory. The course materials, in the form of a student workbook, are based on the probabilistic framework for measurement as recommended by the International Organization for Standardization in their publication "Guide to…
Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio
2010-11-01
Abbreviation for the names Griffiths, Engen, and McCloskey. Often used to denote the stick-breaking distribution over infinite vectors whose elements...of state calculations by fast computing machines. Journal of Chemical Physics, 21:1087-1092, 1953. [65] R. Miotto, L. Barrington, and G. Lanckriet
Activity-based sampling (ABS), used to evaluate breathing zone exposure to a contaminant present in soil resulting from various activities, involves breathing zone sampling for contaminants while that activity is performed. A probabilistic model based upon aerosol physics and flui...
Annotations of a Public Astronomer
NASA Astrophysics Data System (ADS)
Adamo, A.
2011-06-01
Angelo Adamo is an Italian astronomer and artist interested in inspiring people with scientifically-based tales. He has recently published two illustrated books exploring the relationships between mankind and cosmos through physics, art, literature, music, cartoons, and movies.
ERIC Educational Resources Information Center
Wong, Siu-ling; Chun, Ka-wai Cecilia; Mak, Se-yuen
2007-01-01
We describe a physics investigation project inspired by one of the adventures of Odysseus in Homer's "Odyssey." The investigation uses the laws of mechanics, vector algebra and a simple way to construct a fan-and-sail-cart for experimental verification.
Making a Difference: TPSR, a New Wave of Youth Development Changing Lives One Stroke at a Time
ERIC Educational Resources Information Center
Beale, Angela
2016-01-01
This article was inspired by a belief in the need to connect physical education with community-based physical activity programming in minority communities. The initial steps of this reflective narrative were described in a previous article. This article describes a program called Project Guard: Make A Splash E.N.D. (End Needless Drowning), a…
ERIC Educational Resources Information Center
Nicholls, Jennifer; Philip, Robyn
2012-01-01
This paper explores the design of virtual and physical learning spaces developed for students of drama and theatre studies. What can we learn from the traditional drama workshop that will inform the design of drama and theatre spaces created in technology-mediated learning environments? The authors examine four examples of spaces created for…
From the Outside In: Getting Physical with Exercises Inspired by Stella Adler and Uta Hagen.
ERIC Educational Resources Information Center
Miller, Bruce
2002-01-01
Proposes that teaching students to find and play appropriate actions helps them tell the story of a play and create character better than if they focused on emotions. Discusses Stella Adler and Uta Hagen, two acting teachers who advocated this physical approach. Presents two exercises: "justify and connect," and "enter a room." (PM)
True-slime-mould-inspired hydrostatically coupled oscillator system exhibiting versatile behaviours.
Umedachi, Takuya; Idei, Ryo; Ito, Kentaro; Ishiguro, Akio
2013-09-01
Behavioural diversity is an indispensable attribute of living systems, which makes them intrinsically adaptive and responsive to the demands of a dynamically changing environment. In contrast, conventional engineering approaches struggle to suppress behavioural diversity in artificial systems to reach optimal performance in given environments for desired tasks. The goals of this research include understanding the essential mechanism that endows living systems with behavioural diversity and implementing the mechanism in robots to exhibit adaptive behaviours. For this purpose, we have focused on an amoeba-like unicellular organism: the plasmodium of true slime mould. Despite the absence of a central nervous system, the plasmodium exhibits versatile spatiotemporal oscillatory patterns and switches spontaneously among these patterns. By exploiting this behavioural diversity, it is able to exhibit adaptive behaviour according to the situation encountered. Inspired by this organism, we built a real physical robot using hydrostatically coupled oscillators that produce versatile oscillatory patterns and spontaneous transitions among the patterns. The experimental results show that exploiting physical hydrostatic interplay—the physical dynamics of the robot—allows simple phase oscillators to promote versatile behaviours. The results can contribute to an understanding of how a living system generates versatile and adaptive behaviours with physical interplays among body parts.
INSPIRE: Initiating New Science Partnerships in Rural Education
NASA Astrophysics Data System (ADS)
Pierce, Donna M.; McNeal, K. S.; Bruce, L. M.; Harpole, S. H.; Schmitz, D. W.
2010-10-01
INSPIRE, Initiating New Science Partnerships in Rural Education, is a partnership between Mississippi State University and three school districts in Mississippi's Golden Triangle (Starkville, Columbus, West Point). This program recruits ten graduate fellows each year from geosciences, physics, astronomy, and engineering and pairs them with a participating middle school or high school teacher. The graduate fellows provide technology-supported inquiry-based learning in the earth and space sciences by incorporating their research into classroom instruction and using multiple resources such as Google Earth, geographic information systems (GIS), Celestia, and others. In addition to strengthening the communication skills of the graduate fellows, INSPIRE will increase the content knowledge of participating teachers, provide high-quality instruction using multiple technologies, promote higher education to area high-school students, and provide fellows and teachers with international research experience through our partners in Australia, The Bahamas, England, and Poland. INSPIRE is funded by the Graduate STEM Fellows in K-12 Education Program (GK-12; Award No. DGE-0947419), which is part of the Division for Graduate Education of the National Science Foundation.
The Physical Tourist. A European Study Course
NASA Astrophysics Data System (ADS)
Kortemeyer, Gerd; Westfall, Catherine
2010-03-01
We organized and led a European study course for American undergraduate university students to explore the early history of relativity and quantum theory. We were inspired by The Physical Tourist articles published in this journal on Munich, Bern, Berlin, Copenhagen, and Göttingen. We describe this adventure both for others wishing to teach such a course and for anyone wishing to walk in the footsteps of the physicists who revolutionized physics in the early decades of the twentieth century.
Probabilistic Estimates of Global Mean Sea Level and its Underlying Processes
NASA Astrophysics Data System (ADS)
Hay, C.; Morrow, E.; Kopp, R. E.; Mitrovica, J. X.
2015-12-01
Local sea level can vary significantly from the global mean value due to a suite of processes that includes ongoing sea-level changes due to the last ice age, land water storage, ocean circulation changes, and non-uniform sea-level changes that arise when modern-day land ice rapidly melts. Understanding these sources of spatial and temporal variability is critical to estimating past and present sea-level change and projecting future sea-level rise. Using two probabilistic techniques, a multi-model Kalman smoother and Gaussian process regression, we have reanalyzed 20th century tide gauge observations to produce a new estimate of global mean sea level (GMSL). Our methods allow us to extract global information from the sparse tide gauge field by taking advantage of the physics-based and model-derived geometry of the contributing processes. Both methods provide constraints on the sea-level contribution of glacial isostatic adjustment (GIA). The Kalman smoother tests multiple discrete GIA models, probabilistically computing the most likely model given the observations, while the Gaussian process regression characterizes the prior covariance structure of a suite of GIA models and then uses this structure to estimate the posterior distribution of local rates of GIA-induced sea-level change. We present the two methodologies, the model-derived geometries of the underlying processes, and our new probabilistic estimates of GMSL and GIA.
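Of the two techniques named in this entry, Gaussian process regression is the easier to illustrate. The sketch below recovers a smooth signal, with uncertainty, from sparse noisy synthetic observations using a squared-exponential kernel; the kernel, hyperparameters, and data are assumptions for illustration, not the authors' GIA-derived covariance structure:

```python
import numpy as np

# Gaussian process regression on sparse, noisy "tide gauge"-like data.
# Kernel form and hyperparameters are illustrative assumptions.
def rbf_kernel(a, b, length=20.0, var=100.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(0)
t_obs = np.sort(rng.uniform(1900, 2000, 30))     # sparse observation years
truth = 0.15 * (t_obs - 1900)                    # underlying rising signal
y_obs = truth + rng.normal(0.0, 2.0, t_obs.size) # noisy observations

t_new = np.linspace(1900, 2000, 101)             # dense prediction grid
K = rbf_kernel(t_obs, t_obs) + 4.0 * np.eye(t_obs.size)  # + noise var 2^2
K_s = rbf_kernel(t_new, t_obs)
mean = K_s @ np.linalg.solve(K, y_obs)           # posterior mean estimate
cov = rbf_kernel(t_new, t_new) - K_s @ np.linalg.solve(K, K_s.T)
sigma = np.sqrt(np.clip(np.diag(cov), 0.0, None))  # pointwise uncertainty
```

The posterior mean interpolates the sparse observations while `sigma` quantifies where the data constrain the estimate, which is the sense in which such a reanalysis is "probabilistic".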
Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET
NASA Astrophysics Data System (ADS)
Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET
2018-05-01
Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), are introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy ‘from scratch’ has also been devised, which allows preserving the performance even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the case of the graphite and the ITER Like Wall. Performance significantly better than that of any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to the training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to integrate these classifiers into general decision support and control systems.
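One standard way to obtain a probabilistic output from an SVM is Platt scaling: fitting a sigmoid to the raw decision values so that margins map to calibrated probabilities. The sketch below shows that calibration step on synthetic margins and labels; it is a generic illustration, not the predictor described in this entry:

```python
import numpy as np

# Platt scaling sketch: map raw classifier margins f(x) to probabilities
# p = 1 / (1 + exp(A*f + B)), fitting A and B by gradient descent on the
# log loss. Data are synthetic; this is not the JET implementation.
rng = np.random.default_rng(1)
scores = rng.normal(0.0, 2.0, 500)                    # mock SVM margins f(x)
labels = (scores + rng.normal(0.0, 1.0, 500) > 0).astype(float)

A, B = 0.0, 0.0                                       # sigmoid parameters
lr = 0.05
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(A * scores + B))          # P(class 1 | margin)
    A -= lr * np.mean((labels - p) * scores)          # dL/dA = (y - p) * f
    B -= lr * np.mean(labels - p)                     # dL/dB = (y - p)

prob_high = 1.0 / (1.0 + np.exp(A * 3.0 + B))         # margin deep in class 1
prob_low = 1.0 / (1.0 + np.exp(A * -3.0 + B))         # margin deep in class 0
```

The fitted sigmoid turns a hard margin into a graded probability, which is the "natural qualification of the prediction quality" the abstract refers to.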
Constructor theory of probability
2016-01-01
Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914
Some Surprising Introductory Physics Facts and Numbers
NASA Astrophysics Data System (ADS)
Mallmann, A. James
2016-04-01
In the entertainment world, people usually like, and find memorable, novels, short stories, and movies with surprise endings. This suggests that classroom teachers might want to present to their students examples of surprising facts associated with principles of physics. Possible benefits of finding surprising facts about principles of physics are opportunities to expand beyond traditional presentations—and, in some cases, to achieve a deeper and broader understanding of those principles. I believe, moreover, that some of the facts presented here may inspire physics teachers to produce some challenge problems for students.
Superconductors in the high school classroom
NASA Astrophysics Data System (ADS)
Lincoln, James
2017-11-01
In this article, we discuss the behavior of high-temperature superconductors and how to demonstrate them safely and effectively in the high school or introductory physics classroom. Included here is a discussion of the most relevant physics topics that can be demonstrated, some safety tips, and a bit of the history of superconductors. In an effort to include first-year physics students in the world of modern physics, a topic as engaging as superconductivity should not be missed. It is an opportunity to inspire students to study physics through the myriad of possible applications that high temperature superconductors hold for the future.
An Astronomical Journey: One School's Experience
ERIC Educational Resources Information Center
Gibert, Michael; Tedd, Bernie
2016-01-01
Raising the profile of physics is particularly important in girls' schools. Here we describe a range of astronomical activities and observations that we have used, which we hope will inspire teachers at other schools to do likewise.
On a biologically inspired topology optimization method
NASA Astrophysics Data System (ADS)
Kobayashi, Marcelo H.
2010-03-01
This work concerns the development of a biologically inspired methodology for the study of topology optimization in engineering and natural systems. The methodology is based on L-systems and their turtle interpretation for the genotype-phenotype modeling of the topology development. The topology is analyzed using the finite element method, and optimized using an evolutionary algorithm with the genetic encoding of the L-system and its turtle interpretation, as well as body shape and physical characteristics. The test cases considered in this work clearly show the suitability of the proposed method for the study of engineering and natural complex systems.
ERIC Educational Resources Information Center
Brewer, Hannah J.; Nichols, Randall; Leight, Joanne M.; Clark, Gary E.
2017-01-01
We live in a dynamic educational world. Physical and health education teacher preparation programs must examine what society needs and consider a new model for teacher preparation that is based on inspiring youth to build healthy behaviors that last a lifetime. One university created a new School Wellness Education (SWE) program that prepares…
Astronomers Travel in Time and Space with Light
NASA Technical Reports Server (NTRS)
Mather, John C.
2016-01-01
This is an excerpt of John Mather's contribution to a book titled INSPIRED BY LIGHT: Reflections from the International Year of Light 2015. It was produced in January 2016 by SPIE, the European Physical Society (EPS), and The Abdus Salam International Centre for Theoretical Physics (ICTP) to commemorate the International Year of Light and Light-based Technologies 2015. The excerpt discusses how astronomers use light.
The Role of Probability in Developing Learners' Models of Simulation Approaches to Inference
ERIC Educational Resources Information Center
Lee, Hollylynne S.; Doerr, Helen M.; Tran, Dung; Lovett, Jennifer N.
2016-01-01
Repeated sampling approaches to inference that rely on simulations have recently gained prominence in statistics education, and probabilistic concepts are at the core of this approach. In this approach, learners need to develop a mapping among the problem situation, a physical enactment, computer representations, and the underlying randomization…
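The simulation approach to inference described here can be made concrete in a few lines: estimate a probability by repeated sampling instead of a closed-form count. The coin-flip question below is an illustrative example, not one of the study's tasks:

```python
import random

# Estimate P(at least 8 heads in 10 fair coin flips) by repeated sampling,
# mirroring the physical-enactment-to-computer-representation mapping the
# entry describes. The specific question and trial count are illustrative.
random.seed(42)
trials = 20_000
hits = sum(
    sum(random.random() < 0.5 for _ in range(10)) >= 8
    for _ in range(trials)
)
estimate = hits / trials
exact = (45 + 10 + 1) / 1024   # C(10,8) + C(10,9) + C(10,10) over 2**10
```

With 20,000 trials the simulated estimate typically lands within a few thousandths of the exact value 56/1024 ≈ 0.0547, letting learners see the randomization distribution emerge rather than derive it.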
To help address the Food Quality Protection Act of 1996, a physically-based probabilistic model (Residential Stochastic Human Exposure and Dose Simulation Model for Pesticides; Residential-SHEDS) has been developed to quantify and analyze dermal and non-dietary ingestion exposu...
Time Dependent Data Mining in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cogliati, Joshua Joseph; Chen, Jun; Patel, Japan Ketan
RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The goal of this type of analysis is to understand the response of such systems, in particular with respect to their probabilistic behavior, and to understand their predictability and drivers, or lack thereof. Data mining capabilities are the cornerstone of such deep learning of system responses. For this reason static data mining capabilities were added last fiscal year (FY 15). In real applications, when dealing with complex multi-scale, multi-physics systems, it seems natural that, during transients, the relevance of the different scales and physics would evolve over time. For these reasons the data mining capabilities have been extended to allow their application over time. This report describes the newly implemented RAVEN capabilities, along with several simple analytical tests that explain their application and verify the implementation. The report concludes with the application of those newly implemented capabilities to the analysis of a simulation performed with the Bison code.
Chan, Lesley W; Morse, Daniel E; Gordon, Michael J
2018-05-08
Near- and sub-wavelength photonic structures are used by numerous organisms (e.g. insects, cephalopods, fish, birds) to create vivid and often dynamically-tunable colors, as well as create, manipulate, or capture light for vision, communication, crypsis, photosynthesis, and defense. This review introduces the physics of moth eye (ME)-like, biomimetic nanostructures and discusses their application to reduce optical losses and improve efficiency of various optoelectronic devices, including photodetectors, photovoltaics, imagers, and light emitting diodes. Light-matter interactions at structured and heterogeneous surfaces over different length scales are discussed, as are the various methods used to create ME-inspired surfaces. Special interest is placed on a simple, scalable, and tunable method, namely colloidal lithography with plasma dry etching, to fabricate ME-inspired nanostructures in a vast suite of materials. Anti-reflective surfaces and coatings for IR devices and enhancing light extraction from visible light emitting diodes are highlighted.
Safe Maneuvering Envelope Estimation Based on a Physical Approach
NASA Technical Reports Server (NTRS)
Lombaerts, Thomas J. J.; Schuet, Stefan R.; Wheeler, Kevin R.; Acosta, Diana; Kaneshige, John T.
2013-01-01
This paper discusses a computationally efficient algorithm for estimating the safe maneuvering envelope of damaged aircraft. The algorithm performs a robust reachability analysis through an optimal control formulation while making use of time scale separation and taking into account uncertainties in the aerodynamic derivatives. This approach differs from others since it is physically inspired. This more transparent approach allows interpreting data in each step, and it is assumed that these physical models based upon flight dynamics theory will therefore facilitate certification for future real life applications.
Physics League Across Numerous Countries for Kick-ass Students (PLANCKS)
NASA Astrophysics Data System (ADS)
Haasnoot, Irene
2016-01-01
Physics League Across Numerous Countries for Kick-ass Students (PLANCKS) is an international theoretical physics competition for bachelor and master students. The intention of PLANCKS is to increase international collaboration and stimulate the personal development of individual contestants. This is done by organizing a three-day event that takes place every year and is hosted by a different country each time. Besides the contest, social and scientific activities are organised, including an opening symposium where leading physicists give lectures to inspire the participants.
The Physics of Information Technology
NASA Astrophysics Data System (ADS)
Gershenfeld, Neil
2000-10-01
The Physics of Information Technology explores the familiar devices that we use to collect, transform, transmit, and interact with electronic information. Many such devices operate surprisingly close to very many fundamental physical limits. Understanding how such devices work, and how they can (and cannot) be improved, requires deep insight into the character of physical law as well as engineering practice. The book starts with an introduction to units, forces, and the probabilistic foundations of noise and signaling, then progresses through the electromagnetics of wired and wireless communications, and the quantum mechanics of electronic, optical, and magnetic materials, to discussions of mechanisms for computation, storage, sensing, and display. This self-contained volume will help both physical scientists and computer scientists see beyond the conventional division between hardware and software to understand the implications of physical theory for information manipulation.
USSR Report, Military Affairs.
1985-09-19
Inspirer and Organizer of State Terrorism ... Snegov, V. - Japan's Military Preparedness ... Sykhotskiy, V. - Physical...intercept, DF and analysis of radar signals and fleet and satellite communications. The aircraft is further equipped to process the data and transmit them to
NASA Astrophysics Data System (ADS)
Subramanian, Aneesh C.; Palmer, Tim N.
2017-06-01
Stochastic schemes to represent model uncertainty in the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system have helped improve its probabilistic forecast skill over the past decade by both improving its reliability and reducing the ensemble mean error. The largest uncertainties in the model arise from the model physics parameterizations. In the tropics, the parameterization of moist convection presents a major challenge for the accurate prediction of weather and climate. Superparameterization is a promising alternative strategy for including the effects of moist convection through explicit turbulent fluxes calculated from a cloud-resolving model (CRM) embedded within a global climate model (GCM). In this paper, we compare the impact of initial random perturbations in embedded CRMs, within the ECMWF ensemble prediction system, with the stochastically perturbed physical tendency (SPPT) scheme as a way to represent model uncertainty in medium-range tropical weather forecasts. We especially focus on forecasts of tropical convection and dynamics during MJO events in October-November 2011. These are well-studied events for MJO dynamics as they were also heavily observed during the DYNAMO field campaign. We show that a multiscale ensemble modeling approach helps improve forecasts of certain aspects of tropical convection during the MJO events, while it also tends to deteriorate certain large-scale dynamic fields with respect to the stochastically perturbed physical tendencies approach that is used operationally at ECMWF.
NASA Astrophysics Data System (ADS)
Wellons, Sarah; Torrey, Paul
2017-06-01
Galaxy populations at different cosmic epochs are often linked by cumulative comoving number density in observational studies. Many theoretical works, however, have shown that the cumulative number densities of tracked galaxy populations not only evolve in bulk, but also spread out over time. We present a method for linking progenitor and descendant galaxy populations which takes both of these effects into account. We define probability distribution functions that capture the evolution and dispersion of galaxy populations in number density space, and use these functions to assign galaxies at redshift z_f probabilities of being progenitors/descendants of a galaxy population at another redshift z_0. These probabilities are used as weights for calculating distributions of physical progenitor/descendant properties such as stellar mass, star formation rate or velocity dispersion. We demonstrate that this probabilistic method provides more accurate predictions for the evolution of physical properties than the assumption of either a constant number density or an evolving number density in a bin of fixed width by comparing predictions against galaxy populations directly tracked through a cosmological simulation. We find that the constant number density method performs least well at recovering galaxy properties, the evolving number density method slightly better and the probabilistic method best of all. The improvement is present for predictions of stellar mass as well as inferred quantities such as star formation rate and velocity dispersion. We demonstrate that this method can also be applied robustly and easily to observational data, and provide a code package for doing so.
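The weighting scheme this abstract describes can be sketched with assumed Gaussian forms (the paper calibrates its distributions from simulations; every number and functional form below is illustrative):

```python
import numpy as np

# Probability-weighted linking in number density space: each descendant
# galaxy gets a weight from a Gaussian PDF in log cumulative number
# density, centered on the progenitors' evolved median with a dispersion
# accumulated over time. All values here are illustrative assumptions.
rng = np.random.default_rng(2)
logN_desc = rng.normal(-3.0, 0.5, 1000)        # descendants' log number densities
logM_desc = 11.0 - 0.8 * (logN_desc + 3.0)     # mock mass vs. number density relation

median_track = -3.2                            # evolved median log N of progenitors
sigma_track = 0.3                              # dispersion accumulated over time

weights = np.exp(-0.5 * ((logN_desc - median_track) / sigma_track) ** 2)
weights /= weights.sum()                       # normalize to a probability weight

mean_mass = np.sum(weights * logM_desc)        # probability-weighted estimate
```

Contrast this with the fixed-bin approach, which would keep only galaxies inside a narrow number-density window and discard the information carried by the tails of the distribution.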
Miller, Michael A; Colby, Alison C C; Kanehl, Paul D; Blocksom, Karen
2009-03-01
The Wisconsin Department of Natural Resources (WDNR), with support from the U.S. EPA, conducted an assessment of wadeable streams in the Driftless Area ecoregion in western Wisconsin using a probabilistic sampling design. This ecoregion encompasses 20% of Wisconsin's land area and contains 8,800 miles of perennial streams. Randomly selected stream sites (n = 60), equally distributed among stream orders 1-4, were sampled. Watershed land use, riparian and in-stream habitat, water chemistry, macroinvertebrate, and fish assemblage data were collected at each true random site and at an associated "modified-random" site on each stream, accessed via the road crossing nearest to the true random site. Targeted least-disturbed reference sites (n = 22) were also sampled to develop reference conditions for various physical, chemical, and biological measures. Cumulative distribution function plots of various measures collected at the true random sites, evaluated against reference condition thresholds, indicate that high proportions of the random sites (and by inference the entire Driftless Area wadeable stream population) show some level of degradation. Study results show no statistically significant differences between the true random and modified-random sample sites for any of the nine physical habitat, 11 water chemistry, seven macroinvertebrate, or eight fish metrics analyzed. In Wisconsin's Driftless Area, 79% of wadeable stream lengths were accessible via road crossings. While further evaluation of the statistical rigor of using a modified-random sampling design is warranted, sampling randomly selected stream sites accessed via the nearest road crossing may provide a more economical way to apply probabilistic sampling in stream monitoring programs.
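The probabilistic design lets the fraction of sampled sites exceeding a reference threshold be read directly off the empirical cumulative distribution function as a population estimate. A minimal sketch, with invented phosphorus readings and an assumed threshold (neither is from the study):

```python
import numpy as np

def fraction_exceeding(values, threshold):
    """Empirical-CDF estimate: the fraction of random sites (and, by
    inference, the stream population) exceeding a reference-condition
    threshold."""
    values = np.asarray(values, dtype=float)
    return np.mean(values > threshold)

# Hypothetical total phosphorus readings (mg/L) at random sites,
# against an assumed reference threshold of 0.075 mg/L.
tp = [0.03, 0.09, 0.12, 0.05, 0.20, 0.08]
frac = fraction_exceeding(tp, 0.075)  # 4 of 6 sites exceed
```

Because the sites were drawn at random, this sample fraction is an unbiased estimate of the proportion of total stream length in degraded condition.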
Are quantum-mechanical-like models possible, or necessary, outside quantum physics?
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2014-12-01
This article examines some experimental conditions that invite and possibly require recourse to quantum-mechanical-like mathematical models (QMLMs), models based on the key mathematical features of quantum mechanics, in scientific fields outside physics, such as biology, cognitive psychology, or economics. In particular, I consider whether the following two correlative features of quantum phenomena that were decisive for establishing the mathematical formalism of quantum mechanics play similarly important roles in QMLMs elsewhere. The first is the individuality and discreteness of quantum phenomena, and the second is the irreducibly probabilistic nature of our predictions concerning them, coupled to the particular character of the probabilities involved, as different from the character of probabilities found in classical physics. I also argue that these features could be interpreted in terms of a particular form of epistemology that suspends and even precludes a causal and, in the first place, realist description of quantum objects and processes. This epistemology limits the descriptive capacity of quantum theory to the description, classical in nature, of the observed quantum phenomena manifested in measuring instruments. Quantum mechanics itself only provides descriptions, probabilistic in nature, concerning numerical data pertaining to such phenomena, without offering a physical description of quantum objects and processes. While QMLMs share their use of the quantum-mechanical or analogous mathematical formalism, they may differ by the roles, if any, the two features in question play in them and by different ways of interpreting the phenomena they considered and this formalism itself. This article will address those differences as well.
NASA Astrophysics Data System (ADS)
Barrow, John D.; Davies, Paul C. W.; Harper, Charles L., Jr.
2004-06-01
This preview of the future of physics comprises contributions from recognized authorities inspired by the pioneering work of John Wheeler. Quantum theory represents a unifying theme within the book, as it relates to the topics of the nature of physical reality, cosmic inflation, the arrow of time, models of the universe, superstrings, quantum gravity and cosmology. Attempts to formulate a final unification theory of physics are also considered, along with the existence of hidden dimensions of space, hidden cosmic matter, and the strange world of quantum technology. John Archibald Wheeler is one of the most influential scientists of the twentieth century. His extraordinary career has spanned momentous advances in physics, from the birth of the nuclear age to the conception of the quantum computer. Famous for coining the term "black hole," Professor Wheeler helped lay the foundations for the rebirth of gravitation as a mainstream branch of science, triggering the explosive growth in astrophysics and cosmology that followed. His early contributions to physics include the S matrix, the theory of nuclear rotation (with Edward Teller), the theory of nuclear fission (with Niels Bohr), action-at-a-distance electrodynamics (with Richard Feynman), positrons as backward-in-time electrons, the universal Fermi interaction (with Jayme Tiomno), muonic atoms, and the collective model of the nucleus. His inimitable style of thinking, quirky wit, and love of the bizarre have inspired generations of physicists.
Reshaping Millennials as Future Leaders of the Marine Corps
2016-02-04
encourages values, beliefs, and responsibilities while simultaneously arousing physical and emotional excitation. He or she also aims to inspire... other races, religions, and sexual orientations while being interdependent on family, friends, and teachers.
Probabilistic Modeling of the Renal Stone Formation Module
NASA Technical Reports Server (NTRS)
Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.
2013-01-01
The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk for developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led the Human Research Program (HRP) to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate.
The model will be driven by Monte Carlo simulations, repeatedly sampling the probability distributions of the electrolyte concentrations and system parameters that serve as inputs to the deterministic model. The total urine chemistry concentrations are used to determine the urine chemistry activity using the Joint Expert Speciation System (JESS), a biochemistry model. Information from JESS is then fed into the deterministic growth model. Outputs from JESS and the deterministic model are passed back to the probabilistic model, where a multivariate regression is used to assess the likelihood of a stone forming and the likelihood of a stone requiring clinical intervention. The parameters used to quantify these risks include: relative supersaturation (RS) of calcium oxalate, citrate/calcium ratio, crystal number density, total urine volume, pH, magnesium excretion, maximum stone width, and ureteral location. Methods and Validation: The RSFM is designed to perform a Monte Carlo simulation to generate probability distributions of clinically significant renal stones, as well as provide an associated uncertainty in the estimate. Initially, early versions will be used to test integration of the components and to assess component validation and verification (V&V), with later versions used to address questions regarding design reference mission scenarios. Once integrated with the deterministic component, the credibility assessment of the integrated model will follow NASA-STD-7009 requirements.
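The coupling loop described above — sample input distributions, run the deterministic model, and accumulate an outcome likelihood — can be sketched as follows. The lognormal distributions, the `deterministic_growth` placeholder, and the width threshold are all illustrative assumptions, not the IMM's calibrated inputs, JESS, or the Kassemi et al. growth model:

```python
import numpy as np

rng = np.random.default_rng(42)

def deterministic_growth(calcium, oxalate):
    """Placeholder for the deterministic stone-growth model:
    a toy supersaturation-driven stone width (mm)."""
    return 2.0 * calcium * oxalate

def monte_carlo_risk(n=10000, width_threshold=3.0):
    """Sample the input distributions, drive the deterministic model,
    and estimate the likelihood of a clinically significant stone."""
    calcium = rng.lognormal(mean=0.0, sigma=0.3, size=n)
    oxalate = rng.lognormal(mean=0.2, sigma=0.3, size=n)
    widths = deterministic_growth(calcium, oxalate)
    return np.mean(widths > width_threshold)

p = monte_carlo_risk()
```

Repeating the whole estimate with resampled inputs would also yield the associated uncertainty on the likelihood, which is the second output the RSFM is designed to provide.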
ZERO: probabilistic routing for deploy and forget Wireless Sensor Networks.
Vilajosana, Xavier; Llosa, Jordi; Pacho, Jose Carlos; Vilajosana, Ignasi; Juan, Angel A; Vicario, Jose Lopez; Morell, Antoni
2010-01-01
As Wireless Sensor Networks are adopted by industry and agriculture for large-scale and unattended deployments, the need for reliable and energy-conservative protocols becomes critical. Physical- and link-layer efforts toward energy conservation are mostly not considered by routing protocols, which concentrate on maintaining reliability and throughput. Gradient-based routing protocols route data through the most reliable links, aiming to ensure 99% packet delivery. However, they suffer from the so-called "hot spot" problem: the most reliable routes exhaust their energy quickly, partitioning the network and reducing the monitored area. To cope with this "hot spot" problem we propose ZERO, a combined approach at the network and link layers that increases network lifespan while preserving reliability levels by means of probabilistic load balancing techniques.
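One simple form of probabilistic load balancing is to draw the next hop at random with probability proportional to a combined reliability/energy weight, instead of always picking the single most reliable neighbor. This sketch is a generic illustration of the technique, not the ZERO protocol itself; the weighting function is an assumption:

```python
import random

random.seed(7)

def pick_next_hop(neighbors):
    """Weighted random next-hop selection: `neighbors` maps a node id to
    (link_reliability, residual_energy).  Sampling in proportion to
    reliability * energy spreads traffic away from 'hot spot' nodes."""
    weights = {n: rel * energy for n, (rel, energy) in neighbors.items()}
    total = sum(weights.values())
    r = random.random() * total
    acc = 0.0
    for node, w in weights.items():
        acc += w
        if r <= acc:
            return node
    return node  # guard against floating-point rounding at the boundary

neighbors = {"A": (0.99, 0.2), "B": (0.90, 0.9), "C": (0.85, 0.8)}
hops = [pick_next_hop(neighbors) for _ in range(1000)]
```

Node A has the most reliable link but little energy left, so it receives only a small share of the traffic, while B and C carry most of it — the opposite of what a pure gradient-based protocol would do.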
Invention and History of the Bubble Chamber (LBNL Summer Lecture Series)
Glaser, Don [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
2018-01-12
Summer Lecture Series 2006: Don Glaser won the 1960 Nobel Prize for Physics for his 1952 invention of the bubble chamber at Berkeley Lab, a type of particle detector that became the mainstay of high-energy physics research throughout the 1960s and 1970s. He discusses how, inspired by bubbles in a glass of beer, he invented the bubble chamber and detected cosmic-ray muons.
Giving students a taste of research
NASA Astrophysics Data System (ADS)
Thoennessen, Michael
2008-02-01
When I was studying physics at the University of Cologne, Germany - admittedly a fairly long time ago - I once carried out an experiment that involved counting hundreds of pendulum oscillations. Using just a stopwatch to roughly measure the period of the oscillation we determined g, the gravitational acceleration on Earth. It was rote, tedious and only roughly accurate work. It was not an experience that inspired me to pursue a career in physics.
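The experiment described above rests on the small-angle pendulum relation T = 2π√(L/g), so g = 4π²L/T²; counting hundreds of oscillations reduces the stopwatch error on the period. A short worked example with invented numbers:

```python
import math

def g_from_pendulum(length_m, period_s):
    """Small-angle pendulum: T = 2*pi*sqrt(L/g), hence g = 4*pi^2*L / T^2."""
    return 4.0 * math.pi ** 2 * length_m / period_s ** 2

# Timing 100 oscillations of a 1 m pendulum (illustrative figures):
total_time = 200.5           # seconds for 100 swings
period = total_time / 100    # 2.005 s, with the timing error divided by 100
g = g_from_pendulum(1.0, period)
print(round(g, 2))  # -> 9.82
```

A half-second timing error over a single swing would be catastrophic; spread over 100 swings it shifts g by well under 1%.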
NASA Astrophysics Data System (ADS)
Hermans, Thomas; Nguyen, Frédéric; Caers, Jef
2015-07-01
In inverse problems, investigating uncertainty in the posterior distribution of model parameters is as important as matching data. In recent years, most efforts have focused on techniques to sample the posterior distribution at reasonable computational cost. Within a Bayesian context, this posterior depends on the prior distribution. However, most studies ignore modeling the prior with realistic geological uncertainty. In this paper, we propose a workflow inspired by a Popper-Bayes philosophy that data should first be used to falsify models, and only then be considered for matching. The workflow consists of three steps: (1) in defining the prior, we interpret multiple alternative geological scenarios from the literature (architecture of facies) and site-specific data (proportions of facies). Prior spatial uncertainty is modeled using multiple-point geostatistics, where each scenario is defined by a training image. (2) We validate these prior geological scenarios by simulating electrical resistivity tomography (ERT) data on realizations of each scenario and comparing them to field ERT in a lower dimensional space. In this second step, the idea is to probabilistically falsify scenarios with ERT, meaning that incompatible scenarios receive an updated probability of zero while compatible scenarios receive a nonzero updated belief. (3) We constrain the hydrogeological model with hydraulic head and ERT using a stochastic search method. The workflow is applied to a synthetic and a field case study in an alluvial aquifer. This study highlights the importance of considering and estimating prior uncertainty (without data) through a process of probabilistic falsification.
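Step (2) above, probabilistic falsification in a low-dimensional space, can be sketched as follows. The scenario names, the 2-D projection, and the distance threshold are invented for illustration; the paper compares simulated and field ERT responses after dimension reduction, which this toy stands in for:

```python
import numpy as np

def falsify_scenarios(field_point, scenario_samples, threshold):
    """A scenario whose simulated realizations (projected to a
    low-dimensional space) all lie farther than `threshold` from the
    field data gets updated probability zero; the rest share a
    renormalized nonzero belief by their compatible fraction."""
    scores = {}
    for name, samples in scenario_samples.items():
        d = np.linalg.norm(samples - field_point, axis=1)
        scores[name] = np.mean(d < threshold)   # fraction compatible
    total = sum(scores.values())
    return {n: (s / total if total else 0.0) for n, s in scores.items()}

# Two hypothetical geological scenarios projected to 2-D:
rng = np.random.default_rng(1)
scenarios = {
    "braided": rng.normal([0, 0], 0.5, size=(50, 2)),
    "meandering": rng.normal([5, 5], 0.5, size=(50, 2)),
}
beliefs = falsify_scenarios(np.array([0.2, -0.1]), scenarios, threshold=1.5)
```

Here the field point sits among the "braided" realizations, so "meandering" is falsified (updated probability zero) before any expensive data matching is attempted.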
Mineral-Enhanced Polyacrylic Acid Hydrogel as an Oyster-Inspired Organic-Inorganic Hybrid Adhesive.
Li, Ang; Jia, Yunfei; Sun, Shengtong; Xu, Yisheng; Minsky, Burcu Baykal; Stuart, M A Cohen; Cölfen, Helmut; von Klitzing, Regine; Guo, Xuhong
2018-03-28
Underwater adhesion is crucial to many marine life forms living a sedentary lifestyle. Among them, mussel adhesion has been the most studied, inspiring numerous investigations of 3,4-dihydroxyphenylalanine (DOPA)-based organic adhesives. In contrast, reef-building oysters represent another important "inorganic" strategy of marine molluscs for adhesion, generating biomineralized organic-inorganic adhesives; this strategy is still rarely studied, and no synthetic analogues have been reported so far. Here, a novel type of oyster-inspired organic-inorganic adhesive based on a biomineralized polyelectrolyte hydrogel is reported, consisting of polyacrylic acid physically cross-linked by very small amorphous calcium carbonate nanoparticles (<3 nm). The mineral-enhanced polyelectrolyte hydrogel adhesive is shown to be injectable, reusable, and optically clear upon curing in air. Moreover, adhesion performance comparable to that of DOPA-based adhesives is found for the hydrogel adhesive in both dry and wet conditions, and it can be further enhanced by introducing a small amount of a second, larger cross-linker such as negatively charged nanoparticles. The present mineral hydrogel represents a new type of bio-inspired organic-inorganic adhesive that may find a variety of potential applications in adhesive chemistry.
Information Measures for Multisensor Systems
2013-12-11
permuted to generate spectra that were non-physical but preserved the entropy of the source spectra. Another 1000 spectra were constructed to mimic co... Research Laboratory (NRL) has yielded probabilistic models for spectral data that enable the computation of information measures such as entropy and... Keywords: chemical sensing; information theory; spectral data; information entropy; information divergence; mass spectrometry; infrared spectroscopy; multisensor
Critical voids in exposure data and models lead risk assessors to rely on conservative assumptions. Risk assessors and managers need improved tools beyond the screening level analysis to address aggregate exposures to pesticides as required by the Food Quality Protection Act o...
Marbles: A Means of Introducing Students to Scattering Concepts
ERIC Educational Resources Information Center
Bender, K. M.; Westphal, P. S.; Ramsier, R. D.
2008-01-01
The purpose of this activity is to introduce students to concepts of short-range and long-range scattering, and engage them in using indirect measurements and probabilistic models. The activity uses simple and readily available apparatus, and can be adapted for use with secondary level students as well as those in general physics courses or…
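A common classroom analysis for such an activity infers target size from collision counts alone: rolling marbles of the same diameter d collide with a target when their centers pass within d, so each of n targets presents an effective width of 2d across a board of width W, giving P(hit) ≈ 2nd/W. This sketch uses invented numbers and is one plausible version of the analysis, not necessarily the article's:

```python
def estimate_diameter(width_cm, n_targets, trials, hits):
    """Indirect size measurement from scattering statistics:
    P(hit) ~ n * 2d / W, solved for the marble diameter d."""
    p_hit = hits / trials
    return p_hit * width_cm / (2 * n_targets)

# Hypothetical run: 200 rolls across a 60 cm board with 5 target marbles.
d = estimate_diameter(60.0, 5, 200, 50)
print(d)  # -> 1.5 (cm)
```

Students never measure the marbles directly; the diameter emerges from the hit probability, which is exactly the logic of cross-section measurements in scattering experiments.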
Probabilistic Solution of Inverse Problems.
1985-09-01
Office of Naval Research, Information Systems. This report describes research done within the Laboratory for Information and Decision Systems and the Artificial Intelligence Laboratory at the Massachusetts... analysis of systems endowed with perceptual abilities is the construction of internal representations of the physical structures in the external world
Interference in the classical probabilistic model and its representation in complex Hilbert space
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei Yu.
2005-10-01
The notion of a context (a complex of physical conditions, that is to say, a specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: "first context and only then probability". We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a "prespace". The basic condition for representing the prespace dynamics is the law of statistical conservation of energy (conservation of probabilities). In general the Hilbert space projection of the "prespace" dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can also be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.
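The contextual interference of probabilities can be made concrete for a dichotomous observable a with values a1, a2 and a context C; as a sketch, in the usual presentation of this framework (the notation here may differ from the paper's):

```latex
P(b\,|\,C) \;=\; P(a_1\,|\,C)\,P(b\,|\,a_1) \;+\; P(a_2\,|\,C)\,P(b\,|\,a_2)
\;+\; 2\cos\theta\,\sqrt{P(a_1\,|\,C)\,P(b\,|\,a_1)\,P(a_2\,|\,C)\,P(b\,|\,a_2)}
```

When \(\cos\theta = 0\) this reduces to the classical formula of total probability; a nonzero interference term is what the complex-Hilbert-space representation encodes, with \(\theta\) becoming the phase of the complex probability amplitude via Born's rule.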
Perks of Tracking Your Workout Progress
Monitoring your activity is a good way to know whether you’re reaching your goals and can inspire you to set new ones! Buy a pedometer or download an app or other tools to help you keep track of your physical activity goals and progress.
Global projects and Astronomy awareness activities in Nepal
NASA Astrophysics Data System (ADS)
Gautam, Suman
2015-08-01
Modern astronomy is a crowning achievement of human civilization, one that inspires teenagers to choose careers in science and technology and is a staple of adult education. It is a unique and cost-effective tool for furthering sustainable global development because of its technological, scientific, and cultural dimensions, which allow us to reach a large portion of the community, interact with children, and inspire them with our wonderful cosmos. Using astronomy to stimulate quality, inspiring education for disadvantaged children has been an important goal of the Nepal Astronomical Society (NASO) since its inception. NASO carries out various awareness activities on its own and in collaboration with national and international organizations such as the Central Department of Physics, Tribhuvan University (TU); the International Astronomical Union (IAU); the Department of Physics, Prithvi Narayan Campus, Pokhara; the Nepal Academy of Science and Technology (NAST); Global Hands-on Universe (GHOU); EU-UNAWE; and the Pokhara Astronomical Society (PAS) to disseminate these activities to school children and teachers in Nepal. We describe our experiences working with kids, students, teachers, and the public in universe-awareness activities for school children that use practical approaches to demystify abstract astronomy concepts, along with projects such as astronomy for visually impaired students, the Galileo Teacher Training Program, and the International School for Young Astronomers (ISYA), which we believe play a vital role in promoting astronomy and space science activities in Nepal.
Surrogate modeling of joint flood risk across coastal watersheds
NASA Astrophysics Data System (ADS)
Bass, Benjamin; Bedient, Philip
2018-03-01
This study discusses the development and performance of a rapid prediction system capable of representing the joint rainfall-runoff and storm surge flood response of tropical cyclones (TCs) for probabilistic risk analysis. Due to the computational demand required for accurately representing storm surge with the high-fidelity ADvanced CIRCulation (ADCIRC) hydrodynamic model and its coupling with additional numerical models to represent rainfall-runoff, a surrogate or statistical model was trained to represent the relationship between hurricane wind- and pressure-field characteristics and the peak joint flood response typically determined from physics-based numerical models. This builds upon past studies that have only evaluated surrogate models for predicting peak surge, and provides the first system capable of probabilistically representing joint flood levels from TCs. The utility of this joint flood prediction system is then demonstrated by improving upon probabilistic TC flood risk products, which currently account for storm surge but do not take into account TC-associated rainfall-runoff. Results demonstrate the source apportionment of rainfall-runoff versus storm surge and highlight that slight increases in flood risk levels may occur due to the interaction between rainfall-runoff and storm surge as compared to the Federal Emergency Management Agency's (FEMA's) current practices.
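The surrogate idea is to fit a cheap statistical map from storm characteristics to the peak flood response that the expensive coupled models would compute. A minimal sketch with a linear least-squares fit and a synthetic training set standing in for ADCIRC plus rainfall-runoff runs; the predictor variables, their ranges, and the linear form are all assumptions (real surrogates often use kriging or neural networks):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training set: TC characteristics -> peak flood level (m).
# Columns: pressure deficit (mb), radius of max winds (km),
# forward speed (m/s), basin-average rainfall (mm).
X = rng.uniform([20, 20, 2, 50], [120, 80, 12, 400], size=(200, 4))
y = (0.03 * X[:, 0] + 0.01 * X[:, 1] - 0.05 * X[:, 2] + 0.004 * X[:, 3]
     + rng.normal(0, 0.1, 200))   # stand-in for the physics-model outputs

# Fit the linear surrogate by least squares (intercept via a ones column).
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_flood(storm):
    """Evaluate the surrogate for one storm parameter vector."""
    return np.append(storm, 1.0) @ coef

peak = predict_flood(np.array([80.0, 45.0, 6.0, 250.0]))
```

Once trained, the surrogate evaluates in microseconds, which is what makes Monte Carlo sampling over thousands of synthetic storms feasible for probabilistic risk products.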
Probabilistic description of probable maximum precipitation
NASA Astrophysics Data System (ADS)
Ben Alaya, Mohamed Ali; Zwiers, Francis W.; Zhang, Xuebin
2017-04-01
Probable Maximum Precipitation (PMP) is the key parameter used to estimate the Probable Maximum Flood (PMF). PMP and PMF are important for dam safety and civil engineering purposes. Even though current knowledge of storm mechanisms remains insufficient to properly evaluate limiting values of extreme precipitation, PMP estimation methods are still based on deterministic considerations and give only single values. This study aims to provide a probabilistic description of the PMP based on the commonly used method, so-called moisture maximization. To this end, a probabilistic bivariate extreme-values model is proposed to address the limitations of traditional PMP estimates via moisture maximization, namely: (i) the inability to evaluate uncertainty or to provide a range of PMP values, (ii) the interpretation of the maximum of a data series as a physical upper limit, and (iii) the assumption that a PMP event has maximum moisture availability. Results from simulation outputs of the Canadian Regional Climate Model CanRCM4 over North America reveal the high uncertainties inherent in PMP estimates and the non-validity of the assumption that PMP events have maximum moisture availability. The latter assumption leads to overestimation of the PMP by an average of about 15% over North America, which may have serious implications for engineering design.
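The deterministic baseline the paper criticizes, moisture maximization, scales each observed storm's depth by the ratio of the climatological maximum precipitable water to the storm's own precipitable water and takes the largest result. A minimal sketch with invented storm data:

```python
import numpy as np

def pmp_moisture_maximization(precip, pw, pw_max):
    """Traditional moisture maximization: scale each storm's depth by
    pw_max / pw and take the maximum -- a single deterministic value,
    which is exactly what the paper replaces with a distribution."""
    maximized = np.asarray(precip) * (pw_max / np.asarray(pw))
    return maximized.max()

# Illustrative storm sample: depths (mm) and precipitable water (mm).
precip = [80.0, 120.0, 95.0]
pw = [30.0, 45.0, 25.0]
pmp = pmp_moisture_maximization(precip, pw, pw_max=50.0)
print(pmp)  # -> 190.0
```

The single output carries no uncertainty and embeds assumptions (ii) and (iii) above; the proposed bivariate extreme-value model instead treats both the storm efficiency and the moisture availability as random variables.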
Anisotropic connectivity implements motion-based prediction in a spiking neural network.
Kaplan, Bernhard A; Lansner, Anders; Masson, Guillaume S; Perrinet, Laurent U
2013-01-01
Predictive coding hypothesizes that the brain explicitly infers upcoming sensory input to establish a coherent representation of the world. Although it is becoming generally accepted, it is not clear on which level spiking neural networks may implement predictive coding and what function their connectivity may have. We present a network model of conductance-based integrate-and-fire neurons inspired by the architecture of retinotopic cortical areas that assumes predictive coding is implemented through network connectivity, namely in the connection delays and in selectiveness for the tuning properties of source and target cells. We show that the applied connection pattern leads to motion-based prediction in an experiment tracking a moving dot. In contrast to our proposed model, a network with random or isotropic connectivity fails to predict the path when the moving dot disappears. Furthermore, we show that a simple linear decoding approach is sufficient to transform neuronal spiking activity into a probabilistic estimate for reading out the target trajectory.
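The "simple linear decoding" mentioned at the end can be illustrated by a population-vector readout: the estimated target position is the spike-count-weighted average of the cells' preferred positions. This is a generic sketch of linear decoding, not the paper's specific decoder; the tuning values are invented:

```python
import numpy as np

def linear_decode(spike_counts, preferred_positions):
    """Population-vector readout: the spike-count-weighted average of
    the cells' preferred positions estimates the stimulus position."""
    spike_counts = np.asarray(spike_counts, dtype=float)
    return (spike_counts @ preferred_positions) / spike_counts.sum()

# Four cells tuned to positions along a line (arbitrary units):
prefs = np.array([0.0, 1.0, 2.0, 3.0])
counts = np.array([1.0, 4.0, 4.0, 1.0])   # activity bump centered at 1.5
pos = linear_decode(counts, prefs)
print(pos)  # -> 1.5
```

Because the readout is linear in the spike counts, it can be applied to the network's probabilistic activity directly, turning the population response into a trajectory estimate even while the dot is occluded.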
Single walled carbon nanotube-based stochastic resonance device with molecular self-noise source
NASA Astrophysics Data System (ADS)
Fujii, Hayato; Setiadi, Agung; Kuwahara, Yuji; Akai-Kasaya, Megumi
2017-09-01
Stochastic resonance (SR) is a mechanism for small-signal sensing that exploits intrinsic noise and is found in various living creatures. Such noise-enhanced signal transmission and detection, probabilistic but low-power, has not been exploited in modern electronics. We demonstrated SR in a summing network based on a single-walled carbon nanotube (SWNT) device that detects small subthreshold signals with very low current flow. The nonlinear current-voltage characteristics of this SWNT device, which incorporated Cr electrodes, were used as the threshold level of signal detection. The adsorption of redox-active polyoxometalate molecules on the SWNTs generated additional noise, which was utilized as a self-noise source. To form a summing-network SR device, a large number of SWNTs were aligned parallel to each other between the electrodes, which increased the signal detection ability. The functional capabilities of the present small-size summing-network SR device, which relies on dense nanomaterials and exploits intrinsic spontaneous noise at room temperature, offer a glimpse of future bio-inspired electronic devices.
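The enabling effect at the heart of SR — a subthreshold signal crossing a hard threshold only with the help of noise — can be sketched numerically. The hard threshold below is a toy stand-in for the SWNT device's nonlinear I-V characteristic; true SR additionally shows a non-monotonic dependence on noise level, which this sketch does not explore:

```python
import numpy as np

rng = np.random.default_rng(5)

def detection_rate(signal, noise_std, threshold=1.0, trials=2000):
    """Fraction of trials in which a subthreshold signal plus Gaussian
    noise crosses a hard threshold (the device nonlinearity)."""
    noise = rng.normal(0.0, noise_std, trials)
    return np.mean(signal + noise > threshold)

r0 = detection_rate(0.8, 0.0)   # subthreshold, no noise: never detected
r1 = detection_rate(0.8, 0.3)   # added noise enables detection
print(r0)  # -> 0.0
```

Summing many such noisy threshold units in parallel, as the SWNT network does, averages out the randomness of individual crossings and makes the detected rate track the input signal.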
Foundational perspectives on causality in large-scale brain networks
NASA Astrophysics Data System (ADS)
Mannino, Michael; Bressler, Steven L.
2015-12-01
A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. 
Typically, these techniques quantify the statistical likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain.
The material co-construction of hard science fiction and physics
NASA Astrophysics Data System (ADS)
Hasse, Cathrine
2015-12-01
This article explores the relationship between hard science fiction, physics and a gendered culture of science. Empirical studies indicate that science fiction references might spur some students' interest in physics and help develop this interest throughout school, into a university education, and even later inspire the practice of doing science. There are many kinds of fiction within the science fiction genre. In the empirical exploration presented here, physics students seem particularly fond of what is called 'hard science fiction': a type of science fiction dealing with technological developments (Hartwell and Cramer in The hard SF renaissance, Orb/TOR, New York, 2002). As a motivating fantasy, however, hard science fiction may also come with a gender bias. The locally materialized techno-fantasies spurring dreams of terraforming planets like Mars and of travel in time and space may not be shared by all physics students. Female students in particular express a need for other concerns in science. The entanglement of physics with hard science fiction may thus help develop some students' interest in learning school physics and help create an interest in studying physics at university level. But research indicates that female students in particular are not captured by the hard techno-fantasies to the same extent as some of their male colleagues, and other visions (e.g. those inspired by soft science fiction) are not materialized as a resource in the local educational culture. This calls for an argument that teaching science is also teaching cultural values, ethics and concerns, which may be gendered. Teaching materials, like the use of hard science fiction in education, may not just be (yet another) gender bias in science education but also a carrier of particular visions for scientific endeavours.
Mixed Signals: The Impact of International Administration on Kosovo’s Independence
2010-12-01
Intelligent Computational Systems. Opening Remarks: CFD Application Process Workshop
NASA Technical Reports Server (NTRS)
VanDalsem, William R.
1994-01-01
This discussion will include a short review of the challenges that must be overcome if computational physics technology is to have a larger impact on the design cycles of U.S. aerospace companies. Some of the potential solutions to these challenges may come from the information sciences fields. A few examples of potential computational physics/information sciences synergy will be presented, as motivation and inspiration for the Improving The CFD Applications Process Workshop.
Post-Fisherian Experimentation: From Physical to Virtual
Jeff Wu, C. F.
2014-04-24
Fisher's pioneering work in design of experiments has inspired further work with broader applications, especially in industrial experimentation. Three topics in physical experiments are discussed: the principles of effect hierarchy, sparsity, and heredity for factorial designs; a new method called CME for de-aliasing aliased effects; and robust parameter design. The recent emergence of virtual experiments on a computer is reviewed. Here, some major challenges in computer experiments, which must go beyond Fisherian principles, are outlined.
Hashim, Iza Husna Mohamad; Kumamoto, Shogo; Takemura, Kenjiro; Maeno, Takashi; Okuda, Shin; Mori, Yukio
2017-11-11
Tactile sensation is one type of valuable feedback in evaluating a product. Conventionally, sensory evaluation is used to obtain direct subjective responses from consumers in order to improve a product's quality. However, this method is time-consuming and costly. Therefore, this paper proposes a novel tactile evaluation system that can give tactile feedback from a sensor's output. The main concept of this system is a hierarchical layering of tactile sensation, inspired by the flow of human perception. Tactile sensation is classified from low-order tactile sensation (LTS) to high-order tactile sensation (HTS), and then to preference. Here, LTS is correlated with physical measures. Furthermore, the physical measures used to correlate with LTS are selected based on four main aspects of haptic information (roughness, compliance, coldness, and slipperiness), which are perceived through human tactile receptors. Using statistical analysis, the correlation between each level of the hierarchy was obtained, and preference was derived in terms of physical measures. A verification test was conducted using unknown samples to determine the reliability of the system. The results showed that the developed system was capable of estimating preference with an accuracy of approximately 80%.
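As a hypothetical illustration of one layer of such a hierarchy (the variable names, sample sizes and numbers here are invented, not the paper's data), a physical measure can be related to a low-order tactile rating by simple least-squares regression, after which the fitted ratings would feed the next layer up toward preference:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical single layer of the hierarchy: relate a low-order tactile
# rating (e.g. perceived roughness) to one physical measure (e.g. surface
# profile amplitude) by least squares.
amplitude = rng.uniform(1.0, 50.0, 30)                 # physical measure, 30 samples
roughness = 0.08 * amplitude + rng.normal(0, 0.3, 30)  # synthetic LTS rating
slope, intercept = np.polyfit(amplitude, roughness, 1)  # linear fit
r = np.corrcoef(amplitude, roughness)[0, 1]             # strength of the link
print(round(slope, 2), round(r, 2))
```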
NASA Astrophysics Data System (ADS)
Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun
2018-03-01
Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models depend heavily on variables such as cohesion and internal friction angle, which are affected by a high degree of uncertainty, especially at regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more usefully presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, and the results are integrated into a single parameter. This parameter links the landslide probability to the uncertainties of the soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with the heavy rainfalls of 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated efficiently and yields reliable results, owing to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
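The per-pixel Monte Carlo test of Fs < 1 can be sketched as follows. The dry infinite-slope safety-factor formula and all parameter intervals below are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def landslide_probability(slope_deg, depth, unit_weight, c_range, phi_range, n=10_000):
    """Estimate P(Fs < 1) for one pixel by Monte Carlo sampling of the
    uncertain soil parameters (dry infinite-slope model, illustrative only)."""
    beta = np.radians(slope_deg)
    c = rng.uniform(*c_range, n)                   # cohesion [kPa], sampled in its interval
    phi = np.radians(rng.uniform(*phi_range, n))   # friction angle, sampled in its interval
    fs = c / (unit_weight * depth * np.sin(beta) * np.cos(beta)) \
         + np.tan(phi) / np.tan(beta)              # infinite-slope safety factor
    return np.mean(fs < 1.0)                       # fraction of failing realizations

p = landslide_probability(slope_deg=35, depth=2.0, unit_weight=18.0,
                          c_range=(2.0, 10.0), phi_range=(25.0, 35.0))
print(round(p, 2))
```

The returned fraction is the single probability parameter that replaces a deterministic Fs warning for that pixel.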
NASA Astrophysics Data System (ADS)
Zunino, Andrea; Mosegaard, Klaus
2017-04-01
Sought-after reservoir properties are linked only indirectly to the observable geophysical data recorded at the earth's surface. In this framework, seismic data represent one of the most reliable tools to study the structure and properties of the subsurface for natural resources. Nonetheless, seismic analysis is not an end in itself, as physical properties such as porosity are often of more interest for reservoir characterization. As such, inference of those properties implies also taking into account rock physics models linking porosity and other physical properties to elastic parameters. In the framework of seismic reflection data, we address this challenge for a reservoir target zone employing a probabilistic method characterized by a multi-step, complex nonlinear forward modeling that combines: 1) a rock physics model with 2) the solution of the full Zoeppritz equations and 3) a convolutional seismic forward modeling. The target property of this work is porosity, which is inferred using a Monte Carlo approach where porosity models, i.e., solutions to the inverse problem, are directly sampled from the posterior distribution. From a theoretical point of view, the Monte Carlo strategy can be particularly useful in the presence of nonlinear forward models, which is often the case when employing sophisticated rock physics models and full Zoeppritz equations, and it also allows the related uncertainty to be estimated. However, the resulting computational challenge is huge. We propose to alleviate this computational burden by assuming some smoothness of the subsurface parameters and consequently parameterizing the model in terms of spline bases. This allows us a certain flexibility, in that the number of spline bases, and hence the resolution in each spatial direction, can be controlled. The method is tested on a 3-D synthetic case and on a 2-D real data set.
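The dimension-reduction idea behind the spline parameterization can be illustrated with a minimal one-dimensional sketch (the basis count, knot placement and porosity values are assumptions for illustration, not the authors' setup): a densely sampled porosity profile is controlled by only a handful of spline coefficients, which is what shrinks the Monte Carlo search space.

```python
import numpy as np
from scipy.interpolate import BSpline

# A 200-sample porosity-depth profile controlled by only 8 free coefficients.
depth = np.linspace(0.0, 100.0, 200)   # 200 depth samples
k = 3                                  # cubic splines
n_coef = 8                             # number of free Monte Carlo parameters
# Clamped knot vector: len(t) must equal n_coef + k + 1 = 12
knots = np.concatenate([[0.0] * k,
                        np.linspace(0.0, 100.0, n_coef - k + 1),
                        [100.0] * k])
coef = np.array([0.05, 0.10, 0.22, 0.18, 0.25, 0.12, 0.08, 0.15])  # porosity values
profile = BSpline(knots, coef, k)(depth)   # smooth model evaluated at every depth
print(profile.shape)
```

A sampler would perturb only `coef` (8 numbers) rather than 200 independent porosity values, at the cost of enforcing smoothness.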
NASA Astrophysics Data System (ADS)
Mert, A.
2016-12-01
The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all earthquakes of considerable magnitude. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulations, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), i.e., synthetic seismograms calculated by an explicit 2D/3D elastic finite-difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all magnitudes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazard levels for all studied sites in the Marmara region.
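The hazard-integration step common to any PSHA, whether ground motions come from attenuation relationships or from simulated seismograms, can be sketched as follows. The scenario PGAs and annual rates below are synthetic placeholders, not the PIF simulation results, and Poissonian event occurrence is assumed:

```python
import numpy as np

def hazard_curve(pga_sims, annual_rates, pga_levels):
    """Annual rate of exceeding each PGA level, summed over rupture scenarios.
    pga_sims: one representative PGA per scenario; annual_rates: each
    scenario's rate of occurrence. All values here are illustrative."""
    exceed = (pga_sims[:, None] >= pga_levels[None, :]).astype(float)
    return annual_rates @ exceed          # total exceedance rate per level

pga_sims = np.array([0.05, 0.12, 0.30, 0.55])   # g, one value per scenario
rates = np.array([0.02, 0.01, 0.004, 0.001])    # events per year
levels = np.array([0.1, 0.2, 0.4])              # PGA thresholds of interest
lam = hazard_curve(pga_sims, rates, levels)
p50yr = 1.0 - np.exp(-lam * 50.0)               # Poisson exceedance prob. in 50 yr
print(p50yr.round(3))
```

The familiar "2% / 10% in 50 years" hazard levels are read off curves like `p50yr` by interpolating the threshold at which the probability crosses the target value.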
Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model
NASA Astrophysics Data System (ADS)
Anderson, K. R.
2016-12-01
Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. 
Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics-based, mixed deterministic-probabilistic eruption forecasting approach in reducing and quantifying these uncertainties.
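A minimal sketch of such a mixed deterministic-probabilistic forecast follows: the eruption-rate evolution during an event is deterministic (an assumed exponential decay), while the event's end time is resampled from previously observed durations. Every number below, including the decay constant and the past-event durations, is invented for illustration and is not from the Kīlauea study:

```python
import numpy as np

rng = np.random.default_rng(1)

def forecast_erupted_volume(t_future, q0=2.0, tau=8.0,
                            past_durations=(6.0, 9.0, 12.0, 7.0, 10.0),
                            n=5000):
    """Samples of cumulative erupted volume at time t_future (hours).
    Deterministic part: rate decays as q0*exp(-t/tau) during the event.
    Stochastic part: event duration drawn from past observed durations."""
    durations = rng.choice(past_durations, size=n)   # resample past events
    t_end = np.minimum(t_future, durations)          # event may end before t_future
    # Volume while rate decays: integral of q0*exp(-t/tau) from 0 to t_end
    vol = q0 * tau * (1.0 - np.exp(-t_end / tau))
    # After the event ends, rate is assumed to recover to q0
    vol += q0 * np.maximum(0.0, t_future - durations)
    return vol

samples = forecast_erupted_volume(t_future=24.0)
print(np.percentile(samples, [5, 50, 95]).round(1))
```

The percentile spread plays the role of the probability distributions described in the abstract: a forecast is a distribution over erupted volume, not a single number.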
Intelligent detectors modelled from the cat's eye
NASA Astrophysics Data System (ADS)
Lindblad, Th.; Becanovic, V.; Lindsey, C. S.; Szekely, G.
1997-02-01
Biologically inspired image/signal processing, in particular neural networks like the Pulse-Coupled Neural Network (PCNN), are revisited. Their use with high granularity high-energy physics detectors, as well as optical sensing devices, for filtering, de-noising, segmentation, object isolation and edge detection is discussed.
Potyrailo, Radislav A.; Bonam, Ravi K.; Hartley, John G.; Starkey, Timothy A.; Vukusic, Peter; Vasudev, Milana; Bunning, Timothy; Naik, Rajesh R.; Tang, Zhexiong; Palacios, Manuel A.; Larsen, Michael; Le Tarte, Laurie A.; Grande, James C.; Zhong, Sheng; Deng, Tao
2015-01-01
Combining vapour sensors into arrays is an accepted compromise to mitigate poor selectivity of conventional sensors. Here we show individual nanofabricated sensors that not only selectively detect separate vapours in pristine conditions but also quantify these vapours in mixtures, and when blended with a variable moisture background. Our sensor design is inspired by the iridescent nanostructure and gradient surface chemistry of Morpho butterflies and involves physical and chemical design criteria. The physical design involves optical interference and diffraction on the fabricated periodic nanostructures and uses optical loss in the nanostructure to enhance the spectral diversity of reflectance. The chemical design uses spatially controlled nanostructure functionalization. Thus, while quantitation of analytes in the presence of variable backgrounds is challenging for most sensor arrays, we achieve this goal using individual multivariable sensors. These colorimetric sensors can be tuned for numerous vapour sensing scenarios in confined areas or as individual nodes for distributed monitoring. PMID:26324320
Investigating Near Space Interaction Regions: Developing a Remote Observatory
NASA Astrophysics Data System (ADS)
Gallant, M.; Mierkiewicz, E. J.; Oliversen, R. J.; Jaehnig, K.; Percival, J.; Harlander, J.; Englert, C. R.; Kallio, R.; Roesler, F. L.; Nossal, S. M.; Gardner, D.; Rosborough, S.
2016-12-01
The Investigating Near Space Interaction Regions (INSpIRe) effort will (1) establish an adaptable research station capable of contributing to terrestrial and planetary aeronomy; (2) integrate two state-of-the-art second-generation Fabry-Perot (FP) and Spatial Heterodyne Spectrometers (SHS) into a remotely operable configuration; (3) deploy this instrumentation to a clear-air site, establishing a stable, well-calibrated observatory; and (4) embark on a series of observations designed to contribute to three major areas of geocoronal research: geocoronal physics, structure/coupling, and variability. This poster describes the development of the INSpIRe remote observatory. Based at Embry-Riddle Aeronautical University (ERAU), the INSpIRe initiative provides a platform to encourage the next generation of researchers to apply knowledge gained in the classroom to real-world science and engineering. Students at ERAU contribute to the INSpIRe effort's hardware and software needs. Mechanical/optical systems are being designed to bring light to any of four instruments. Control software is being developed to allow remote users to control everything from dome and optical system operations to calibration and data collection. In April 2016, we also installed and tested our first science instrument in the INSpIRe trailer, the Redline DASH Demonstration Instrument (REDDI). REDDI uses Doppler Asymmetric Spatial Heterodyne (DASH) spectroscopy, and its deployment as part of INSpIRe is a collaborative research effort between the Naval Research Lab, St. Cloud State University, and ERAU. Similar to a stepped Michelson device, REDDI measures oxygen (630.0 nm) winds from the thermosphere. REDDI is currently mounted in a temporary location under INSpIRe's main siderostat until its entrance optical system can be modified.
First light tests produced good signal-to-noise fringes in ten minute integrations, indicating that we will soon be able to measure thermospheric winds from our Daytona Beach testing site. Future work will involve installation and software integration of FP and SHS systems and the Embry-Riddle Instrument Control System. The INSpIRe project is funded through NSF-CAREER award AGS135231 and the NASA Planetary Solar System Observations Program. The REDDI instrument was supported by the Chief of Naval Research.
Probabilistic Learning in Junior High School: Investigation of Student Probabilistic Thinking Levels
NASA Astrophysics Data System (ADS)
Kurniasih, R.; Sujadi, I.
2017-09-01
This paper investigated students' probabilistic thinking levels, i.e., their levels of thinking about uncertainty in probability material. The subjects were 8th-grade junior high school students. The main instrument was the researcher, supported by a probabilistic thinking skills test and interview guidelines. Data were analyzed using the triangulation method. The results showed that before instruction the students' probabilistic thinking was at the subjective and transitional levels; after instruction, it changed, with some 8th-grade students reaching the numerical level, the highest of the levels. Students' probabilistic thinking levels can be used as a reference for designing learning materials and strategies.
Demetzos, Costas
2015-06-01
Biophysics and thermodynamics are considered scientific milestones for investigating the properties of materials. The relationship between changes of temperature and the biophysical variables of biomaterials is important in the development of drug delivery systems. Biophysics is a challenging sector of physics and should be used complementarily with biochemistry in order to discover new and promising technological platforms (i.e., drug delivery systems) and to disclose the 'silent functionality' of bio-inspired biological and artificial membranes. Thermal analysis and biophysical approaches in pharmaceutics provide reliable and versatile tools for the characterization and successful development of pharmaceutical products. The metastable phases of self-assembled nanostructures such as liposomes should be taken into consideration because they represent thermal events that can affect the functionality of advanced drug delivery nanosystems. In conclusion, biophysics and thermodynamics can be characterized as the building blocks for the design and development of bio-inspired drug delivery systems.
Predicting origami-inspired programmable self-folding of hydrogel trilayers
NASA Astrophysics Data System (ADS)
An, Ning; Li, Meie; Zhou, Jinxiong
2016-11-01
Imitating origami principles in active or programmable materials opens the door to the development of origami-inspired self-folding structures for not only aesthetic but also functional purposes. A variety of self-folding structures enabled by programmable materials have been demonstrated across various fields and scales. These folding structures have finite thickness, and the mechanical properties of the active materials dictate the folding process. Yet formalizing origami rules for use in computer modeling has been challenging, owing to the zero-thickness assumption and the exclusion of mechanical properties in current models. Here, we describe a physics-based finite element simulation scheme to predict programmable self-folding of temperature-sensitive hydrogel trilayers. Patterning creases and assigning mountain or valley folds are highlighted for complex origami such as the folding of Randlett's flapping bird and the crane. Our efforts enhance the understanding and facilitate the design of origami-inspired self-folding structures, broadening the realization and application of reconfigurable structures.
Sensitivity to Uncertainty in Asteroid Impact Risk Assessment
NASA Astrophysics Data System (ADS)
Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.
2015-12-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
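The probabilistic sampling step that feeds the physics-based damage analyses can be sketched as follows. The distributions and ranges below are generic illustrative assumptions about asteroid populations, not the ERA model's inputs:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_impact_energies(n=100_000):
    """Draw asteroid property samples and convert them to impact energies.
    Diameter from a heavy-tailed (power-law-like) distribution, density and
    entry velocity from plausible uniform ranges; all values are assumptions."""
    diameter = 20.0 * rng.pareto(1.7, n) + 20.0   # meters, minimum 20 m
    density = rng.uniform(1500.0, 3500.0, n)      # kg/m^3, rubble pile to stony
    velocity = rng.uniform(12e3, 25e3, n)         # m/s, typical entry speeds
    mass = density * (np.pi / 6.0) * diameter**3  # kg, spherical body
    energy_j = 0.5 * mass * velocity**2           # kinetic energy, joules
    return energy_j / 4.184e15                    # convert to megatons of TNT

e_mt = sample_impact_energies()
```

Each sampled energy would then be pushed through entry, breakup and damage-area models; the heavy tail of the diameter distribution is what makes the rare large impacts dominate aggregate risk.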
Operational formulation of time reversal in quantum theory
NASA Astrophysics Data System (ADS)
Oreshkov, Ognyan; Cerf, Nicolas J.
2015-10-01
The symmetry of quantum theory under time reversal has long been a subject of controversy because the transition probabilities given by Born’s rule do not apply backward in time. Here, we resolve this problem within a rigorous operational probabilistic framework. We argue that reconciling time reversal with the probabilistic rules of the theory requires a notion of operation that permits realizations through both pre- and post-selection. We develop the generalized formulation of quantum theory that stems from this approach and give a precise definition of time-reversal symmetry, emphasizing a previously overlooked distinction between states and effects. We prove an analogue of Wigner’s theorem, which characterizes all allowed symmetry transformations in this operationally time-symmetric quantum theory. Remarkably, we find larger classes of symmetry transformations than previously assumed, suggesting a possible direction in the search for extensions of known physics.
NASA Astrophysics Data System (ADS)
Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi
2017-08-01
Formulation of mathematical learning goals is now oriented not only toward cognitive products but also toward cognitive processes, including probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at a higher level. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by a test of mathematical ability for high math ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Data credibility was established through time triangulation. The results showed that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level, indicating that the boy's level of probabilistic thinking was higher than the girl's. These results could help curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.
NASA Astrophysics Data System (ADS)
Kopp, Robert E.; DeConto, Robert M.; Bader, Daniel A.; Hay, Carling C.; Horton, Radley M.; Kulp, Scott; Oppenheimer, Michael; Pollard, David; Strauss, Benjamin H.
2017-12-01
Mechanisms such as ice-shelf hydrofracturing and ice-cliff collapse may rapidly increase discharge from marine-based ice sheets. Here, we link a probabilistic framework for sea-level projections to a small ensemble of Antarctic ice-sheet (AIS) simulations incorporating these physical processes to explore their influence on global-mean sea-level (GMSL) and relative sea-level (RSL). We compare the new projections to past results using expert assessment and structured expert elicitation about AIS changes. Under high greenhouse gas emissions (Representative Concentration Pathway [RCP] 8.5), median projected 21st century GMSL rise increases from 79 to 146 cm. Without protective measures, revised median RSL projections would by 2100 submerge land currently home to 153 million people, an increase of 44 million. The use of a physical model, rather than simple parameterizations assuming constant acceleration of ice loss, increases forcing sensitivity: overlap between the central 90% of simulations for 2100 for RCP 8.5 (93-243 cm) and RCP 2.6 (26-98 cm) is minimal. By 2300, the gap between median GMSL estimates for RCP 8.5 and RCP 2.6 reaches >10 m, with median RSL projections for RCP 8.5 jeopardizing land now occupied by 950 million people (versus 167 million for RCP 2.6). The minimal correlation between the contribution of AIS to GMSL by 2050 and that in 2100 and beyond implies current sea-level observations cannot exclude future extreme outcomes. The sensitivity of post-2050 projections to deeply uncertain physics highlights the need for robust decision and adaptive management frameworks.
A useful demonstration of calculus in a physics high school laboratory
NASA Astrophysics Data System (ADS)
Alvarez, Gustavo; Schulte, Jurgen; Stockton, Geoffrey; Wheeler, David
2018-01-01
The real power of calculus is revealed when it is applied to actual physical problems. In this paper, we present a calculus-inspired physics experiment suitable for high school and undergraduate programs. A model for the terminal velocity of a falling body subject to a resistive force is developed, and its validity is tested in an experiment with a magnet falling through a column of self-induced eddy currents. The presented method combines multiple physics concepts such as 1D kinematics, classical mechanics, electromagnetism and non-trivial mathematics. It offers the opportunity for lateral as well as project-based learning.
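The standard linear-drag model behind such an experiment, m dv/dt = mg − kv, has the terminal velocity v_t = mg/k as its steady state, and this can be checked numerically. The parameter values below are illustrative, not the experiment's measured ones:

```python
def fall_with_drag(m=0.05, k=0.8, g=9.81, dt=1e-4, t_end=2.0):
    """Euler-integrate m*dv/dt = m*g - k*v (linear drag, as for a magnet
    braked by eddy currents) from rest and return the final speed."""
    v = 0.0
    steps = int(t_end / dt)
    for _ in range(steps):
        v += dt * (g - (k / m) * v)   # acceleration a = g - (k/m) * v
    return v

v_terminal = 0.05 * 9.81 / 0.8        # analytic limit v_t = m*g/k
print(round(fall_with_drag(), 3), round(v_terminal, 3))
```

Because the time constant m/k is much shorter than the integration window, the simulated speed settles onto the analytic terminal velocity, which is exactly the comparison students make against the measured fall data.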
Inciting High-School interest in physics.
NASA Astrophysics Data System (ADS)
Zhang, Jiandi
2008-03-01
We report on our outreach effort, a materials-physics education program forming part of an NSF CAREER award project. The program is incorporated with the NSF-funded Physics Learning Center at FIU and focuses on materials-physics enrichment for both high school students and teachers. We pay particular attention to minority students, taking advantage of FIU's composition and location. The program offers special-session-style workshops, demonstrations, research lab tours, and summer research activities. The goal is to enrich teachers' ability to instruct their students and to inspire students to pursue scientific careers. The detailed outreach activities will be discussed.
The Earth's Magnetic Field Fuels Inter-Disciplinary Education
ERIC Educational Resources Information Center
Abdul-Razzaq, Wathiq; Biller, R. Dale; Wilson, Thomas H.
2015-01-01
There is no doubt that integrated concepts inspire students and take learning to a new level. When we fly, we fly through the Earth's magnetic field. We used the concepts involved in flying to develop an exercise that bonds geology, physics and life sciences.
Exploring the calibration of a wind forecast ensemble for energy applications
NASA Astrophysics Data System (ADS)
Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne
2015-04-01
In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSOs) in order to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersiveness and bias, calibration methods are developed for the correction of the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is depicted by the ensemble spread from the first forecast hours onward. Additionally, the ensemble members after calibration should remain physically consistent scenarios. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which became operational at DWD in 2012. The ensemble consists of 20 members driven by four different global models. The model area covers the whole of Germany and parts of Central Europe with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind turbines in wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites and quantile regression with different predictors. With these methods we already show an improvement of the ensemble wind forecasts from COSMO-DE-EPS for energy applications.
In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw ensemble to the calibrated ensemble. The calibrated wind forecasts are evaluated first with univariate probabilistic scores and additionally with diagnostics of wind ramps in order to assess the time-consistency of the calibrated ensemble members.
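As a hedged illustration of the kind of member-wise calibration step discussed above (not the EMOS or quantile-regression implementations used in EWeLiNE), one can remove an additive bias fitted on training data and inflate the spread about the ensemble mean; the inflation factor here is a hypothetical tuning constant:

```python
def fit_bias(train_fcst_means, train_obs):
    """Additive model bias: mean of (ensemble-mean forecast - observation)."""
    n = len(train_obs)
    return sum(f - o for f, o in zip(train_fcst_means, train_obs)) / n

def calibrate(members, bias, inflation):
    """Shift all members by the fitted bias and inflate the spread about the
    ensemble mean to counter underdispersiveness."""
    m = sum(members) / len(members)
    return [(m - bias) + inflation * (x - m) for x in members]
```

Note that such an affine member-wise map preserves the rank order of the members, which is exactly the property that ensemble copula coupling exploits to transfer the time-dependencies of the raw ensemble onto the calibrated one.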
NASA Astrophysics Data System (ADS)
González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.
2011-12-01
Risk is defined in many ways, but most definitions are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function(Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function(Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, with the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
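The multiplicative WMO form of the risk triangle can be made concrete with a toy annualized-loss calculation (all numbers below are illustrative assumptions, not values from the abstract):

```python
def risk(hazard, vulnerability, exposure):
    """Multiplicative risk-triangle model: Risk = Hazard x Vulnerability x Exposure.
    hazard: annual probability of the damaging event
    vulnerability: expected fraction of value lost given the event
    exposure: value of assets in harm's way (e.g., USD)"""
    return hazard * vulnerability * exposure

# Illustrative: a 1%-per-year hazard, 50% damage ratio, $2M of exposed assets
annual_expected_loss = risk(0.01, 0.5, 2_000_000)
```

The hazard input here is exactly the quantity a PTHA supplies: an annual probability that the tsunami phenomenon reaches a damaging intensity at the site.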
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of the events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA consist in (1) its flexibility, allowing different probabilistic models of earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
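Steps (1)-(4) of the method above amount to summing, over magnitude classes, the initiating-event frequency times the conditional exceedance probability of the class's bounding scenario; the structure's lifetime then converts the annual frequency into a risk over the service period. A minimal sketch (the class boundaries, frequencies, and conditional probabilities below are invented for illustration):

```python
import math

def exceedance_frequency(classes):
    """Annual frequency of exceeding a critical design parameter:
    sum over size classes of (initiating-event frequency of the class) x
    (conditional exceedance probability of its bounding scenario)."""
    return sum(freq * p_exceed for _name, freq, p_exceed in classes)

# Illustrative classes: (label, initiating events/yr, P(exceedance | bounding scenario))
classes = [
    ("M5-6", 1e-2, 0.001),
    ("M6-7", 1e-3, 0.05),
    ("M7+",  1e-4, 0.40),
]

lam = exceedance_frequency(classes)          # annual exceedance frequency
risk_40yr = 1.0 - math.exp(-lam * 40.0)      # Poisson risk over a 40-yr lifetime
```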
NASA Astrophysics Data System (ADS)
Cheng, Jie; Lee, Sang-Hoon
2015-12-01
Silks produced by spiders and silkworms are charming natural biological materials with highly optimized hierarchical structures and outstanding physicomechanical properties. The superior performance of silks relies on the integration of a unique protein sequence, a distinctive spinning process, and complex hierarchical structures. Silks have been prepared to form a variety of morphologies and are widely used in diverse applications, for example, in the textile industry, as drug delivery vehicles, and as tissue engineering scaffolds. This review presents an overview of the organization of natural silks, in which chemical and physical functions are optimized, as well as a range of new materials inspired by the desire to mimic natural silk structure and synthesis.
NASA Astrophysics Data System (ADS)
Fratzl, Peter
Biological tissues are naturally interactive and adaptive. In general, these features are due to the action of cells that provide sensing, actuation as well as tissue remodelling. There are also examples of materials synthesized by living organisms, such as plant seeds, which fulfil an active function without living cells working as mechanosensors and actuators. Thus the activity of these materials is based on physical principles alone, which provides inspiration for new concepts for artificial active materials. We will describe structural principles leading to movement in seed capsules triggered by ambient humidity and discuss the influence of internal architecture on the overall mechanical behaviour of materials, including actuation and motility. Several conceptual systems for actuating planar structures will be discussed.
Biologically inspired intelligent robots
NASA Astrophysics Data System (ADS)
Bar-Cohen, Yoseph; Breazeal, Cynthia
2003-07-01
Humans throughout history have always sought to mimic the appearance, mobility, functionality, intelligent operation, and thinking process of biological creatures. This field of biologically inspired technology, having the moniker biomimetics, has evolved from making static copies of humans and animals in the form of statues to the emergence of robots that operate with realistic behavior. Imagine a person walking towards you when suddenly you notice something weird about him--he is not real but rather a robot. Your reaction would probably be "I can't believe it, but this robot looks very real," just as you would react to an artificial flower that is a good imitation. You may even proceed to touch the robot to check if your assessment is correct but, as opposed to the flower case, the robot may be programmed to respond physically and verbally. This science fiction scenario could become a reality as the current trend continues in developing biologically inspired technologies. Technology evolution has led to such fields as artificial muscles, artificial intelligence, and artificial vision as well as biomimetic capabilities in materials science, mechanics, electronics, computing science, information technology and many others. This paper will review the state of the art and challenges of biologically-inspired technologies and the role that EAP is expected to play as the technology evolves.
Nature-Inspired Structural Materials for Flexible Electronic Devices.
Liu, Yaqing; He, Ke; Chen, Geng; Leow, Wan Ru; Chen, Xiaodong
2017-10-25
Exciting advancements have been made in the field of flexible electronic devices in the last two decades and will certainly lead to a revolution in peoples' lives in the future. However, because of the poor sustainability of the active materials in complex stress environments, new requirements have been adopted for the construction of flexible devices. Thus, hierarchical architectures in natural materials, which have developed various environment-adapted structures and materials through natural selection, can serve as guides to solve the limitations of materials and engineering techniques. This review covers the smart designs of structural materials inspired by natural materials and their utility in the construction of flexible devices. First, we summarize structural materials that accommodate mechanical deformations, which is the fundamental requirement for flexible devices to work properly in complex environments. Second, we discuss the functionalities of flexible devices induced by nature-inspired structural materials, including mechanical sensing, energy harvesting, physically interacting, and so on. Finally, we provide a perspective on newly developed structural materials and their potential applications in future flexible devices, as well as frontier strategies for biomimetic functions. These analyses and summaries are valuable for a systematic understanding of structural materials in electronic devices and will serve as inspirations for smart designs in flexible electronics.
NASA Astrophysics Data System (ADS)
Adams, M.; Smith, J. A.; Kloostra, E.; Knupp, K. R.; Taylor, K.; Anderson, S.; Baskauf, C. J.; Buckner, S.; DiMatties, J.; Fry, C. D.; Gaither, B.; Galben, C. W.; Gallagher, D. L.; Heaston, M. P.; Kraft, J.; Meisch, K.; Mills, R.; Nations, C.; Nielson, D.; Oelgoetz, J.; Rawlins, L. P.; Sudbrink, D. L.; Wright, A.
2017-12-01
For the August 2017 eclipse, NASA's Marshall Space Flight Center partnered with the U.S. Space and Rocket Center (USSRC), Austin Peay State University (APSU) in Clarksville, Tennessee, the University of Alabama in Huntsville (UAH), the Interactive NASA Space Physics Ionosphere Radio Experiments (INSPIRE) Project, and the local school systems of Montgomery County, Tennessee, and Christian County, Kentucky. Multiple site visits and workshops were carried out during the first eight months of 2017 to prepare local teachers and students for the eclipse. A special curriculum was developed to prepare USSRC Space Camp and INSPIRE students to observe and participate in science measurements during the eclipse. Representatives from the Christian County school system and APSU carried out observations for the Citizen Continental-America Telescopic Eclipse (CATE) Experiment in two separate locations. UAH and APSU, as part of the Montana State Ballooning Project, launched balloons carrying video cameras and other instruments. USSRC Space Camp students and counselors and INSPIRE students conducted science experiments that included the following: atmospheric science investigations of the atmospheric boundary layer; very-low-frequency and Ham radio observations to investigate ionospheric responses to the eclipse; animal and insect observations; solar-coronal observations; and eclipse shadow bands. We report on the results of all these investigations.
A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.
Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker's payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system's state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.
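One way to picture the epistemic "probability box" representation mentioned above: when the defender cannot pin down the attacker-payoff distribution, a set of candidate distributions induces pointwise lower and upper bounds on the CDF. A minimal sketch with Gaussian candidates (the candidate parameter sets are hypothetical, and this is only the envelope construction, not the paper's full framework):

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a Gaussian via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pbox_bounds(x, candidates):
    """Epistemic p-box: with only a *set* of candidate payoff distributions,
    P(payoff <= x) is bounded by the pointwise envelope of their CDFs."""
    cdfs = [normal_cdf(x, mu, sigma) for mu, sigma in candidates]
    return min(cdfs), max(cdfs)
```

With a single candidate the bounds collapse to an ordinary CDF, recovering the purely aleatory case; widening the candidate set widens the box.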
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino
2016-10-01
We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBPs) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe ballistic trajectories of VBPs, accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps of exceeding given thresholds of kinetic energy at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally-sized pixels or on zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10^-2 % probability of occurrence of VBP impacts with kinetic energies ≥ 10^4 J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, with half of the building stock having a ≥ 2.5 × 10^-3 % probability of roof perforation.
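A stripped-down version of this kind of stochastic ballistic hazard calculation (constant drag coefficient, flat topography, and invented parameter ranges, unlike the GBF code) integrates a drag-affected trajectory and estimates the probability that the impact kinetic energy exceeds a threshold:

```python
import math
import random

def impact_energy(v0, angle_deg, mass=5.0, cd=0.6, rho=1.2, area=0.01, dt=0.01):
    """2-D point-mass trajectory with quadratic drag (constant drag coefficient,
    flat topography); returns kinetic energy (J) on return to launch elevation."""
    th = math.radians(angle_deg)
    vx, vy = v0 * math.cos(th), v0 * math.sin(th)
    y = 0.0
    while True:
        v = math.hypot(vx, vy)
        k = 0.5 * rho * cd * area * v / mass  # drag acceleration per unit velocity
        vx += -k * vx * dt
        vy += (-9.81 - k * vy) * dt
        y += vy * dt
        if y <= 0.0 and vy < 0.0:
            return 0.5 * mass * (vx * vx + vy * vy)

def prob_exceed(threshold_j, n=1000, seed=1):
    """Monte Carlo probability that impact kinetic energy exceeds threshold_j,
    sampling launch speed and angle from illustrative uniform ranges."""
    rng = random.Random(seed)
    hits = sum(
        impact_energy(rng.uniform(50.0, 150.0), rng.uniform(30.0, 80.0)) >= threshold_j
        for _ in range(n)
    )
    return hits / n
```

Evaluating such probabilities per pixel or per zone of interest is what turns the trajectory model into hazard maps and hazard curves.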
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua
2014-11-01
Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models, and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as to RELAP5 [4].
The overall methodology aims to: • Address multiple aging mechanisms involving a large number of components in a computationally feasible manner, where the sequencing of events is conditioned on the physical conditions predicted in a simulation environment such as RELAP-7. • Identify the risk-significant passive components, their failure modes and anticipated rates of degradation. • Incorporate surveillance and maintenance activities and their effects into the plant state and into component aging progress. • Assess aging effects in a dynamic simulation environment. 1. C. L. SMITH, V. N. SHAH, T. KAO, G. APOSTOLAKIS, "Incorporating Ageing Effects into Probabilistic Risk Assessment - A Feasibility Study Utilizing Reliability Physics Models," NUREG/CR-5632, USNRC, (2001). 2. T. ALDEMIR, "A Survey of Dynamic Methodologies for Probabilistic Safety Assessment of Nuclear Power Plants," Annals of Nuclear Energy, 52, 113-124, (2013). 3. C. RABITI, A. ALFONSI, J. COGLIATI, D. MANDELLI and R. KINOSHITA, "Reactor Analysis and Virtual Control Environment (RAVEN) FY12 Report," INL/EXT-12-27351, (2012). 4. D. ANDERS et al., "RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7," INL/EXT-12-25924, (2012).
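The interaction between aging and surveillance described above can be illustrated with a toy Monte Carlo for a single passive component. Here a Weibull failure-time model with shape > 1 stands in for a physics-based degradation model, and every parameter value is invented for illustration; this is not the RAVEN/RELAP-7 methodology itself:

```python
import random

def failure_prob(life=60.0, inspect_every=10.0, p_detect=0.9,
                 shape=2.0, scale=40.0, n=4000, seed=7):
    """Toy dynamic-PRA sketch for one aging passive component: failure time
    ~ Weibull(shape > 1 implies aging); each inspection detects degradation
    with probability p_detect and renews the component as-good-as-new."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        t = 0.0
        ttf = rng.weibullvariate(scale, shape)  # time-to-failure since last renewal
        while t + inspect_every < life:
            if ttf <= inspect_every:        # fails before the next inspection
                failures += 1
                break
            t += inspect_every
            if rng.random() < p_detect:     # degradation found -> renew
                ttf = rng.weibullvariate(scale, shape)
            else:
                ttf -= inspect_every
        else:
            if ttf <= life - t:             # final partial interval
                failures += 1
    return failures / n
```

Comparing the result with and without inspections (p_detect = 0) shows how surveillance activities feed back into the component's failure probability over the plant's residual lifetime.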
Nanotechnology: The Incredible Invisible World
ERIC Educational Resources Information Center
Roberts, Amanda S.
2011-01-01
The concept of nanotechnology was first introduced in 1959 by Richard Feynman at a meeting of the American Physical Society. Nanotechnology opens the door to an exciting new science/technology/engineering field. The possibilities for the uses of this technology should inspire the imagination to think big. Many are already pursuing such feats…
Network science of biological systems at different scales: A review
NASA Astrophysics Data System (ADS)
Gosak, Marko; Markovič, Rene; Dolenšek, Jurij; Slak Rupnik, Marjan; Marhl, Marko; Stožer, Andraž; Perc, Matjaž
2018-03-01
Network science is today established as a backbone for description of structure and function of various physical, chemical, biological, technological, and social systems. Here we review recent advances in the study of complex biological systems that were inspired and enabled by methods of network science. First, we present
Scaling in Theropod Dinosaurs: Femoral Bone Dimensions
ERIC Educational Resources Information Center
Lee, Scott A.
2014-01-01
Finding topics that inspire students is an important aspect of any physics course. Virtually everyone is fascinated by "Tyrannosaurus rex," and the excitement of the class is palpable when we explore scaling effects in "T. rex" and other bipedal theropod dinosaurs as part of our discussion of mechanics and elasticity. In this…
Edward Purcell and Nuclear Magnetic Resonance (NMR)
"...development of new methods for nuclear magnetic precision measurements and discoveries in connection therewith." This work educated and inspired a generation of physicists, who refer to it often, and depend on it utterly. 1. Purcell: A Precise Determination of the Proton Magnetic Moment in Bohr Magnetons; Physical Review, Vol. 76
You Can Be a Woman Paleontologist.
ERIC Educational Resources Information Center
Gabriel, Diane L.; Cohen, Judith Love
This booklet stresses the value of various academic studies (e.g., history, geology, anatomy, physical anthropology) as prerequisites for a career in paleontology, by depicting real women whose careers provide inspirational role models. The first section is a text designed for use by elementary students and presents the career of paleontology from…
Modern Gravitational Lens Cosmology for Introductory Physics and Astronomy Students
ERIC Educational Resources Information Center
Huwe, Paul; Field, Scott
2015-01-01
Recent and exciting discoveries in astronomy and cosmology have inspired many high school students to learn about these fields. A particularly fascinating consequence of general relativity at the forefront of modern cosmology research is gravitational lensing, the bending of light rays that pass near massive objects. Gravitational lensing enables…
Lyle Olsen, Coach and Teacher: The Mantle Fits.
ERIC Educational Resources Information Center
Arbolino, Jack
1979-01-01
Lyle Olsen, a scholar-athlete and a prize rookie in the Dodger chain while he was in college, is now a professor-coach. He believes that the guideline "winning is the only thing" is a treacherous one. Olsen inspires his students to make their lives richer through physical education. (Author/MLW)
Simple Experiments for Teaching Air Pressure
ERIC Educational Resources Information Center
Shamsipour, Gholamreza
2006-01-01
Everyone who teaches physics knows very well that sometimes a simple device or experiment can help to make a concept clear. In this paper, inspired by "The Jumping Pencil" by Martin Gardner, I will discuss a simple demonstration device that can be used to start the study of air pressure.
A statistical physics perspective on criticality in financial markets
NASA Astrophysics Data System (ADS)
Bury, Thomas
2013-11-01
Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.
State-Transition Structures in Physics and in Computation
NASA Astrophysics Data System (ADS)
Petri, C. A.
1982-12-01
In order to establish close connections between physical and computational processes, it is assumed that the concepts of “state” and of “transition” are acceptable both to physicists and to computer scientists, at least in an informal way. The aim of this paper is to propose formal definitions of state and transition elements on the basis of very low level physical concepts in such a way that (1) all physically possible computations can be described as embedded in physical processes; (2) the computational aspects of physical processes can be described on a well-defined level of abstraction; (3) the gulf between the continuous models of physics and the discrete models of computer science can be bridged by simple mathematical constructs which may be given a physical interpretation; (4) a combinatorial, nonstatistical definition of “information” can be given on low levels of abstraction which may serve as a basis to derive higher-level concepts of information, e.g., by a statistical or probabilistic approach. Conceivable practical consequences are discussed.
Synaptic and nonsynaptic plasticity approximating probabilistic inference
Tully, Philip J.; Hennig, Matthias H.; Lansner, Anders
2014-01-01
Learning and memory operations in neural circuits are believed to involve molecular cascades of synaptic and nonsynaptic changes that lead to a diverse repertoire of dynamical phenomena at higher levels of processing. Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability all conspire to form and maintain memories. But it is still unclear how these seemingly redundant mechanisms could jointly orchestrate learning in a more unified system. To this end, a Hebbian learning rule for spiking neurons inspired by Bayesian statistics is proposed. In this model, synaptic weights and intrinsic currents are adapted on-line upon arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate probabilities associated with relative neuronal activation levels. Trace dynamics enable synaptic learning to readily demonstrate a spike-timing dependence, stably return to a set-point over long time scales, and remain competitive despite this stability. Beyond unsupervised learning, linking the traces with an external plasticity-modulating signal enables spike-based reinforcement learning. At the postsynaptic neuron, the traces are represented by an activity-dependent ion channel that is shown to regulate the input received by a postsynaptic cell and generate intrinsic graded persistent firing levels. We show how spike-based Hebbian-Bayesian learning can be performed in a simulated inference task using integrate-and-fire (IAF) neurons that are Poisson-firing and background-driven, similar to the preferred regime of cortical neurons. Our results support the view that neurons can represent information in the form of probability distributions, and that probabilistic inference could be a functional by-product of coupled synaptic and nonsynaptic mechanisms operating over several timescales. 
The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose functional effects are only partially understood in concert. PMID:24782758
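The trace-based probability estimation described above can be caricatured in a few lines. This is a BCPNN-flavored sketch with binary spike trains; the time constant and regularizer are invented, and the on-line traces are simple exponential moving averages rather than the paper's cascaded memory traces:

```python
import math

def hebbian_bayesian_weight(pre, post, tau=20.0, eps=1e-4):
    """Exponentially decaying traces estimate firing probabilities p_i, p_j and
    the coincidence probability p_ij from spike trains; the Bayesian weight is
    w = log(p_ij / (p_i * p_j)), positive for correlated units."""
    a = math.exp(-1.0 / tau)            # per-step decay of the running averages
    p_i = p_j = p_ij = eps              # small priors avoid log(0)
    for si, sj in zip(pre, post):
        p_i = a * p_i + (1.0 - a) * si
        p_j = a * p_j + (1.0 - a) * sj
        p_ij = a * p_ij + (1.0 - a) * (si * sj)
    return math.log((p_ij + eps) / ((p_i + eps) * (p_j + eps)))
```

Correlated pre/post activity drives the weight positive (log-odds of co-activation), while anti-correlated activity drives it strongly negative, which is the spike-timing-free core of the Hebbian-Bayesian idea.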
2012-01-01
Background Despite computational challenges, elucidating conformations that a protein system assumes under physiologic conditions for the purpose of biological activity is a central problem in computational structural biology. While these conformations are associated with low energies in the energy surface that underlies the protein conformational space, few existing conformational search algorithms focus on explicitly sampling low-energy local minima in the protein energy surface. Methods This work proposes a novel probabilistic search framework, PLOW, that explicitly samples low-energy local minima in the protein energy surface. The framework combines algorithmic ingredients from evolutionary computation and computational structural biology to effectively explore the subspace of local minima. A greedy local search maps a conformation sampled in conformational space to a nearby local minimum. A perturbation move jumps out of a local minimum to obtain a new starting conformation for the greedy local search. The process repeats in an iterative fashion, resulting in a trajectory-based exploration of the subspace of local minima. Results and conclusions The analysis of PLOW's performance shows that, by navigating only the subspace of local minima, PLOW is able to sample conformations near a protein's native structure, either more effectively or as well as state-of-the-art methods that focus on reproducing the native structure for a protein system. Analysis of the actual subspace of local minima shows that PLOW samples this subspace more effectively than a naive sampling approach. Additional theoretical analysis reveals that the perturbation function employed by PLOW is key to its ability to sample a diverse set of low-energy conformations. This analysis also suggests directions for further research and novel applications for the proposed framework. PMID:22759582
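PLOW's loop of greedy minimization plus perturbation is structurally similar to basin hopping. A minimal sketch on a toy 1-D "energy surface" follows; the energy function, gradient-descent minimizer, and jump width are invented stand-ins for PLOW's conformational energy and move operators:

```python
import math
import random

def plow_like_search(energy, minimize_local, perturb, x0, iters=100, seed=0):
    """Trajectory-based exploration of local minima: greedy descent to a
    minimum, then a perturbation jump to seed the next descent."""
    rng = random.Random(seed)
    x = minimize_local(x0)
    best_x, best_e = x, energy(x)
    for _ in range(iters):
        x = minimize_local(perturb(x, rng))
        e = energy(x)
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

def energy(x):
    """Toy rugged surface: quadratic envelope with many sinusoidal minima."""
    return 0.1 * x * x + math.sin(3.0 * x)

def minimize_local(x, lr=0.01, steps=500):
    """Greedy gradient descent to the nearest local minimum."""
    for _ in range(steps):
        x -= lr * (0.2 * x + 3.0 * math.cos(3.0 * x))
    return x

def perturb(x, rng, width=2.0):
    """Jump out of the current basin to seed the next greedy search."""
    return x + rng.uniform(-width, width)
```

Because every sampled point is first mapped to a local minimum, the search only ever visits the subspace of minima, which is the key design choice the abstract describes.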
Counterfactual distribution of Schrödinger cat states
NASA Astrophysics Data System (ADS)
Shenoy-Hejamadi, Akshata; Srikanth, R.
2015-12-01
In the counterfactual cryptography scheme proposed by Noh, the sender Alice probabilistically transmits classical information to the receiver Bob without the physical travel of a particle. Here we generalize this idea to the distribution of quantum entanglement. The key insight is to replace their classical input choices with quantum superpositions. We further show that the scheme can be generalized to counterfactually distribute multipartite cat states.
NASA Astrophysics Data System (ADS)
Furbish, D. J.; Roering, J. J.
2013-12-01
Recent discussions of local versus nonlocal sediment transport on hillslopes offer a lens for considering uncertainty in formulations of transport rates that are aimed at characterizing patchy, intermittent sediment motions in steeplands. Here we describe a general formulation for transport that is based on a convolution integral of the factors controlling the entrainment and disentrainment of sediment particles on a hillslope. In essence, such a formulation represents a 'flux' version of the Master equation, a general probabilistic (kinematic) formulation of mass conservation. As such, with the relevant physics invoked to represent entrainment and disentrainment, a nonlocal formulation quite happily accommodates local transport (and looks/behaves like a local formulation), as well as nonlocal transport, depending on the characteristic length scale of particle motions relative to the length scale at which the factors controlling particle transport are defined or measured. Nonetheless, nonlocal formulations of the sediment flux have mostly (but not entirely) outpaced the experimental and field-based observations needed to inform the theory. At risk is bringing to bear a sophisticated mathematics that is not supported by our uncertain understanding of the processes involved. Experimental and field-based measurements of entrainment rates and particle travel distances are difficult to obtain, notably given the intermittency of many hillslope transport processes and the slow rates of change in hillslope morphology. A 'test' of a specific nonlocal formulation applied to hillslope evolution must therefore in part rest on consistency between measured hillslope configurations and predicted (i.e., modeled) hillslope configurations predicated on the proposed nonlocal formulation, assuming sufficient knowledge of initial and boundary conditions.
On the other hand, because of its probabilistic basis, the formulation is in principle well suited to the task of describing transport relevant to geomorphic timescales -- in view of the stochastic nature of the transport processes occurring over these timescales and the uncertainty of our understanding of the physics involved. Moreover, in its basic form, the nonlocal formulation of the sediment flux is such that appropriate physics can be readily embedded within it as we learn more. And, the formulation is space-time averaged in a way that accommodates discontinuous (patchy, intermittent) sediment motions.
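The convolution-based flux described above can be written schematically as follows (the notation is ours, in the style of nonlocal hillslope transport formulations; the symbols E and R are illustrative assumptions, not taken from the abstract itself):

```latex
% Nonlocal sediment flux as a convolution:
%   q(x)  - sediment flux at position x
%   E(x') - particle entrainment rate at position x'
%   R(r)  - probability that an entrained particle travels a distance >= r
q(x) = \int_{0}^{\infty} E(x - r)\, R(r)\, \mathrm{d}r
```

When R decays over distances short compared with the scale over which E varies, the integral collapses to a local, slope-dependent flux; a heavy-tailed R makes the flux genuinely nonlocal.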
Vieira, Natália Donzeli; Testa, Daniela; Ruas, Paula Cristine; Salvini, Tânia de Fátima; Catai, Aparecida Maria; Melo, Ruth Caldeira
2017-04-01
Recent scientific evidence supports the benefits of Pilates exercises on postural balance and muscle strength of older persons. However, their effects on other aspects of physical fitness, which are also important for independent living in older age, are still unknown. To investigate the effects of a 12-week Pilates-inspired exercise program on the functional performance of community-dwelling older women. Forty community-dwelling older women were randomly enrolled in a Pilates-inspired exercise training (2 times/week, 60 min/session) (PG, n = 21, 66.0 ± 1.4 yrs) or kept in the control group (CG; n = 19, 63.3 ± 0.9 yrs). The Pilates exercises were conducted in small groups and performed on mats (using accessories such as exercise rubber bands, Swiss balls and exercise balls). The functional performance on one-leg stance (OLS), timed up and go (TUG), five-times-sit-to-stand (STS) and 6-min walk (6 MW) tests was evaluated before and after the 12-week Pilates training or control follow-up period. After 12 weeks, time effects were observed for the STS (p = 0.03) and 6 MW tests (p < 0.01). Only among PG subjects did the time spent to rise from a chair and return to a seated position decrease significantly (2.0 s faster, p = 0.02) and the distance walked in 6 min increase (∼30 m, p < 0.01). OLS and TUG performance remained unaltered in both groups. Pilates-inspired exercises improved dynamic balance, lower-extremity strength and aerobic resistance in community-dwelling older women. Therefore, they may be an effective exercise regimen to maintain physical fitness in old age. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pulinets, S. A.; Andrzej, K.; Hernandez-Pajares, M.; Cherniak, I.; Zakharenkova, I.; Rothkaehl, H.; Davidenko, D.
2017-12-01
The INSPIRE project is dedicated to the study of physical processes, and their effects in the ionosphere, that could serve as earthquake precursors, together with a detailed description of the methodology for defining ionospheric pre-seismic anomalies. It was initiated by ESA and carried out by an international consortium. The physical mechanisms generating ionospheric pre-seismic anomalies, from the ground up to ionospheric altitudes, were formulated within the framework of the Lithosphere-Atmosphere-Ionosphere-Magnetosphere Coupling (LAIMC) model (Pulinets et al., 2015). A general algorithm for the identification of ionospheric precursors was formalized, which also takes into account the external space weather factors able to generate false alarms. The importance of a special stable pattern called the "precursor mask", based on the self-similarity of pre-seismic ionospheric variations, was highlighted. The role of expert decision-making in interpreting pre-seismic anomalies for the generation of seismic warnings is important as well. The performance of the LAIMC seismo-ionospheric effect detection algorithm was demonstrated using the L'Aquila 2009 earthquake as a case study. The results of the INSPIRE project demonstrated that ionospheric anomalies registered before strong earthquakes could be used as reliable precursors. A detailed classification of pre-seismic anomalies in different regions of the ionosphere was presented, and the signatures of pre-seismic anomalies detected by ground- and satellite-based instruments were described, clarifying the methodology for identifying precursors from multi-instrument ionospheric measurements. A configuration for a dedicated multi-observation experiment and satellite payload was proposed for future implementation of the INSPIRE project results.
In this regard, the multi-instrument set can be divided into two groups: space equipment and ground-based support, which could be used for real-time monitoring. Together with the scientific and technical tasks, a set of political, logistical, and administrative problems (including certification of the approaches by the seismological community and legal procedures by governmental authorities) must be resolved before real earthquake forecasting can be put into effect.
Probabilistic Space Weather Forecasting: a Bayesian Perspective
NASA Astrophysics Data System (ADS)
Camporeale, E.; Chandorkar, M.; Borovsky, J.; Care', A.
2017-12-01
Most space weather forecasts, at both the operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven machine learning algorithms to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to the finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of the Dst geomagnetic index forecast, solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, with very modest computational cost.
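A minimal sketch of how Gaussian Process regression yields a predictive distribution rather than a point forecast (a toy sine-wave "index" history and a plain NumPy implementation; the kernel choice, hyperparameters, and data are illustrative assumptions, not the authors' setup):

```python
import numpy as np

def rbf_kernel(a, b, length=1.0, variance=1.0):
    """Squared-exponential covariance between 1-D sample locations a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2, length=1.0):
    """Return the GP predictive mean and standard deviation at x_test."""
    K = rbf_kernel(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test, length)
    Kss = rbf_kernel(x_test, x_test, length)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))  # K^-1 y
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy "index" history: predict, with uncertainty, at two unseen times.
t = np.linspace(0, 10, 25)
y = np.sin(t)
t_new = np.array([2.5, 9.7])
mu, sigma = gp_predict(t, y, t_new, noise=1e-4)
```

The predictive standard deviation `sigma` is exactly the confidence information that a deterministic forecast lacks; it shrinks near training samples and grows away from them.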
NASA Astrophysics Data System (ADS)
Strauch, R. L.; Istanbulluoglu, E.
2017-12-01
We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping the probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations, we relate the susceptibility index to an empirically derived probability of landslide impact. This probability is combined with results from a physically based model to produce an integrated probabilistic map. Slope was key in landslide initiation, while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to a higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA, and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
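The frequency-ratio bookkeeping described above is simple to sketch (a hypothetical one-attribute toy grid below; the real model sums FRs over seven site attributes on a grid-cell basis):

```python
import numpy as np

def frequency_ratio(attr, landslide_mask):
    """FR per class: the share of landslide cells falling in the class,
    divided by the class's share of all cells. FR > 1 marks classes where
    landslides are over-represented."""
    classes = np.unique(attr)
    fr = {}
    n_cells = attr.size
    n_slides = landslide_mask.sum()
    for c in classes:
        in_class = attr == c
        pct_class = in_class.sum() / n_cells
        pct_slides = (in_class & landslide_mask).sum() / n_slides
        fr[c] = pct_slides / pct_class
    return fr

# Hypothetical 1-D grid of slope classes (0 = gentle, 1 = moderate, 2 = steep)
slope_class = np.array([0, 0, 0, 0, 1, 1, 1, 2, 2, 2])
slides = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 0], dtype=bool)
fr = frequency_ratio(slope_class, slides)
# With a single attribute the susceptibility index is just the cell's FR;
# with several attributes the per-cell FRs are summed.
susceptibility = np.array([fr[c] for c in slope_class])
```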
Robust Control Design for Uncertain Nonlinear Dynamic Systems
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis G.; Andrews, Lindsey; Giesy, Daniel P.
2012-01-01
Robustness to parametric uncertainty is fundamental to successful control system design and as such it has been at the core of many design methods developed over the decades. Despite its prominence, most of the work on robust control design has focused on linear models and uncertainties that are non-probabilistic in nature. Recently, researchers have acknowledged this disparity and have been developing theory to address a broader class of uncertainties. This paper presents an experimental application of robust control design for a hybrid class of probabilistic and non-probabilistic parametric uncertainties. The experimental apparatus is based upon the classic inverted pendulum on a cart. The physical uncertainty is realized by a known additional lumped mass at an unknown location on the pendulum. This unknown location has the effect of substantially altering the nominal frequency and controllability of the nonlinear system, and in the limit has the capability to make the system neutrally stable and uncontrollable. Another uncertainty to be considered is a direct current motor parameter. The control design objective is to design a controller that satisfies stability, tracking error, control power, and transient behavior requirements for the largest range of parametric uncertainties. This paper presents an overview of the theory behind the robust control design methodology and the experimental results.
230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paces, James B.
2014-01-01
This product represents a USGS Administrative Report that discusses samples and methods used to conduct uranium-series isotope analyses and resulting ages and initial 234U/238U activity ratios of pedogenic cements developed in several different surfaces in the Hanford area during the middle to late Pleistocene. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes description of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.
Detecting Cyber Attacks On Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Rrushi, Julian; Campbell, Roy
This paper proposes an unconventional anomaly detection approach that provides digital instrumentation and control (I&C) systems in a nuclear power plant (NPP) with the capability to probabilistically discern between legitimate protocol frames and attack frames. The stochastic activity network (SAN) formalism is used to model the fusion of protocol activity in each digital I&C system and the operation of physical components of an NPP. SAN models are employed to analyze links between protocol frames as streams of bytes, their semantics in terms of NPP operations, control data as stored in the memory of I&C systems, the operations of I&C systems on NPP components, and NPP processes. Reward rates and impulse rewards are defined in the SAN models based on the activity-marking reward structure to estimate NPP operation profiles. These profiles are then used to probabilistically estimate the legitimacy of the semantics and payloads of protocol frames received by I&C systems.
A Hough Transform Global Probabilistic Approach to Multiple-Subject Diffusion MRI Tractography
Aganj, Iman; Lenglet, Christophe; Jahanshad, Neda; Yacoub, Essa; Harel, Noam; Thompson, Paul M.; Sapiro, Guillermo
2011-01-01
A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work. The proposed framework tests candidate 3D curves in the volume, assigning to each one a score computed from the diffusion images, and then selects the curves with the highest scores as the potential anatomical connections. The algorithm avoids local minima by performing an exhaustive search at the desired resolution. The technique is easily extended to multiple subjects, considering a single representative volume where the registered high-angular resolution diffusion images (HARDI) from all the subjects are non-linearly combined, thereby obtaining population-representative tracts. The tractography algorithm is run only once for the multiple subjects, and no tract alignment is necessary. We present experimental results on HARDI volumes, ranging from simulated and 1.5T physical phantoms to 7T and 4T human brain and 7T monkey brain datasets. PMID:21376655
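The exhaustive voting idea can be illustrated in miniature (a toy 2-D "evidence" image and pixel paths standing in for HARDI volumes and 3D curves; the scoring function here is just summed intensity, an assumption for illustration):

```python
import numpy as np

def hough_vote(evidence, candidates):
    """Score each candidate path by the total evidence along it and return
    the best-scoring one. The exhaustive search over candidates is what lets
    the Hough-style approach avoid local minima."""
    scores = [evidence[tuple(np.asarray(c).T)].sum() for c in candidates]
    best = int(np.argmax(scores))
    return candidates[best], scores

# Toy 2-D "image" with a bright diagonal; two candidate pixel paths.
img = np.zeros((4, 4))
for i in range(4):
    img[i, i] = 1.0
diagonal = [(i, i) for i in range(4)]
row = [(0, j) for j in range(4)]
best_path, scores = hough_vote(img, [diagonal, row])
```

In the paper's setting the candidates are 3D curves sampled at the desired resolution and the evidence comes from the diffusion data, but the select-the-highest-vote structure is the same.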
Verhoef, Talitha I; Trend, Verena; Kelly, Barry; Robinson, Nigel; Fox, Paul; Morris, Stephen
2016-07-22
We evaluated the cost-effectiveness of the Give it a Go programme, which offers free leisure centre memberships to physically inactive members of the public receiving state benefits in a single London borough. A decision-analytic Markov model was developed to analyse the lifetime costs and quality-adjusted life-years (QALYs) of 1025 people recruited to the intervention versus no intervention. In the intervention group, people were offered 4 months of free membership at a leisure centre. Physical activity levels were assessed at 0 and 4 months using the International Physical Activity Questionnaire (IPAQ). Higher levels of physical activity were assumed to decrease the risk of coronary heart disease, stroke and type II diabetes mellitus, as well as improve mental health. Costs were assessed from a National Health Service (NHS) perspective. Uncertainty was assessed using one-way and probabilistic sensitivity analyses. One hundred fifty-nine participants (15.5%) completed the programme by attending the leisure centre for 4 months. Compared with no intervention, Give it a Go increased costs by £67.25 and QALYs by 0.0033 (equivalent to 1.21 days in full health) per recruited person. The incremental cost per QALY gained was £20,347. The results were highly sensitive to the magnitude of the mental health gain due to physical activity and the duration of the effect of the programme (1 year in the base-case analysis). When the mental health gain was omitted from the analysis, the incremental cost per QALY gained increased to almost £1.5 million. In the probabilistic sensitivity analysis, the incremental cost per QALY gained was below £20,000 in 39% of the 5000 simulations. Give it a Go did not significantly increase life expectancy, but had a positive influence on quality of life due to the mental health gain of physical activity.
If the increase in physical activity caused by Give it a Go lasts for more than 1 year, the programme would be cost-effective given a willingness to pay for a QALY of £20,000.
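The headline cost-effectiveness figure follows from simple arithmetic on the reported increments (the inputs below are as rounded in the abstract, so the quotient differs slightly from the reported £20,347):

```python
# Incremental cost-effectiveness ratio (ICER) = extra cost / extra QALYs.
delta_cost = 67.25    # pounds per recruited person (from the abstract)
delta_qaly = 0.0033   # QALYs per recruited person (rounded in the abstract)

icer = delta_cost / delta_qaly
# ~20,379 pounds per QALY with these rounded inputs; the paper reports
# 20,347, the gap being rounding of the QALY gain.

days_full_health = delta_qaly * 365.25  # ~1.21 days, matching the abstract
```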
Non-physics peer demonstrators in undergraduate laboratories: a study of students’ perceptions
NASA Astrophysics Data System (ADS)
Braun, Michael; Kirkup, Les
2016-01-01
Laboratory demonstrators play a crucial role in facilitating students’ learning in physics subjects. Inspired by the success of peer-led activities, we introduced peer demonstrators to support student learning in first-year physics subjects that enrol students not intending to major in physics. Surveys were administered to 1700 students over 4 years in four subjects to examine student perceptions of how demonstrators assisted them in the laboratory. Scores awarded to peer demonstrators by students were no lower than those awarded to demonstrators traditionally employed in the first year physics laboratory. These latter demonstrators were drawn mainly from the ranks of physics research students. The findings validate the recruitment of peer demonstrators and will be used to inform the recruitment and support programmes for laboratory demonstrators.
NASA Astrophysics Data System (ADS)
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
A Physics Show Performed by Students for Kids: From Mechanics to Elementary Particle Physics
NASA Astrophysics Data System (ADS)
Dreiner, Herbi K.
2008-09-01
Physics students spend the early part of their training attending physics and mathematics lectures, solving problem sets, and experimenting in laboratory courses. The program is typically intensive and fairly rigid. They have little opportunity to follow their own curiosity or apply their knowledge. There have been many attempts to address this deficiency, specifically through outreach activities [1-23]. For example, since 1984 Clint Sprott (University of Wisconsin) has hosted a physics show entitled "The Wonders of Physics!" Dressed up as a circus director and assisted by students, Professor Sprott presents entertaining and educational experiments to a regularly packed auditorium of all age groups [5]. This was in turn inspired by the "Chemistry is Fun" presentations of Bassam Shakhashiri (University of Wisconsin), where students are also involved [6].
Zebrafish response to a robotic replica in three dimensions
Ruberto, Tommaso; Mwaffo, Violet; Singh, Sukhgewanpreet; Neri, Daniele
2016-01-01
As zebrafish emerge as a species of choice for the investigation of biological processes, a number of experimental protocols are being developed to study their social behaviour. While live stimuli may elicit varying response in focal subjects owing to idiosyncrasies, tiredness and circadian rhythms, video stimuli suffer from the absence of physical input and rely only on two-dimensional projections. Robotics has been recently proposed as an alternative approach to generate physical, customizable, effective and consistent stimuli for behavioural phenotyping. Here, we contribute to this field of investigation through a novel four-degree-of-freedom robotics-based platform to manoeuvre a biologically inspired three-dimensionally printed replica. The platform enables three-dimensional motions as well as body oscillations to mimic zebrafish locomotion. In a series of experiments, we demonstrate the differential role of the visual stimuli associated with the biologically inspired replica and its three-dimensional motion. Three-dimensional tracking and information-theoretic tools are complemented to quantify the interaction between zebrafish and the robotic stimulus. Live subjects displayed a robust attraction towards the moving replica, and such attraction was lost when controlling for its visual appearance or motion. This effort is expected to aid zebrafish behavioural phenotyping, by offering a novel approach to generate physical stimuli moving in three dimensions. PMID:27853566
Admissible perturbations and false instabilities in PT -symmetric quantum systems
NASA Astrophysics Data System (ADS)
Znojil, Miloslav
2018-03-01
One of the most characteristic mathematical features of PT-symmetric quantum mechanics is the explicit Hamiltonian dependence of its physical Hilbert space of states, H = H(H). Some of the most important physical consequences are discussed, with emphasis on the dynamical regime in which the system is close to a phase transition. A consistent perturbation treatment of such a regime is proposed. An illustrative application of the innovated perturbation theory to a non-Hermitian but PT-symmetric, user-friendly family of J-parametric "discrete anharmonic" quantum Hamiltonians H = H(λ⃗) is provided. The models are shown to admit the standard probabilistic interpretation if and only if the parameters remain compatible with the reality of the spectrum, λ⃗ ∈ D(physical). In contradiction to conventional wisdom, the systems are then shown to be stable with respect to admissible perturbations inside the domain D(physical), even in the immediate vicinity of the phase-transition boundaries ∂D(physical).
Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering
NASA Astrophysics Data System (ADS)
Hynes-Griffin, M. E.; Buege, L. L.
1983-09-01
Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.
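The common thread of these proceedings, propagating parameter uncertainty through a stability calculation to a probability of failure, can be sketched with an infinite-slope factor of safety and Monte Carlo sampling (all parameter values and distributions below are hypothetical, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)

def factor_of_safety(c, phi_deg, gamma=20.0, depth=5.0, beta_deg=35.0):
    """Infinite-slope, dry-slope factor of safety:
    FS = (c + gamma*z*cos^2(beta)*tan(phi)) / (gamma*z*sin(beta)*cos(beta)),
    with cohesion c (kPa), friction angle phi, unit weight gamma (kN/m^3),
    failure depth z (m), and slope angle beta."""
    beta = np.radians(beta_deg)
    phi = np.radians(phi_deg)
    resisting = c + gamma * depth * np.cos(beta) ** 2 * np.tan(phi)
    driving = gamma * depth * np.sin(beta) * np.cos(beta)
    return resisting / driving

# Treat cohesion and friction angle as random variables (hypothetical stats)
# and estimate the probability that FS drops below 1.
n = 100_000
c = rng.normal(15.0, 4.0, n)     # kPa
phi = rng.normal(30.0, 3.0, n)   # degrees
fs = factor_of_safety(c, phi)
p_failure = np.mean(fs < 1.0)
```

The deterministic answer (mean FS above 1) and the probabilistic answer (a non-negligible failure probability) can disagree, which is the motivation for the methods collected in the seminar.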
A Roadmap to Fundamental Physics from LISA EMRI Observations
NASA Astrophysics Data System (ADS)
Sopuerta, Carlos F.
2010-09-01
The Laser Interferometer Space Antenna is a future space-based gravitational-wave observatory (a joint mission between the European Space Agency and the US National Aeronautics and Space Administration) that is expected to be launched during the next decade. It will operate in the low-frequency gravitational-wave band, probably the richest part of the gravitational-wave spectrum in terms of science potential, where we find: massive black hole mergers as the outcome of galaxy collisions; many galactic compact binaries; the capture and subsequent inspiral of a stellar compact object into a massive black hole; and gravitational-wave signatures from early-universe physical processes connected to high-energy physics and physics not yet fully understood. In this article we focus on the third type of source, the so-called extreme-mass-ratio inspirals, a high-precision tool for gravitational-wave astronomy that can be used, among other things, to advance our understanding of fundamental physics questions such as the nature and structure of black holes and the details of the gravitational interaction in regimes not yet probed by other experiments or observatories. Here we give an account of some of the progress made in the development of tools to exploit the future LISA EMRI observations, discuss what scientific questions we can try to answer from this information and, finally, discuss the main theoretical challenges that we face in order to develop all the necessary tools to maximize the scientific outcome, along with some avenues that can be followed to make progress in the near future.
A Survey of Memristive Threshold Logic Circuits.
Maan, Akshay Kumar; Jayadevi, Deepthi Anirudhan; James, Alex Pappachen
2017-08-01
In this paper, we review different memristive threshold logic (MTL) circuits that are inspired by the synaptic action of the flow of neurotransmitters in the biological brain. The brain-like generalization ability and the area minimization of these threshold logic circuits aim toward crossing Moore's law boundaries at the device, circuit, and system levels. Fast switching memory, signal processing, control systems, programmable logic, image processing, reconfigurable computing, and pattern recognition are identified as some of the potential applications of MTL systems. The physical realization of nanoscale devices with memristive behavior from materials such as TiO2, ferroelectrics, silicon, and polymers has accelerated research effort in these application areas, inspiring the scientific community to pursue the design of high-speed, low-cost, low-power, and high-density neuromorphic architectures.
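The underlying threshold-logic primitive is easy to sketch in software (the weights and thresholds below are illustrative; this is a Boolean abstraction of the gate, not a memristor device model): a gate fires when the weighted sum of its inputs reaches a threshold, so a single primitive realizes AND, OR, or majority just by reweighting.

```python
def threshold_gate(inputs, weights, threshold):
    """Fire (output 1) when the weighted input sum reaches the threshold."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= threshold else 0

# One primitive, three Boolean functions, changed only by weights/threshold.
AND = lambda a, b: threshold_gate([a, b], [1, 1], 2)
OR  = lambda a, b: threshold_gate([a, b], [1, 1], 1)
MAJ = lambda a, b, c: threshold_gate([a, b, c], [1, 1, 1], 2)
```

This reconfigurability under a fixed topology is what the area-minimization claim in the abstract rests on.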
Scalar perturbations of Eddington-inspired Born-Infeld braneworld
NASA Astrophysics Data System (ADS)
Yang, Ke; Liu, Yu-Xiao; Guo, Bin; Du, Xiao-Long
2017-09-01
We consider the scalar perturbations of Eddington-inspired Born-Infeld braneworld models in this paper. The dynamical equation for the physical propagating degree of freedom ξ(x^μ, y) is obtained by using the Arnowitt-Deser-Misner decomposition method: F_1(y) ∂_y^2 ξ + F_2(y) ∂_y ξ + ∂^μ∂_μ ξ = 0. We conclude that the solution is tachyon-free and stable under scalar perturbations for F_1(y) > 0 but unstable for F_1(y) < 0. The stability of a known analytic domain wall solution with the warp factor given by a(y) = sech^{3/4p}(ky) is analyzed, and it is shown that only the solution for 0
Argoball: A Dynamic-Rules Game for Teaching Striking-and-Fielding Game Tactics
ERIC Educational Resources Information Center
Todorovich, John R.; Fox, James P.; Ryan, Stuart; Todorovich, Sarah W.
2008-01-01
Physical educators using the teaching games for understanding (TGFU) approach employ modified game forms to teach children skills and tactics. Inspired by Danish Longball, "Argoball" is a dynamic-rules game that teachers interested in the TGFU model use to help their students develop effective skills and tactics to better participate in…
Pokémon Go: An Unexpected Inspiration for Next Generation Learning Environments
ERIC Educational Resources Information Center
Nigaglioni, Irene
2017-01-01
Although mobile applications and games often seem isolating and somewhat stationary, last year's augmented reality (AR) gaming craze Pokémon Go demonstrated how technology has the potential to promote socialization, collaboration, and physical activity while still engaging users. Pokémon Go's use of AR technology, which superimposes…
Global Warning: Project-Based Science Inspired by the Intergovernmental Panel on Climate Change
ERIC Educational Resources Information Center
Colaianne, Blake
2015-01-01
Misconceptions about climate change are common, which suggests a need to effectively address the subject in the classroom. This article describes a project-based science activity in which students report on the physical basis, adaptations, and mitigation of this global problem, adapting the framework of the United Nations' Intergovernmental Panel…
ERIC Educational Resources Information Center
Dryburgh, Anne; Fortin, Sylvie
2010-01-01
The aim of this qualitative study was to investigate professional ballet dancers' perceptions of the impact of surveillance on their psychological and physical health. The theoretical framework was inspired by Foucault's writing, particularly his concepts of surveillance, power, discipline and docile bodies. Fifteen professional ballet dancers…
Malcolm, Alan D B
2002-04-01
The books of Jean Henri Fabre, replete with lively accounts of his observations on nature, inspired generations of children from all over the world. The detail in his study of insects and the entertaining presentation allowed readers to absorb his fascination. Yet he was a physics teacher by profession and virtually self-taught in matters of entomology.
ERIC Educational Resources Information Center
Smith, Shaunna
2013-01-01
Digital fabrication consists of manufacturing design technology that is used to facilitate the creation of physical objects. Existing research suggests digital fabrication technology can inspire student creativity and innovation in mathematics and science. However, there is a lack of research that informs teacher education by identifying practical…
Biologically Inspired Purification and Dispersion of SWCNTs
NASA Technical Reports Server (NTRS)
Feeback, Daniel L.; Clarke, Mark S.; Nikolaev, Pavel
2009-01-01
A biologically inspired method has been developed for (1) separating single-wall carbon nanotubes (SWCNTs) from other materials (principally, amorphous carbon and metal catalysts) in raw production batches and (2) dispersing the SWCNTs as individual particles (in contradistinction to ropes and bundles) in suspension, as required for a number of applications. Prior methods of purification and dispersal of SWCNTs involve, variously, harsh physical processes (e.g., sonication) or harsh chemical processes (e.g., acid reflux). These processes do not completely remove the undesired materials and do not disperse bundles and ropes into individual suspended SWCNTs. Moreover, these processes cut long SWCNTs into shorter pieces, yielding typical nanotube lengths between 150 and 250 nm. In contrast, the present method does not involve harsh physical or chemical processes. The method involves the use of biologically derived dispersal agents (BDDAs) in an aqueous solution that is mechanically homogenized (but not sonicated) and centrifuged. The dense solid material remaining after centrifugation is resuspended by vortexing in distilled water, yielding an aqueous suspension of individual, separated SWCNTs having lengths from about 10 to about 15 microns.
An Experimental Investigation on Bio-inspired Icephobic Coatings for Aircraft Icing Mitigation
NASA Astrophysics Data System (ADS)
Hu, Hui; Li, Haixing; Waldman, Rye
2016-11-01
By leveraging the Icing Research Tunnel available at Iowa State University (ISU-IRT), a series of experimental investigations were conducted to elucidate the underlying physics pertinent to aircraft icing phenomena. A suite of advanced flow diagnostic techniques, which include high-speed photographic imaging, digital image projection (DIP), and infrared (IR) imaging thermometry, were developed and applied to quantify the transient behavior of water droplet impingement, wind-driven surface water runback, unsteady heat transfer and dynamic ice accreting process over the surfaces of airfoil/wing models. The icephobic performance of various bio-inspired superhydrophobic coatings were evaluated quantitatively at different icing conditions. The findings derived from the icing physics studies can be used to improve current icing accretion models for more accurate prediction of ice formation and accretion on aircraft wings and to develop effective anti-/deicing strategies for safer and more efficient operation of aircraft in cold weather. The research work is partially supported by NASA with Grant Number NNX12AC21A and National Science Foundation under Award Numbers of CBET-1064196 and CBET-1435590.
NASA Astrophysics Data System (ADS)
Hausman, Daniel M.
Causation is a frustrating subject. Suppose one begins with some promising idea such as that causation is counterfactual dependence or statistical relevance. One then develops this idea with care and intelligence, revises and improves it to cope with criticisms, and by the time one is finished, sane people will be looking elsewhere. If one wants conclusive reasons to reject the counterfactual theory of causation, one can do no better than to read Lewis' (1986) many postscripts. If one wants the best refutation of a probabilistic theory of causation, then one should read my colleague, Ellery Eells' (1991) magisterial defense. In Physical Causation, Phil Dowe performs the same service for physical process/interaction theories of causation.
NASA Astrophysics Data System (ADS)
Smith, David; Schuldt, Carsten; Lorenz, Jessica; Tschirner, Teresa; Moebius-Winkler, Maximilian; Kaes, Josef; Glaser, Martin; Haendler, Tina; Schnauss, Joerg
2015-03-01
Biologically evolved materials often serve as inspiration for the development of new materials and as subjects for examining the underlying physical principles governing their behavior. For instance, the biopolymer constituents of the highly dynamic cellular cytoskeleton, such as actin, have inspired a deep understanding of soft polymer-based materials. However, the molecular toolbox provided by biological systems has been evolutionarily optimized to carry out the necessary functions of cells, and the inability to modify basic properties such as biopolymer stiffness hinders a meticulous examination of parameter space. Using actin as inspiration, we circumvent these limitations using model systems assembled from programmable materials such as DNA. Nanorods with dimensions and mechanical properties comparable to those of actin, but controllable, can be constructed from small sets of specially designed DNA strands. In entangled gels, these allow us to systematically determine the dependence of network mechanical properties on parameters such as persistence length and crosslink strength. At higher concentrations in the presence of local attractive forces, we see a transition to highly ordered bundled and ``aster'' phases similar to those previously characterized in systems of actin or microtubules.
Nanofluidics in two-dimensional layered materials: inspirations from nature.
Gao, Jun; Feng, Yaping; Guo, Wei; Jiang, Lei
2017-08-29
With the advance of chemistry, materials science, and nanotechnology, significant progress has been achieved in the design and application of synthetic nanofluidic devices and materials, mimicking the gating, rectifying, and adaptive functions of biological ion channels. Fundamental physics and chemistry behind these novel transport phenomena on the nanoscale have been explored in depth on single-pore platforms. However, toward real-world applications, one major challenge is to extrapolate these single-pore devices into macroscopic materials. Recently, inspired partially by the layered microstructure of nacre, the material design and large-scale integration of artificial nanofluidic devices have stepped into a completely new stage, termed 2D nanofluidics. Unique advantages of the 2D layered materials have been found, such as facile and scalable fabrication, high flux, efficient chemical modification, tunable channel size, etc. These features enable wide applications in, for example, biomimetic ion transport manipulation, molecular sieving, water treatment, and nanofluidic energy conversion and storage. This review highlights the recent progress, current challenges, and future perspectives in this emerging research field of "2D nanofluidics", with emphasis on the thought of bio-inspiration.
Granacher, Urs; Muehlbauer, Thomas; Gollhofer, Albert; Kressig, Reto W; Zahner, Lukas
2011-01-01
The risk of sustaining a fall and fall-related injuries is particularly high in children and seniors, which is why there is a need to develop fall-preventive intervention programs. An intergenerational approach to balance and strength promotion appears to have great potential because it is specifically tailored to the physical, social and behavioural needs of children and seniors. Burtscher and Kopp [Gerontology, DOI: 10.1159/000322930] raised the question of whether our previously published mini-review is evidence-based or evidence-inspired. These authors postulate that we did not follow a 4-stage conceptual model for the development of injury- and/or fall-preventive intervention programs. In response to this criticism, we present information from the mini-review that complies with the 4-stage model, incorporating evidence-based and evidence-inspired components. We additionally provide information on how to implement an intergenerational balance and resistance training approach in a school setting, based on a study that is currently being conducted. Copyright © 2010 S. Karger AG, Basel.
Physics Education activities sponsored by LAPEN
NASA Astrophysics Data System (ADS)
Mora Ley, Cesar E.
2007-05-01
In this work we present the first activities of the Latin-American Physics Education Network (LAPEN), organized by representatives of Brazil, Cuba, Mexico, Argentina, Colombia, Uruguay, Peru, and Spain. These activities include seminars, congresses, postgraduate programs on physics education, and several publications. The creation of LAPEN was inspired and endorsed by members of the International Commission on Physics Education (ICPE) of the International Union of Pure and Applied Physics. LAPEN was constituted at the International Meeting on Teaching Physics and Training Teachers (RIEFEP 2005), held in Matanzas, Cuba, in November 2005. Its creation was also endorsed by the General Assembly of the IX Inter-American Conference on Physics Education, held in San José, Costa Rica, from 3 to 7 July 2006, and by the ICPE Committee at the International Conference on Physics Education 2006 in Tokyo, Japan. LAPEN has a Coordinating Committee comprising a President, a Vice-President, and an Executive Secretary.
VizieR Online Data Catalog: A catalog of exoplanet physical parameters (Foreman-Mackey+, 2014)
NASA Astrophysics Data System (ADS)
Foreman-Mackey, D.; Hogg, D. W.; Morton, T. D.
2017-05-01
The first ingredient for any probabilistic inference is a likelihood function, a description of the probability of observing a specific data set given a set of model parameters. In this particular project, the data set is a catalog of exoplanet measurements and the model parameters are the values that set the shape and normalization of the occurrence rate density. (2 data files).
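As an illustration of the "first ingredient" described above, a log-likelihood over binned catalog counts can be sketched with a Poisson model. This is a generic example, not the paper's actual likelihood; the bin rates and counts are hypothetical:

```python
import math

def log_likelihood(rate, counts):
    """Poisson log-likelihood of observed catalog counts per bin,
    given model occurrence rates per bin (illustrative)."""
    total = 0.0
    for lam, k in zip(rate, counts):
        # log Poisson pmf: k*log(lam) - lam - log(k!)
        total += k * math.log(lam) - lam - math.lgamma(k + 1)
    return total
```

Changing the model parameters changes the per-bin rates, and maximizing (or sampling) this function over those parameters is what sets the shape and normalization of the inferred occurrence rate density.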
Skill Assessment for Coupled Biological/Physical Models of Marine Systems
2009-01-01
...cluster analysis (e.g., Clark and Corley, 2006) and shown that the dimensions of the problem can be reduced and multivariate and univariate goodness... information; a follow-up analysis (Arhonditsis et al., 2006) reported no relationship between the level of skill assessment presented or the accuracy of the... uncertainty analysis (Beck, 1987), model selection (Kass and Raftery, 1995), model averaging (Hoeting et al., 1999), and scores for probabilistic...
Statistical physics of medical diagnostics: Study of a probabilistic model.
Mashaghi, Alireza; Ramezanpour, Abolfazl
2018-03-01
We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
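The sign-disease posterior described above can be illustrated on a toy model. The sketch below uses exact enumeration over disease configurations with a noisy-OR sign model; both choices are assumptions for illustration, since the paper itself uses a mean-field approximation, for which this brute-force version is only a small-instance stand-in:

```python
from itertools import product

def posterior_disease(prior, w, leak, observed):
    """Exact posterior P(disease_d = 1 | observed signs) by enumeration
    over all disease configurations (feasible only for small models).
    prior[d]: prior prob. of disease d; w[d][s]: prob. disease d causes
    sign s; leak[s]: prob. sign s appears with no disease (noisy-OR);
    observed: {sign_index: True/False}."""
    n_d = len(prior)
    num = [0.0] * n_d
    z = 0.0
    for config in product([0, 1], repeat=n_d):
        p = 1.0
        for d, x in enumerate(config):
            p *= prior[d] if x else (1 - prior[d])
        for s, obs in observed.items():
            q = 1 - leak[s]                      # prob. sign stays absent
            for d, x in enumerate(config):
                if x:
                    q *= (1 - w[d][s])
            p_sign = 1 - q
            p *= p_sign if obs else (1 - p_sign)
        z += p
        for d, x in enumerate(config):
            if x:
                num[d] += p
    return [n / z for n in num]
```

On larger models this enumeration is exponential in the number of diseases, which is exactly why approximate inference schemes such as the mean-field approach, and the quality of their macroscopic states, matter for diagnostic performance.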
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Zumwalt, Hannah Ruth; Clark, Andrew Jordan
2016-03-01
Hydrogen Risk Assessment Models (HyRAM) is a prototype software toolkit that integrates data and methods relevant to assessing the safety of hydrogen fueling and storage infrastructure. The HyRAM toolkit integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, and characterizing the impact of hydrogen hazards, including thermal effects from jet fires and pressure effects from deflagration. HyRAM version 1.0 incorporates generic probabilities of equipment failure for nine types of components, probabilistic models for the impact of heat flux on humans and structures, and computationally and experimentally validated models of various aspects of gaseous hydrogen release and flame physics. This document provides an example of how to use HyRAM to conduct analysis of a fueling facility. It guides users through the software and shows how to enter and edit inputs that are specific to the user-defined facility. A description of the methodology and models contained in HyRAM is provided in [1]. This User's Guide is intended to capture the main features of HyRAM version 1.0 (any HyRAM version numbered 1.0.X.XXX) and was created with HyRAM 1.0.1.798. Due to ongoing software development activities, newer versions of HyRAM may differ from this guide.
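The kind of quantitative-risk-assessment arithmetic a toolkit like HyRAM automates can be sketched in a few lines. The component counts, leak frequencies, and conditional probabilities below are placeholders for illustration, not HyRAM's built-in data:

```python
def annual_leak_frequency(components):
    """System leak frequency = sum over component types of
    (count * per-component annual leak frequency).
    Illustrative QRA arithmetic with placeholder values."""
    return sum(count * freq for count, freq in components)

def scenario_risk(leak_freq, p_ignition, p_harm):
    """Expected harm events per year for one scenario:
    leak frequency x conditional ignition prob. x conditional harm prob."""
    return leak_freq * p_ignition * p_harm
```

A full analysis would repeat this per scenario (jet fire, deflagration, etc.) and feed the physical-effects models (heat flux, overpressure) into the conditional harm probabilities.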
Aono, Masashi; Naruse, Makoto; Kim, Song-Ju; Wakabayashi, Masamitsu; Hori, Hirokazu; Ohtsu, Motoichi; Hara, Masahiko
2013-06-18
Biologically inspired computing devices and architectures are expected to overcome the limitations of conventional technologies in terms of solving computationally demanding problems, adapting to complex environments, reducing energy consumption, and so on. We previously demonstrated that a primitive single-celled amoeba (a plasmodial slime mold), which exhibits complex spatiotemporal oscillatory dynamics and sophisticated computing capabilities, can be used to search for a solution to a very hard combinatorial optimization problem. We successfully extracted the essential spatiotemporal dynamics by which the amoeba solves the problem. This amoeba-inspired computing paradigm can be implemented by various physical systems that exhibit suitable spatiotemporal dynamics resembling the amoeba's problem-solving process. In this Article, we demonstrate that photoexcitation transfer phenomena in certain quantum nanostructures mediated by optical near-field interactions generate the amoebalike spatiotemporal dynamics and can be used to solve the satisfiability problem (SAT), which is the problem of judging whether a given logical proposition (a Boolean formula) is self-consistent. SAT is related to diverse application problems in artificial intelligence, information security, and bioinformatics and is a crucially important nondeterministic polynomial time (NP)-complete problem, which is believed to become intractable for conventional digital computers when the problem size increases. We show that our amoeba-inspired computing paradigm dramatically outperforms a conventional stochastic search method. These results indicate the potential for developing highly versatile nanoarchitectonic computers that realize powerful solution searching with low energy consumption.
Geology and Design: Formal and Rational Connections
NASA Astrophysics Data System (ADS)
Eriksson, S. C.; Brewer, J.
2016-12-01
Geological forms and the manmade environment have always been inextricably linked. From the time that Upper Paleolithic humans created drawings in the Lascaux Caves in the southwest of France, geology has provided a critical and dramatic foil for human creativity. This inspiration has manifested itself in many different ways, and the history of architecture is rife with examples of geologically derived buildings. During the early 20th century, German Expressionist art and architecture were heavily influenced by the natural and often translucent quality of minerals. Architects like Bruno Taut drew and built crystalline forms that would go on to inspire the more restrained Bauhaus movement. Even within the context of contemporary architecture, geology has been a fertile source of inspiration. Architectural practices across the globe leverage the rationality and grounding found in geology to inform a process that is otherwise dominated by computer-driven parametric design. The connection between advanced design technology and beautifully realized natural geological forms ensures that geology will remain a relevant source of architectural inspiration well into the 21st century. The sometimes hidden relationship of geology to the various sub-disciplines of Design such as Architecture, Interiors, Landscape Architecture, and Historic Preservation is explored in relation to curriculum and the practice of design. Topics such as materials, form, history, the cultural and physical landscape, natural hazards, and global design enrich and inform curriculum across the college. Commonly, these help define place-based education.
Strasser, Michael; Theis, Fabian J.; Marr, Carsten
2012-01-01
A toggle switch consists of two genes that mutually repress each other. This regulatory motif is active during cell differentiation and is thought to act as a memory device, being able to choose and maintain cell fate decisions. Commonly, this switch has been modeled in a deterministic framework where transcription and translation are lumped together. In this description, bistability occurs for transcription factor cooperativity, whereas autoactivation leads to a tristable system with an additional undecided state. In this contribution, we study the stability and dynamics of a two-stage gene expression switch within a probabilistic framework inspired by the properties of the Pu/Gata toggle switch in myeloid progenitor cells. We focus on low mRNA numbers, high protein abundance, and monomeric transcription-factor binding. Contrary to the expectation from a deterministic description, this switch shows complex multiattractor dynamics without autoactivation and cooperativity. Most importantly, the four attractors of the system, which only emerge in a probabilistic two-stage description, can be identified with committed and primed states in cell differentiation. To begin, we study the dynamics of the system and infer the mechanisms that move the system between attractors using both the quasipotential and the probability flux of the system. Next, we show that the residence times of the system in one of the committed attractors are geometrically distributed. We derive an analytical expression for the parameter of the geometric distribution, therefore completely describing the statistics of the switching process and elucidate the influence of the system parameters on the residence time. Moreover, we find that the mean residence time increases linearly with the mean protein level. This scaling also holds for a one-stage scenario and for autoactivation. 
Finally, we study the implications of this distribution for the stability of a switch and discuss the influence of the stability on a specific cell differentiation mechanism. Our model explains lineage priming and proposes the need of either high protein numbers or long-term modifications such as chromatin remodeling to achieve stable cell fate decisions. Notably, we present a system with high protein abundance that nevertheless requires a probabilistic description to exhibit multistability, complex switching dynamics, and lineage priming. PMID:22225794
Probabilistic simulation of stress concentration in composite laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, L.
1993-01-01
A computational methodology is described to probabilistically simulate stress concentration factors in composite laminates. This new approach couples probabilistic composite mechanics with probabilistic finite element structural analysis. The probabilistic composite mechanics describes the uncertainties inherent in composite material properties, while the probabilistic finite element analysis describes the uncertainties associated with the methods used to experimentally evaluate stress concentration factors, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the stress concentration factors in laminates made from three different composite systems. Simulated results match experimental data for probability density and cumulative distribution functions. The sensitivity factors indicate that the stress concentration factors are influenced by local stiffness variables, load eccentricities, and initial stress fields.
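The propagation of uncertain inputs to a stress concentration factor can be illustrated with a minimal Monte Carlo sketch. The closed-form response and the distributions below are hypothetical stand-ins for the actual probabilistic composite mechanics/finite element chain:

```python
import random
import statistics

def simulate_scf(n=5000, seed=1):
    """Monte Carlo sketch: propagate scatter in a local stiffness ratio and
    load eccentricity into a stress concentration factor (SCF).
    The response kt = 3/e_ratio + 10*ecc is an invented surrogate, not
    an actual laminate model; distributions are likewise illustrative."""
    random.seed(seed)
    samples = []
    for _ in range(n):
        e_ratio = random.gauss(1.0, 0.05)   # local/far-field stiffness ratio
        ecc = abs(random.gauss(0.0, 0.02))  # normalized load eccentricity
        kt = 3.0 / e_ratio + 10.0 * ecc     # hypothetical SCF response
        samples.append(kt)
    return statistics.mean(samples), statistics.stdev(samples)
```

The resulting sample of SCF values yields the probability density and cumulative distribution functions compared against experiment, and perturbing one input at a time gives the sensitivity factors mentioned above.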
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are included in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
PlayPhysics: An Emotional Games Learning Environment for Teaching Physics
NASA Astrophysics Data System (ADS)
Muñoz, Karla; Kevitt, Paul Mc; Lunney, Tom; Noguez, Julieta; Neri, Luis
To ensure learning, game-based learning environments must incorporate assessment mechanisms, e.g. Intelligent Tutoring Systems (ITSs). ITSs are focused on recognising and influencing the learner's emotional or motivational states. This research focuses on designing and implementing an affective student model for intelligent gaming, which reasons about the learner's emotional state from cognitive and motivational variables using observable behaviour. A Probabilistic Relational Models (PRMs) approach is employed to derive Dynamic Bayesian Networks (DBNs). The model uses the Control-Value theory of 'achievement emotions' as a basis. A preliminary test was conducted to recognise the students' prospective-outcome emotions with results presented and discussed. PlayPhysics is an emotional games learning environment for teaching Physics. Once the affective student model proves effective it will be incorporated into PlayPhysics' architecture. The design, evaluation and postevaluation of PlayPhysics are also discussed. Future work will focus on evaluating the affective student model with a larger population of students, and on providing affective feedback.
Watt, Jennifer C.; Grove, George A.; Wollam, Mariegold E.; Uyar, Fatma; Mataro, Maria; Cohen, Neal J.; Howard, Darlene V.; Howard, James H.; Erickson, Kirk I.
2016-01-01
Accumulating evidence suggests that physical activity improves explicit memory and executive cognitive functioning at the extreme ends of the lifespan (i.e., in older adults and children). However, it is unknown whether these associations hold for younger adults who are considered to be in their cognitive prime, or for implicit cognitive functions that do not depend on motor sequencing. Here we report the results of a study in which we examine the relationship between objectively measured physical activity and (1) explicit relational memory, (2) executive control, and (3) implicit probabilistic sequence learning in a sample of healthy, college-aged adults. The main finding was that physical activity was positively associated with explicit relational memory and executive control (replicating previous research), but negatively associated with implicit learning, particularly in females. These results raise the intriguing possibility that physical activity upregulates some cognitive processes, but downregulates others. Possible implications of this pattern of results for physical health and health habits are discussed. PMID:27584059
NASA Astrophysics Data System (ADS)
Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali
2016-08-01
The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50% hazard levels for all sites studied in the Marmara region.
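The final step of such an analysis, turning simulated ground motions plus scenario occurrence rates into exceedance probabilities, can be sketched as follows. The scenario rates and peak ground acceleration (PGA) values are invented for illustration, and a Poisson occurrence model is assumed:

```python
import math

def exceedance_probability(scenarios, pga_threshold, years=50):
    """Hazard-curve sketch: each scenario is (annual occurrence rate,
    list of simulated PGAs, one per rupture realization).
    Annual exceedance rate = sum of rate * fraction of realizations
    above the threshold; converted to a probability of exceedance
    over `years` via a Poisson occurrence model."""
    annual_rate = 0.0
    for rate, pgas in scenarios:
        frac = sum(1 for p in pgas if p > pga_threshold) / len(pgas)
        annual_rate += rate * frac
    return 1.0 - math.exp(-annual_rate * years)
```

Evaluating this over a grid of thresholds produces the hazard curve from which fixed-probability (e.g. 2% or 10% in 50 years) ground motion levels are read off.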
Li, Mi; Li, Haichang; Li, Xiangguang; Zhu, Hua; Xu, Zihui; Liu, Lianqing; Ma, Jianjie; Zhang, Mingjun
2017-07-12
Biopolymeric hydrogels have drawn increasing research interest in biomaterials due to their tunable physical and chemical properties, both for creating bioactive cellular microenvironments and for serving as sustainable therapeutic reagents. Inspired by a naturally occurring hydrogel secreted by the carnivorous Sundew plant for trapping insects, here we have developed a bioinspired hydrogel to deliver mitsugumin 53 (MG53), an important protein in cell membrane repair, for chronic wound healing. Both the chemical composition and micro-/nanomorphological properties inherent to the natural Sundew hydrogel were mimicked using sodium alginate and gum arabic with calcium ion-mediated cross-linking. On the basis of atomic force microscopy (AFM) force measurements, an optimal sticky hydrogel scaffold was obtained through orthogonal experimental design. Imaging and mechanical analysis showed a distinct correlation between the structural morphology, adhesion characteristics, and mechanical properties of the Sundew-inspired hydrogel. Combined characterization and biochemistry techniques were utilized to uncover the underlying molecular composition involved in the interactions between hydrogel and protein. In vitro drug release experiments confirmed that the Sundew-inspired hydrogel had biphasic release kinetics, which can facilitate both fast delivery of MG53 for improving the reepithelialization process of wounds and sustained release of the protein for treating chronic wounds. In vivo experiments showed that the Sundew-inspired hydrogel encapsulating rhMG53 could facilitate dermal wound healing in a mouse model. Together, these studies confirmed that the Sundew-inspired hydrogel has both tunable micro-/nanostructures and physicochemical properties, which enable its use as a delivery vehicle for chronic wound healing. The research may provide a new way to develop biocompatible and tunable biomaterials for sustainable drug release to meet the needs of biological activities.
Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons.
Probst, Dimitri; Petrovici, Mihai A; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz
2015-01-01
The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems.
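The target distribution family in this framework, Boltzmann-type distributions over binary random variables, can be sampled with an abstract Gibbs sampler like the sketch below. This is the reference dynamics that the LIF network is shown to approximate, not the neuron model itself; the weights and biases are arbitrary illustrative parameters:

```python
import math
import random

def gibbs_sample(weights, biases, n_steps=20000, seed=0):
    """Gibbs sampler over binary variables z_k in {0,1} with
    p(z) proportional to exp(sum_k b_k z_k + sum_{k<l} W_kl z_k z_l),
    where `weights` is a symmetric matrix with zero diagonal.
    Returns the empirical marginal P(z_k = 1) for each variable."""
    random.seed(seed)
    n = len(biases)
    z = [0] * n
    counts = [0] * n
    for _ in range(n_steps):
        k = random.randrange(n)
        # Conditional log-odds of z_k = 1 given the rest of the state.
        u = biases[k] + sum(weights[k][l] * z[l] for l in range(n) if l != k)
        z[k] = 1 if random.random() < 1.0 / (1.0 + math.exp(-u)) else 0
        for i in range(n):
            counts[i] += z[i]
    return [c / n_steps for c in counts]
```

In the spiking implementation, the conditional update above is replaced by the membrane dynamics of a leaky integrate-and-fire neuron, with synaptic couplings playing the role of the W entries.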
A probabilistic model of overt visual attention for cognitive robots.
Begum, Momotaz; Karray, Fakhri; Mann, George K I; Gosine, Raymond G
2010-10-01
Visual attention is one of the major requirements for a robot to serve as a cognitive companion for humans. Robotic visual attention is mostly concerned with overt attention, which accompanies the head and eye movements of a robot. In this case, each movement of the camera head triggers a number of events, namely transformation of the camera and image coordinate systems, change of content of the visual field, and partial appearance of stimuli. All of these events reduce the probability of meaningful identification of the next focus of attention. These events are specific to overt attention with head movement and, therefore, their effects are not addressed in classical models of covert visual attention. This paper proposes a Bayesian model as a robot-centric solution to the overt visual attention problem. The proposed model, while taking inspiration from the primate visual attention mechanism, guides a robot to direct its camera toward behaviorally relevant and/or visually demanding stimuli. A particle filter implementation of this model addresses the challenges involved in overt attention with head movement. Experimental results demonstrate the performance of the proposed model.
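A bootstrap particle filter of the kind such a model employs can be sketched in one dimension: a generic predict-weight-resample cycle with Gaussian motion and observation models. The noise parameters and the 1-D state are illustrative simplifications, not the paper's actual formulation:

```python
import math
import random

def particle_filter_step(particles, observation, motion_sd=0.1, obs_sd=0.2):
    """One predict-weight-resample cycle of a bootstrap particle filter
    over a 1-D state (an illustrative analogue of tracking a focus of
    attention under camera motion)."""
    # Predict: diffuse each particle with the motion model.
    moved = [p + random.gauss(0.0, motion_sd) for p in particles]
    # Weight: Gaussian likelihood of the observation given each particle.
    weights = [math.exp(-0.5 * ((observation - p) / obs_sd) ** 2)
               for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportional to the weights.
    return random.choices(moved, weights=weights, k=len(moved))
```

Iterating this step concentrates the particle cloud around the observed stimulus location, which is the mechanism by which the filter keeps a usable posterior despite the coordinate changes caused by each head movement.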
A New Intrusion Detection Method Based on Antibody Concentration
NASA Astrophysics Data System (ADS)
Zeng, Jie; Li, Tao; Li, Guiyang; Li, Haibo
An antibody is a protein that fights against harmful antigens in the human immune system. In modern medical examination, the health status of a human body can be diagnosed by detecting the intrusion intensity of a specific antigen and the concentration indicator of the corresponding antibody in the body's serum. In this paper, inspired by the principle of antigen-antibody reactions, we present a New Intrusion Detection Method Based on Antibody Concentration (NIDMBAC) to reduce the false alarm rate without affecting the detection rate. In our proposed method, the basic definitions of self, nonself, antigen, and detector in the intrusion detection domain are given. Then, according to the antigen intrusion intensity, the change in antibody number is recorded during the process of clone proliferation for detectors, based on classified antigen recognition. Finally, building on the above, a probabilistic calculation method for intrusion alarm production, based on the correlation between antigen intrusion intensity and antibody concentration, is proposed. Our theoretical analysis and experimental results show that our proposed method performs better than traditional methods.
NASA Astrophysics Data System (ADS)
Chen, L. Leon; Ulmer, Stephan; Deisboeck, Thomas S.
2010-01-01
We present an application of a previously developed agent-based glioma model (Chen et al 2009 Biosystems 95 234-42) for predicting spatio-temporal tumor progression using a patient-specific MRI lattice derived from apparent diffusion coefficient (ADC) data. Agents representing collections of migrating glioma cells are initialized based upon voxels at the outer border of the tumor identified on T1-weighted (Gd+) MRI at an initial time point. These simulated migratory cells exhibit a specific biologically inspired spatial search paradigm, representing a weighting of the differential contribution from haptotactic permission and biomechanical resistance on the migration decision process. ADC data from 9 months after the initial tumor resection were used to select the best search paradigm for the simulation, which was initiated using data from 6 months after the initial operation. Using this search paradigm, 100 simulations were performed to derive a probabilistic map of tumor invasion locations. The simulation was able to successfully predict a recurrence in the dorsal/posterior aspect long before it was depicted on T1-weighted MRI, 18 months after the initial operation.
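Aggregating an ensemble of stochastic runs into a probabilistic invasion map, as in the 100-simulation procedure described above, reduces to per-voxel counting. A minimal sketch, with a caller-supplied `simulate_run` standing in for the agent-based glioma model (the function name and grid layout are assumptions for illustration):

```python
import random

def invasion_probability_map(n_runs, width, height, simulate_run, seed=0):
    """Run `simulate_run` n_runs times and return, for each voxel on a
    width x height grid, the fraction of runs in which it was invaded.
    `simulate_run` is any zero-argument function returning a set of
    invaded (x, y) voxel coordinates for one stochastic realization."""
    random.seed(seed)
    counts = [[0] * width for _ in range(height)]
    for _ in range(n_runs):
        for (x, y) in simulate_run():
            counts[y][x] += 1
    return [[c / n_runs for c in row] for row in counts]
```

Thresholding or color-mapping the resulting per-voxel frequencies gives the probabilistic map of likely recurrence locations that is compared against follow-up MRI.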
Probabilistic inference in discrete spaces can be implemented into networks of LIF neurons
Probst, Dimitri; Petrovici, Mihai A.; Bytschok, Ilja; Bill, Johannes; Pecevski, Dejan; Schemmel, Johannes; Meier, Karlheinz
2015-01-01
The means by which cortical neural networks are able to efficiently solve inference problems remains an open question in computational neuroscience. Recently, abstract models of Bayesian computation in neural circuits have been proposed, but they lack a mechanistic interpretation at the single-cell level. In this article, we describe a complete theoretical framework for building networks of leaky integrate-and-fire neurons that can sample from arbitrary probability distributions over binary random variables. We test our framework for a model inference task based on a psychophysical phenomenon (the Knill-Kersten optical illusion) and further assess its performance when applied to randomly generated distributions. As the local computations performed by the network strongly depend on the interaction between neurons, we compare several types of couplings mediated by either single synapses or interneuron chains. Due to its robustness to substrate imperfections such as parameter noise and background noise correlations, our model is particularly interesting for implementation on novel, neuro-inspired computing architectures, which can thereby serve as a fast, low-power substrate for solving real-world inference problems. PMID:25729361
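The sampling task the framework addresses, drawing from an arbitrary distribution over binary random variables, has a simple non-neural analogue in Gibbs sampling from a Boltzmann distribution. This sketch uses made-up couplings `W` and biases `b` and is not the LIF implementation:

```python
import numpy as np

def gibbs_sample(W, b, n_samples=20000, burn_in=1000, seed=1):
    """Gibbs sampler for a Boltzmann distribution over binary z in {0,1}^K:
    p(z) proportional to exp(0.5 z^T W z + b^T z), W symmetric, zero diagonal."""
    rng = np.random.default_rng(seed)
    K = len(b)
    z = rng.integers(0, 2, size=K)
    samples = np.empty((n_samples, K), dtype=int)
    for t in range(burn_in + n_samples):
        for k in range(K):
            # conditional p(z_k = 1 | rest) is a logistic function of the local field
            field = W[k] @ z - W[k, k] * z[k] + b[k]
            z[k] = rng.random() < 1.0 / (1.0 + np.exp(-field))
        if t >= burn_in:
            samples[t - burn_in] = z
    return samples

# Two variables coupled by W[0,1] > 0 should be "on" together more often
# than independence would predict.
W = np.array([[0.0, 1.5], [1.5, 0.0]])
b = np.array([-0.5, -0.5])
s = gibbs_sample(W, b)
```

For this two-variable example the joint distribution is computable in closed form, so the empirical state frequencies can be checked against exact probabilities.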
MAUVE: A New Strategy for Solving and Grading Physics Problems
NASA Astrophysics Data System (ADS)
Hill, Nicole Breanne
2016-05-01
MAUVE (magnitude, answer, units, variables, and equations) is a framework and rubric to help students and teachers through the process of clearly solving and assessing solutions to introductory physics problems. Success in introductory physics often derives from an understanding of units, a command over dimensional analysis, and good bookkeeping. I developed MAUVE for an introductory-level environmental physics course as an easy-to-remember checklist to help students construct organized and thoughtful solutions to physics problems. Environmental physics is a core physics course for environmental and sustainability science (ESS) majors that teaches principles of radiation, thermodynamics, and mechanics within the context of the environment and sustainable energy systems. ESS student concentrations include environmental biology, applied ecology, biogeochemistry, and natural resources. The MAUVE rubric, inspired by nature, has encouraged my students to produce legible and tactical work, and has significantly clarified the grading process.
Celebrating the physics in geophysics
NASA Astrophysics Data System (ADS)
Davis, Anthony B.; Sornette, Didier
The United Nations' Educational, Scientific and Cultural Organization (UNESCO) declared 2005 the “World Year of Physics” in celebration of the centennial of Einstein's annus mirabilis when, as a junior clerk at the Swiss Patent Office in Berne, he published three papers that changed physics forever by (1) introducing Special Relativity and demonstrating the equivalence of mass and energy (E = mc2), (2) explaining the photoelectric effect with Planck's then-still-new-and-controversial concept of light quanta (E = hv), and (3) investigating the macroscopic phenomenon of Brownian motion using Boltzmann's molecular dynamics (E = kT), still far from fully accepted at the time. The celebration of Einstein's work in physics inspires reflection on the status of geophysics and its relationship with physics, in particular with respect to great discoveries.
NASA Astrophysics Data System (ADS)
Levrini, Olivia; De Ambrosis, Anna; Hemmer, Sabine; Laherto, Antti; Malgieri, Massimiliano; Pantano, Ornella; Tasquier, Giulia
2017-03-01
This paper focuses on results of an interview-based survey of first-year university physics students, carried out within the EU Horizons in Physics Education (HOPE) project (http://hopenetwork.eu/). 94 interviews conducted in 13 universities were analyzed to investigate the factors that inspire young people to study physics. In particular, the main motivational factor, which proved to consist of personal interest and curiosity, was unfolded into different categories, and detailed interest profiles were produced. The results are arguably useful for helping academic curriculum developers and teaching personnel in physics departments to provide guidance to students in developing and focusing their interest towards specific sub-fields and/or to design targeted recruitment and outreach initiatives.
The probabilistic nature of preferential choice.
Rieskamp, Jörg
2008-11-01
Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes different choices in nearly identical situations, or why the magnitude of these inconsistencies varies in different situations. To illustrate the advantage of probabilistic theories, three probabilistic theories of decision making under risk are compared with their deterministic counterparts. The probabilistic theories are (a) a probabilistic version of a simple choice heuristic, (b) a probabilistic version of cumulative prospect theory, and (c) decision field theory. By testing the theories with data from three experimental studies, the superiority of the probabilistic models over their deterministic counterparts in predicting people's decisions under risk becomes evident. When testing the probabilistic theories against each other, decision field theory provides the best account of the observed behavior.
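A standard way to turn a deterministic utility theory into a probabilistic one, in the spirit of the models compared above, is a logit (softmax) choice rule; the gambles and sensitivity parameter below are hypothetical:

```python
import math

def expected_value(gamble):
    """Expected value of a gamble given as [(probability, payoff), ...]."""
    return sum(p * x for p, x in gamble)

def logit_choice_prob(gamble_a, gamble_b, sensitivity=1.0):
    """Probability of choosing A over B under a logit (softmax) choice rule.
    sensitivity -> infinity recovers deterministic maximization;
    sensitivity = 0 gives random (50/50) choice."""
    diff = expected_value(gamble_a) - expected_value(gamble_b)
    return 1.0 / (1.0 + math.exp(-sensitivity * diff))

# Hypothetical gambles: A pays 10 with probability 0.5; B pays 4 for sure.
A = [(0.5, 10.0), (0.5, 0.0)]
B = [(1.0, 4.0)]
p = logit_choice_prob(A, B, sensitivity=1.0)  # EV difference = 1.0, p ≈ 0.73
```

The same construction applies to any deterministic value function (e.g. a cumulative-prospect-theory value instead of expected value), which is essentially how the probabilistic variants in the study are built.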
NASA Technical Reports Server (NTRS)
Singhal, Surendra N.
2003-01-01
The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.
Laptops and Diesel Generators: Introducing PhET Simulations to Teachers in Uganda
ERIC Educational Resources Information Center
McKagan, Sam
2010-01-01
This article describes workshops for high school physics teachers in Uganda on inquiry-based teaching and PhET simulations. I hope it increases awareness of the conditions teachers face in developing countries and inspires others to give similar workshops. This work demonstrates what is possible with some concerted, but not extraordinary, effort.
Clinical Study of Student Learning Using Mastery Style versus Immediate Feedback Online Activities
ERIC Educational Resources Information Center
Gladding, Gary; Gutmann, Brianne; Schroeder, Noah; Stelzer, Timothy
2015-01-01
This paper is part of a series of studies to improve the efficacy of online physics homework activities by integrating narrated animated solutions with mastery inspired exercises. In a clinical study using first- and second-year university students, the mastery group attempted question sets in four levels, with animated solutions between each…
Introducing Filters and Amplifiers Using a Two-Channel Light Organ
ERIC Educational Resources Information Center
Zavrel, Erik; Sharpsteen, Eric
2015-01-01
In an era when many students carry iPods, iPhones, and iPads, physics teachers are realizing that in order to continue to inspire and convey the amazing things made possible by a few fundamental principles, they must expand laboratory coverage of electricity and circuits beyond the conventional staples of constructing series and parallel…
It's Time--To Reveal the Whitlam Institute within the University of Western Sydney
ERIC Educational Resources Information Center
Curach, Liz
2005-01-01
The Whitlam Institute within the University of Western Sydney is a centre for public dialogue and progress, with the Whitlam Prime Ministerial Collection inspiring its programs. The collection, both physical and virtual, was established in 2002, drawing upon primary source material made available or donated by the Hon E G Whitlam AC QC, and…
Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones
NASA Astrophysics Data System (ADS)
Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto
2015-04-01
Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code redaction, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main positive aspect of this approach stems from the fact that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects and source characteristics, such as the duration of strong motion and directivity, that could significantly influence the expected motion at the site are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches in selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude less than 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of the approach.
The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billion euros, shows that the geological and geophysical investigations necessary for a reliable deterministic hazard evaluation are largely justified.
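The Poissonian occurrence model underlying PSHA can be made concrete in a few lines. This sketch only encodes the textbook exceedance relation P = 1 - exp(-lambda*t), not the full hazard integration over sources and attenuation:

```python
import math

def exceedance_probability(annual_rate, years):
    """Poissonian PSHA: probability of at least one exceedance of a given
    ground-motion level within `years`, given its mean annual rate."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_target(prob, years):
    """Inverse relation: annual rate implied by a target exceedance probability."""
    return -math.log(1.0 - prob) / years

# The common design benchmark: 10% probability of exceedance in 50 years
# corresponds to a return period of roughly 475 years.
rate = rate_for_target(0.10, 50.0)
return_period = 1.0 / rate  # ≈ 475 years
```

This is the sense in which PSHA is "Poissonian": earthquake occurrences are treated as memoryless in time, which is exactly the assumption the deterministic approach discussed above does not need.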
A Novel Method for Satellite Maneuver Prediction
NASA Astrophysics Data System (ADS)
Shabarekh, C.; Kent-Bryant, J.; Keselman, G.; Mitidis, A.
2016-09-01
A space operations tradecraft consisting of detect-track-characterize-catalog is insufficient for maintaining Space Situational Awareness (SSA) as space becomes increasingly congested and contested. In this paper, we apply analytical methodology from the Geospatial-Intelligence (GEOINT) community to a key challenge in SSA: predicting where and when a satellite may maneuver in the future. We developed a machine learning approach to probabilistically characterize Patterns of Life (PoL) for geosynchronous (GEO) satellites. PoL are repeatable, predictable behaviors that an object exhibits within a context and that are driven by spatio-temporal, relational, environmental and physical constraints. An example of PoL is station-keeping maneuvers in GEO, which become generally predictable as the satellite re-positions itself to account for orbital perturbations. In an earlier publication, we demonstrated the ability to probabilistically predict maneuvers of the Galaxy 15 (NORAD ID: 28884) satellite with high confidence eight days in advance of the actual maneuver. Additionally, we were able to detect deviations from expected PoL within hours of the predicted maneuver [6]. This was done with a custom unsupervised machine learning algorithm, the Interval Similarity Model (ISM), which learns repeating intervals of maneuver patterns from unlabeled historical observations and then predicts future maneuvers. In this paper, we introduce a supervised machine learning algorithm that works in conjunction with the ISM to produce a probabilistic distribution of when future maneuvers will occur. The supervised approach uses a Support Vector Machine (SVM) to process the orbit state, whereas the ISM processes the temporal intervals between maneuvers and the physics-based characteristics of the maneuvers. This multiple-model approach capitalizes on the mathematical strengths of each respective algorithm while incorporating multiple features and inputs.
Initial findings indicate that the combined approach can predict 70% of maneuver times within 3 days of a true maneuver time and 22% of maneuver times within 24 hours of a maneuver. We have also been able to detect deviations from expected maneuver patterns up to a week in advance.
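The interval idea can be illustrated with a deliberately simplified stand-in: estimate the distribution of past inter-maneuver intervals and project a window for the next maneuver. The history below is hypothetical and this is not the paper's ISM algorithm:

```python
import statistics

def predict_next_maneuver(epochs_days, coverage=1.0):
    """Toy interval model: from past maneuver epochs (in days), estimate the
    mean and spread of inter-maneuver intervals, then predict a window for
    the next maneuver as (last epoch + mean) ± coverage * stdev."""
    intervals = [b - a for a, b in zip(epochs_days, epochs_days[1:])]
    mu = statistics.mean(intervals)
    sigma = statistics.stdev(intervals)
    centre = epochs_days[-1] + mu
    return centre - coverage * sigma, centre + coverage * sigma

# Hypothetical station-keeping history: a burn roughly every 14 days.
history = [0.0, 14.2, 28.1, 42.3, 56.0]
lo, hi = predict_next_maneuver(history)
```

An observed maneuver falling well outside the predicted window would then count as a deviation from the expected pattern of life.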
A Probabilistic Framework for the Validation and Certification of Computer Simulations
NASA Technical Reports Server (NTRS)
Ghanem, Roger; Knio, Omar
2000-01-01
The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trend in the availability of computational resources, as well as the trend in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first- or second-order statistics.
The present representation, in addition to capturing significantly more information than the traditional approach, facilitates the linkage between different interacting stochastic systems as is typically observed in real-life situations.
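The projection onto a basis in a Hilbert space of random variables described above is the core of a polynomial chaos expansion. A minimal sketch using probabilists' Hermite polynomials and Gauss-Hermite quadrature follows; the test function exp(ξ) is chosen because its coefficients are known in closed form (c_k = exp(1/2)/k!):

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pce_coefficients(g, order, n_quad=40):
    """Project g(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials:
    c_k = E[g(xi) He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!.
    Expectations are computed by Gauss-Hermite(e) quadrature, whose weight
    function exp(-x^2/2) matches the standard normal density up to a constant."""
    x, w = He.hermegauss(n_quad)
    w = w / sqrt(2 * pi)  # normalize weights to integrate against N(0,1)
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(x, [0] * k + [1])  # He_k evaluated at the nodes
        coeffs.append(np.sum(w * g(x) * basis) / factorial(k))
    return np.array(coeffs)

# g(xi) = exp(xi): the zeroth coefficient is the mean, exp(0.5) ≈ 1.6487.
c = pce_coefficients(np.exp, order=4)
```

The truncated series sum_k c_k He_k(ξ) then carries the full distributional information of the output, rather than only its first two moments.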
Students’ difficulties in probabilistic problem-solving
NASA Astrophysics Data System (ADS)
Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.
2018-03-01
Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing student errors during problem solving. This research used a qualitative method with a case-study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprised students' probabilistic problem-solving results and recorded interviews regarding their difficulties, analyzed descriptively using the Miles and Huberman steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the problem; second, difficulties in choosing and using appropriate solution strategies; third, difficulties with the computational process. These results suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.
Cicada-inspired cell-instructive nanopatterned arrays
NASA Astrophysics Data System (ADS)
Diu, Ting; Faruqui, Nilofar; Sjöström, Terje; Lamarre, Baptiste; Jenkinson, Howard F.; Su, Bo; Ryadnov, Maxim G.
2014-11-01
Biocompatible surfaces hold the key to a variety of biomedical problems that are directly related to the competition between host-tissue cell integration and bacterial colonisation. A possible solution is seen in the ability of cells to respond uniquely to physical cues on such surfaces, prompting the search for cell-instructive nanoscale patterns. Here we introduce a generic rationale engineered into biocompatible titanium substrates to differentiate cell responses. The rationale is inspired by cicada wing surfaces that display bactericidal nanopillar patterns. The surfaces engineered in this study are titania (TiO2) nanowire arrays that are selectively bactericidal against motile bacteria, while capable of guiding mammalian cell proliferation according to the type of the array. The concept holds promise for clinically relevant materials capable of differential physico-mechanical responses to cellular adhesion.
Check-Up of Planet Earth at the Turn of the Millennium: Anticipated New Phase in Earth Sciences
NASA Technical Reports Server (NTRS)
Kaufman, Y. J.; Ramanathan, V.
1998-01-01
Langley's remarkable solar and lunar spectra collected from Mt. Whitney inspired Arrhenius to develop the first quantitative climate model in 1896. In 1999, NASA's Earth Observing AM Satellite (EOS-AM) will repeat Langley's experiment, but for the entire planet, thus pioneering calibrated spectral observations from space. Conceived in response to real environmental problems, EOS-AM, in conjunction with other international satellite efforts, will fill a major gap in current efforts by providing quantitative global data sets with a resolution of a few kilometers on the physical, chemical and biological elements of the earth system. Thus, like Langley's data, EOS-AM can revolutionize climate research by inspiring a new generation of climate system models and enabling us to assess the human impact on the environment.
Coates, Janine; Vickerman, Philip B
2016-10-01
The London 2012 Olympic and Paralympic Games aimed to deliver a legacy to citizens of the United Kingdom, which included inspiring a generation of young people to participate in sport. This study aimed to understand the legacy of the Paralympic Games for children with disabilities. Eight adolescents (11-16 yr) with physical disabilities were interviewed about their perceptions of the Paralympic Games. Thematic analysis found 3 key themes that further our understanding of the Paralympic legacy. These were Paralympians as role models, changing perceptions of disability, and the motivating nature of the Paralympics. Findings demonstrate that the Games were inspirational for children with disabilities, improving their self-perceptions. This is discussed in relation to previous literature, and core recommendations are made.
Designing collective behavior in a termite-inspired robot construction team.
Werfel, Justin; Petersen, Kirstin; Nagpal, Radhika
2014-02-14
Complex systems are characterized by many independent components whose low-level actions produce collective high-level results. Predicting high-level results given low-level rules is a key open challenge; the inverse problem, finding low-level rules that give specific outcomes, is in general still less understood. We present a multi-agent construction system inspired by mound-building termites, solving such an inverse problem. A user specifies a desired structure, and the system automatically generates low-level rules for independent climbing robots that guarantee production of that structure. Robots use only local sensing and coordinate their activity via the shared environment. We demonstrate the approach via a physical realization with three autonomous climbing robots limited to onboard sensing. This work advances the aim of engineering complex systems that achieve specific human-designed goals.
Wirth, Anne-Gritli; Büssing, Arndt
2016-08-01
In a cross-sectional survey among 213 patients with multiple sclerosis, we intended to analyze their resources of hope, orientation, and inspiration in life, and how these resources are related to health-associated variables, adaptive coping strategies, and life satisfaction. Resources were categorized as Faith (10 %), Family (22 %), Other sources (16 %), and No answer (53 %). These non-respondents were predominantly neither religious nor spiritual (70 % R-S-). Although R-S- persons are a heterogeneous group with varying existential interest, they did not significantly differ from their spiritual/religious counterparts with respect to physical and mental health or life satisfaction, except for an adaptive Reappraisal strategy and Gratitude/Awe.
NASA Astrophysics Data System (ADS)
Andres Araujo, H.; Holt, Carrie; Curtis, Janelle M. R.; Perry, R. I.; Irvine, James R.; Michielsens, Catherine G. J.
2013-08-01
We evaluated the effects of biophysical conditions and hatchery production on the early marine survival of coho salmon Oncorhynchus kisutch in the Strait of Georgia, British Columbia, Canada. Due to a paucity of balanced multivariate ecosystem data, we developed a probabilistic network that integrated physical and ecological data and information from literature, expert opinion, oceanographic models, and in situ observations. This approach allowed us to evaluate alternate hypotheses about drivers of early marine survival while accounting for uncertainties in relationships among variables. Probabilistic networks allow users to explore multiple environmental settings and evaluate the consequences of management decisions under current and projected future states. We found that the zooplankton biomass anomaly, calanoid copepod biomass, and herring biomass were the best indicators of early marine survival. It also appears that concentrating hatchery supplementation during periods of negative PDO and ENSO (Pacific Decadal and El Niño Southern Oscillation respectively), indicative of generally favorable ocean conditions for salmon, tends to increase survival of hatchery coho salmon while minimizing negative impacts on the survival of wild juveniles. Scientists and managers can benefit from the approach presented here by exploring multiple scenarios, providing a basis for open and repeatable ecosystem-based risk assessments when data are limited.
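The kind of discrete probabilistic network used above can be illustrated by exact inference (enumeration) over a toy two-parent model; every probability below is invented for illustration and none is a fitted value from the study:

```python
from itertools import product

# Toy two-parent discrete Bayesian network, loosely inspired by the
# salmon-survival network: P(S | Z, H) with binary "low"/"high" states.
p_z = {"low": 0.4, "high": 0.6}                 # zooplankton biomass anomaly
p_h = {"low": 0.5, "high": 0.5}                 # herring biomass
p_s_given = {                                   # P(survival = "high" | z, h)
    ("low", "low"): 0.1, ("low", "high"): 0.3,
    ("high", "low"): 0.4, ("high", "high"): 0.7,
}

def p_survival_high(evidence=None):
    """P(S = high | evidence) by enumeration over the parent states."""
    evidence = evidence or {}
    num = den = 0.0
    for z, h in product(p_z, p_h):
        if evidence.get("z", z) != z or evidence.get("h", h) != h:
            continue  # state inconsistent with the observed evidence
        w = p_z[z] * p_h[h]
        num += w * p_s_given[(z, h)]
        den += w
    return num / den

prior = p_survival_high()                   # marginal P(S = high)
posterior = p_survival_high({"h": "high"})  # conditioned on high herring biomass
```

Conditioning on favorable indicators raises the survival probability, which is the mechanism by which such a network supports scenario exploration for management decisions.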
Sarma-based key-group method for rock slope reliability analyses
NASA Astrophysics Data System (ADS)
Yarahmadi Bafghi, A. R.; Verdel, T.
2005-08-01
The methods used in conducting static stability analyses have remained pertinent to this day for reasons of both simplicity and speed of execution. The most well-known of these methods for purposes of stability analysis of fractured rock masses is the key-block method (KBM). This paper proposes an extension to the KBM, called the key-group method (KGM), which combines not only individual key-blocks but also groups of collapsible blocks into an iterative and progressive analysis of the stability of discontinuous rock slopes. To take intra-group forces into account, the Sarma method has been implemented within the KGM in order to generate a Sarma-based KGM, abbreviated SKGM. We discuss herein the hypothesis behind this new method, details regarding its implementation, and validation through comparison with results obtained from the distinct element method. Furthermore, as an alternative to deterministic methods, reliability analyses or probabilistic analyses have been proposed to take account of the uncertainty in analytical parameters and models. The FOSM and ASM probabilistic methods could be implemented within the KGM and SKGM framework in order to take account of the uncertainty due to physical and mechanical data (density, cohesion and angle of friction). We then show how such reliability analyses can be introduced into the SKGM to give rise to the probabilistic SKGM (PSKGM) and how it can be used for rock slope reliability analyses.
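For independent inputs, the FOSM method mentioned above reduces to a first-order estimate of the variance of the factor of safety, a reliability index, and a probability of failure. The slope numbers below (mean factor of safety, sensitivities, input scatter) are hypothetical:

```python
import math

def fosm_reliability(mean_fs, partials, sds):
    """First-Order Second-Moment (FOSM) sketch: given the mean factor of
    safety, the partial derivatives dFS/dX_i evaluated at the mean inputs,
    and the standard deviations of the (assumed independent) inputs X_i,
    return the reliability index beta and the probability of failure."""
    var_fs = sum((dfs * sd) ** 2 for dfs, sd in zip(partials, sds))
    beta = (mean_fs - 1.0) / math.sqrt(var_fs)
    # standard normal CDF evaluated at -beta, via the complementary error function
    p_failure = 0.5 * math.erfc(beta / math.sqrt(2))
    return beta, p_failure

# Hypothetical slope: FS = 1.3 at the mean cohesion and friction angle,
# with sensitivities and input standard deviations chosen for illustration.
beta, pf = fosm_reliability(mean_fs=1.3, partials=[0.01, 0.02], sds=[10.0, 5.0])
```

Failure is defined as FS < 1, so beta measures how many standard deviations the mean factor of safety sits above the failure threshold.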
A probabilistic Hu-Washizu variational principle
NASA Technical Reports Server (NTRS)
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
REVIEWS OF TOPICAL PROBLEMS: 21st century: what is life from the perspective of physics?
NASA Astrophysics Data System (ADS)
Ivanitskii, Genrikh R.
2010-07-01
The evolution of the biophysical paradigm over 65 years since the publication in 1944 of Erwin Schrödinger's What is Life? The Physical Aspects of the Living Cell is reviewed. Based on the advances in molecular genetics, it is argued that all the features characteristic of living systems can also be found in nonliving ones. Ten paradoxes in logic and physics are analyzed that allow defining life in terms of a spatial-temporal hierarchy of structures and combinatory probabilistic logic. From the perspective of physics, life can be defined as resulting from a game involving interactions of matter one part of which acquires the ability to remember the success (or failure) probabilities from the previous rounds of the game, thereby increasing its chances for further survival in the next round. This part of matter is currently called living matter.
Shors, Tracey J; Olson, Ryan L; Bates, Marsha E; Selby, Edward A; Alderman, Brandon L
2014-11-01
New neurons are generated in the hippocampus each day and their survival is greatly enhanced through effortful learning (Shors, 2014). The numbers of cells produced can be increased by physical exercise (van Praag, Kempermann, & Gage, 1999). These findings inspired us to develop a clinical intervention for humans known as Mental and Physical Training, or MAP Training. Each session consists of 30 min of mental training with focused-attention meditation (20 min sitting and 10 min walking). Meditation is an effortful training practice that involves learning about the transient nature of thoughts and thought patterns, and acquiring skills to recognize them without necessarily attaching meaning and/or emotions to them. The mental training component is followed by physical training with 30 min of aerobic exercise performed at moderate intensity. During this component, participants learn choreographed dance routines while engaging in aerobic exercise. In a pilot "proof-of-concept" study, we provided supervised MAP Training (2 sessions per week for 8 weeks) to a group of young mothers in the local community who were recently homeless, most of them having previously suffered from physical and sexual abuse, addiction, and depression. Preliminary data suggest that MAP Training improves dependent measures of aerobic fitness (as assessed by maximal rate of oxygen consumed) while decreasing symptoms of depression and anxiety. Similar changes were not observed in a group of recently homeless women who did not participate in MAP Training. It is not currently possible to determine whether new neurons in the human brain increase in number as a result of MAP Training. Rather, these preliminary results of MAP Training illustrate how neuroscientific research can be translated into novel clinical interventions that benefit human health and wellness.
Probabilistic Aeroelastic Analysis of Turbomachinery Components
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.
2004-01-01
A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.
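The Monte Carlo route to a PDF and sensitivity factors can be sketched as follows. The response function and input distributions are invented for illustration and stand in for the aeroelastic solver, which the abstract does not specify in code:

```python
import numpy as np

def monte_carlo_response(n=20000, seed=2):
    """Monte Carlo sketch of a probabilistic aeroelastic assessment:
    sample uncertain design variables, evaluate a made-up stability-margin
    response, and estimate its distribution and simple sensitivity factors
    (correlation of each input with the output)."""
    rng = np.random.default_rng(seed)
    freq = rng.normal(100.0, 5.0, n)       # blade natural frequency [Hz]
    damp = rng.normal(0.02, 0.004, n)      # structural damping ratio
    response = damp * freq / 100.0         # hypothetical stability margin
    sens = {
        "frequency": float(np.corrcoef(freq, response)[0, 1]),
        "damping": float(np.corrcoef(damp, response)[0, 1]),
    }
    return response, sens

resp, sens = monte_carlo_response()
# Damping dominates here because its coefficient of variation (0.2) is
# four times larger than the frequency's (0.05).
```

A histogram of `resp` approximates the response PDF, and the correlation-based sensitivities rank which input uncertainties matter most.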
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing the key competencies in engineering by analysing real-life situation videos - physical problems - by means of video analysis and the modelling tools using the program Tracker and simulations of physical phenomena from The Physics Education Technology (PhET™) Project (VAS method of problem tasks). The statistical testing using the t-test confirmed the significance of the differences in the knowledge of the experimental and control groups, which were the result of interactive method application.
Microgravity Fluids for Biology, Workshop
NASA Technical Reports Server (NTRS)
Griffin, DeVon; Kohl, Fred; Massa, Gioia D.; Motil, Brian; Parsons-Wingerter, Patricia; Quincy, Charles; Sato, Kevin; Singh, Bhim; Smith, Jeffrey D.; Wheeler, Raymond M.
2013-01-01
Microgravity Fluids for Biology represents an intersection of biology and fluid physics that presents exciting research challenges to the Space Life and Physical Sciences Division. Understanding and managing the transport processes and fluid mechanics in physiological and biological systems is essential for future space exploration and the colonization of space by humans. Adequate understanding of the underlying fluid physics and transport mechanisms will provide new, necessary insights and technologies for analyzing and designing biological systems critical to NASA's mission. To enable this mission, the fluid physics discipline needs to work to enhance the understanding of the influence of gravity on the scales and types of fluids (e.g., non-Newtonian) important to biology and the life sciences. In turn, biomimetic, bio-inspired, and synthetic-biology applications based on physiology and biology can enrich the fluid mechanics and transport phenomena capabilities of the microgravity fluid physics community.
New contributions to physics by Prof. C. N. Yang: 2009-2011
NASA Astrophysics Data System (ADS)
Ma, Zhong-Qi
2016-01-01
In a seminal paper of 1967, Professor Chen Ning Yang found the full solution of the one-dimensional Fermi gas with a repulsive delta-function interaction by using the Bethe ansatz and group theory. This work, with its brilliant discovery of the Yang-Baxter equation, has been inspiring new developments in mathematical physics, statistical physics, and many-body physics. Prompted by experimental developments in simulating the many-body physics of one-dimensional systems of ultracold atoms, Prof. Yang published seven papers between 2009 and 2011 on the exact properties of the ground state of bosonic and fermionic atoms with a repulsive delta-function interaction, confined to one dimension by a potential. Here I would like to share my experience of doing research work, fortunately under the direct supervision of Prof. Yang, in that period.
Probabilistic structural analysis methods for space propulsion system components
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
The development of a three-dimensional inelastic analysis methodology for the Space Shuttle main engine (SSME) structural components is described. The methodology is composed of: (1) composite load spectra, (2) probabilistic structural analysis methods, (3) the probabilistic finite element theory, and (4) probabilistic structural analysis. The methodology has led to significant technical progress in several important aspects of probabilistic structural analysis. The program and accomplishments to date are summarized.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted, task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
Cruz Estrada, Flor de Maria; Tlatempa Sotelo, Patricia; Valdes-Ramos, Roxana; Hernández Murúa, José Aldo; Manjarrez-Montes-de-Oca, Rafael
2017-01-01
The objective was to determine whether weight, age, and gender influence physical fitness as evaluated with the EUROFIT and ALPHA-FITNESS batteries. This prospective, cross-sectional, correlational study used probabilistic sampling to assess 150 teenagers aged 15-17 from three different high schools in the city of Toluca, Mexico. Women had a higher overweight and obesity rate than men (3 : 1). Adolescents of normal weight had regular physical fitness (74.9%). Comparing genders, men had a higher mean than women on the tests, except for skinfold thickness and waist circumference. Age was correlated only with the plate tapping test (p = 0.001). There were significant differences in the standing broad jump test and the Course-Navette of the EUROFIT and ALPHA-FITNESS batteries (p = 0.000). It is likely that regular physical activity, and not normal weight, helps generate healthy physical fitness. Male subjects had a higher mean than women, reporting better physical fitness and more frequent physical activity.
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of plant/system and component health, and a deterministic assessment of the relation between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory, which allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module.
The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates the plant's position within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate the current and projected plant state space for different control decisions. The metrics evaluated are based on reactor trip set points. The integration of the deterministic calculations from multi-physics analyses with probabilistic safety calculations allows margin recovery strategies to be examined and quantified; the thermal-hydraulics analyses thereby also validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic the display panels of a modern nuclear power plant.
Focus on topological physics: from condensed matter to cold atoms and optics
NASA Astrophysics Data System (ADS)
Zhai, Hui; Rechtsman, Mikael; Lu, Yuan-Ming; Yang, Kun
2016-08-01
The notions of a topological phase and topological order were first introduced in the studies of integer and fractional quantum Hall effects, and further developed in the study of topological insulators and topological superconductors in the past decade. Topological concepts are now widely used in many branches of physics, not only limited to condensed matter systems but also in ultracold atomic systems, photonic materials and trapped ions. Papers published in this focus issue are direct testaments of that, and readers will gain a global view of how topology impacts different branches of contemporary physics. We hope that these pages will inspire new ideas through communication between different fields.
Bayesian networks and information theory for audio-visual perception modeling.
Besson, Patricia; Richiardi, Jonas; Bourdin, Christophe; Bringoux, Lionel; Mestre, Daniel R; Vercher, Jean-Louis
2010-09-01
Thanks to their different senses, human observers acquire multiple streams of information from their environment. Complex cross-modal interactions occur during this perceptual process. This article proposes a framework to analyze and model these interactions through a rigorous, systematic, data-driven process. This requires considering the general relationships between the physical events or factors involved in the process, not only in quantitative terms but also in terms of the influence of one factor on another. We use tools from information theory and probabilistic reasoning to derive relationships between the random variables of interest, where the central notion is that of conditional independence. Using mutual information analysis to guide the model elicitation process, a probabilistic causal model encoded as a Bayesian network is obtained. We exemplify the method using data collected in an audio-visual localization task for human subjects, and we show that it yields a well-motivated model with good predictive ability. The model elicitation process offers new prospects for the investigation of the cognitive mechanisms of multisensory perception.
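The mutual-information screening step this abstract describes can be illustrated with a small self-contained computation; the two 2x2 joint tables below are invented examples, not data from the study:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information (in bits) from a joint probability table P(X, Y)."""
    joint = joint / joint.sum()                 # normalize to a distribution
    px = joint.sum(axis=1, keepdims=True)       # marginal P(X)
    py = joint.sum(axis=0, keepdims=True)       # marginal P(Y)
    nz = joint > 0                              # skip zero cells (0 log 0 = 0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Perfectly dependent binary variables: MI equals the entropy (1 bit here).
dependent = np.array([[0.5, 0.0], [0.0, 0.5]])
independent = np.array([[0.25, 0.25], [0.25, 0.25]])
print(mutual_information(dependent))    # 1.0
print(mutual_information(independent))  # 0.0
```

Pairs with near-zero mutual information are candidates for conditional-independence assumptions, i.e. for omitting an edge in the Bayesian network.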
Ron, Gil; Globerson, Yuval; Moran, Dror; Kaplan, Tommy
2017-12-21
Proximity-ligation methods such as Hi-C allow us to map physical DNA-DNA interactions along the genome, and reveal its organization into topologically associating domains (TADs). As Hi-C data accumulate, computational methods have been developed for identifying domain borders in multiple cell types and organisms. Here, we present PSYCHIC, a computational approach for analyzing Hi-C data and identifying promoter-enhancer interactions. We use a unified probabilistic model to segment the genome into domains, which we then merge hierarchically and fit using a local background model, allowing us to identify over-represented DNA-DNA interactions across the genome. By analyzing the published Hi-C data sets in human and mouse, we identify hundreds of thousands of putative enhancers and their target genes, and compile an extensive genome-wide catalog of gene regulation in human and mouse. As we show, our predictions are highly enriched for ChIP-seq and DNA accessibility data, evolutionary conservation, eQTLs and other DNA-DNA interaction data.
Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?
NASA Astrophysics Data System (ADS)
Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.
2017-03-01
Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.
A Hough transform global probabilistic approach to multiple-subject diffusion MRI tractography.
Aganj, Iman; Lenglet, Christophe; Jahanshad, Neda; Yacoub, Essa; Harel, Noam; Thompson, Paul M; Sapiro, Guillermo
2011-08-01
A global probabilistic fiber tracking approach based on the voting process provided by the Hough transform is introduced in this work. The proposed framework tests candidate 3D curves in the volume, assigning to each one a score computed from the diffusion images, and then selects the curves with the highest scores as the potential anatomical connections. The algorithm avoids local minima by performing an exhaustive search at the desired resolution. The technique is easily extended to multiple subjects, considering a single representative volume where the registered high-angular resolution diffusion images (HARDI) from all the subjects are non-linearly combined, thereby obtaining population-representative tracts. The tractography algorithm is run only once for the multiple subjects, and no tract alignment is necessary. We present experimental results on HARDI volumes, ranging from simulated and 1.5T physical phantoms to 7T and 4T human brain and 7T monkey brain datasets. Copyright © 2011 Elsevier B.V. All rights reserved.
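The core of the Hough-style search described above, exhaustively scoring candidate curves against the diffusion volume and keeping the top scorers, can be sketched in a few lines. The random volume and the straight-line candidate family are toy stand-ins for real HARDI data and the smooth 3D curves used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
volume = rng.random((16, 16, 16))  # stand-in for per-voxel fiber evidence

def score(curve, vol):
    """Sum the evidence the volume assigns to each point of a 3D curve."""
    idx = np.clip(np.round(curve).astype(int), 0, np.array(vol.shape) - 1)
    return vol[idx[:, 0], idx[:, 1], idx[:, 2]].sum()

# Exhaustively test a small family of candidate straight-line "curves"
# (a toy search; the paper parametrizes smooth curves, not lines).
candidates = [np.stack([np.linspace(0, 15, 20),
                        np.full(20, y),
                        np.full(20, z)], axis=1)
              for y in range(16) for z in range(16)]
best = max(candidates, key=lambda c: score(c, volume))
```

The exhaustive vote over all candidates is what lets the method avoid the local minima that greedy, step-by-step tracking can fall into.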
Augmenting Probabilistic Risk Assessment with Malevolent Initiators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; David Schwieder
2011-11-01
As commonly practiced, probabilistic risk assessment (PRA) in nuclear power plants considers only accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA and are instead considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (the design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance the security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than considering only 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and their associated plant impacts.
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2017-06-01
The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models.
The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as it happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.
NASA Astrophysics Data System (ADS)
Goradia, Shantilal
2013-04-01
Century-old GR fails to unify with quantum physics or the nuclear force, or to distinguish the mass of living bodies from inert mass. Probabilistic gravity [1] explains strong coupling (the nuclear force). The natural log of the age of the universe, 10E60 in Planck times, equaling 137 (1/Alpha), extends physics to deeper science, if we stand on the shoulders of giants like Feynman and Gamow. The implication of [1] is that it is not the earth itself, but the M and S numbers of the particles of the earth, that remotely interact with the corresponding numbers of the particles of the moon and the sun, respectively, neglecting other heavenly bodies in this short draft. This new physics is likely to enable creative scientific minds to throw light on a theoretical basis for an otherwise arbitrary cosmological constant, the uniformity of the microwave background, further vindication of Boltzmann, quantum informatics, Einstein's later publicized views, and more, eliminating the need to spend money on implicitly nonexistent quantum gravity and gravitons. [1] Journal of Physical Science and Applications 2 (7) (2012) 265-268.
Cyr, André; Boukadoum, Mounir; Thériault, Frédéric
2014-01-01
In this paper, we investigate the operant conditioning (OC) learning process within a bio-inspired paradigm, using artificial spiking neural networks (ASNN) to act as robot brain controllers. In biological agents, OC results in behavioral changes learned from the consequences of previous actions, based on progressive prediction adjustment from rewarding or punishing signals. In a neurorobotics context, virtual and physical autonomous robots may benefit from a similar learning skill when facing unknown and unsupervised environments. In this work, we demonstrate that a simple invariant micro-circuit can sustain OC in multiple learning scenarios. The motivation for this new OC implementation model stems from the relatively complex alternatives that have been described in the computational literature and from recent advances in neurobiology. Our elementary kernel includes only a few crucial neurons and synaptic links, and originates from the integration of habituation and spike-timing dependent plasticity as learning rules. Using several tasks of incremental complexity, our results show that a minimal neural component set is sufficient to realize many OC procedures. Hence, with the proposed OC module, designing learning tasks with an ASNN and a bio-inspired robot context leads to simpler neural architectures for achieving complex behaviors. PMID:25120464
Biomimetic and bio-inspired uses of mollusc shells.
Morris, J P; Wang, Y; Backeljau, T; Chapelle, G
2016-06-01
Climate change and ocean acidification are likely to have a profound effect on marine molluscs, which are of great ecological and economic importance. One process particularly sensitive to climate change is the formation of biominerals in mollusc shells. Fundamental research is broadening our understanding of the biomineralization process, as well as providing more informed predictions on the effects of climate change on marine molluscs. Such studies are important in their own right, but their value also extends to applied sciences. Biominerals, organic/inorganic hybrid materials with many remarkable physical and chemical properties, have been studied for decades, and the possibilities for future improved use of such materials for society are widely recognised. This article highlights the potential use of our understanding of the shell biomineralization process in novel bio-inspired and biomimetic applications. It also highlights the potential for the valorisation of shells produced as a by-product of the aquaculture industry. Studying shells and the formation of biominerals will inspire novel functional hybrid materials. It may also provide sustainable, ecologically- and economically-viable solutions to some of the problems created by current human resource exploitation. Copyright © 2016 Elsevier B.V. All rights reserved.
Ghandeharioun, Asma; Azaria, Asaph; Taylor, Sara; Picard, Rosalind W
Previous research has shown that gratitude positively influences psychological wellbeing and physical health. Grateful people are reported to feel more optimistic and happy, to better mitigate aversive experiences, and to have stronger interpersonal bonds. Gratitude interventions have been shown to result in improved sleep, more frequent exercise and stronger cardiovascular and immune systems. These findings call for the development of technologies that would inspire gratitude. This paper presents a novel system designed toward this end. We leverage pervasive technologies to naturally embed inspiration to express gratitude in everyday life. Novel to this work, mobile sensor data is utilized to infer optimal moments for stimulating contextually relevant thankfulness and appreciation. Sporadic mood measurements are inventively obtained through the smartphone lock screen, investigating their interplay with grateful expressions. Both momentary thankful emotion and dispositional gratitude are measured. To evaluate our system, we ran two rounds of randomized control trials (RCT), including a pilot study (N = 15, 2 weeks) and a main study (N = 27, 5 weeks). Studies' participants were provided with a newly developed smartphone app through which they were asked to express gratitude; the app displayed inspirational content to only the intervention group, while measuring contextual cues for all users. In both rounds of the RCT, the intervention was associated with improved thankful behavior. Significant increase was observed in multiple facets of practicing gratitude in the intervention groups. The average frequency of practicing thankfulness increased by more than 120 %, comparing the baseline weeks with the intervention weeks of the main study. In contrast, the control group of the same study exhibited a decrease of 90 % in the frequency of thankful expressions. 
In the course of the study's 5 weeks, increases in dispositional gratitude and in psychological wellbeing were also apparent. Analyzing the relation between mood and gratitude expressions, our data suggest that practicing gratitude increases the probability of going up in terms of emotional valence and down in terms of emotional arousal. The influences of inspirational content and contextual cues on promoting thankful behavior were also analyzed: We present data suggesting that the more successful times for eliciting expressions of gratitude tend to be shortly after a social experience, shortly after location change, and shortly after physical activity. The results support our intervention as an impactful method to promote grateful affect and behavior. Moreover, they provide insights into design and evaluation of general behavioral intervention technologies.
Biological Inspiration for Agile Autonomous Air Vehicles
2007-11-01
rhythmically contract the thorax; the hind wings have become specialized as small body rotation sensors (halteres). Butterflies and moths have two pairs of...orthogonal pairs of power muscles that produce alternating dorso-ventral and longitudinal flexure of the thorax from rhythmic contractions similar to...other physical sciences lend themselves to somewhat reductionist approaches for both analysis and synthesis. Complex engineered systems are built from
ERIC Educational Resources Information Center
Reid, Greg; Jobling, Ian F.
2012-01-01
The London 2012 Summer Paralympics Mascot, "Mandeville", is named after the village of Stoke Mandeville, near Aylesbury in Buckinghamshire, England. This was where the inaugural Stoke Mandeville Games were held in 1948. On 28 July 1948, men and women who had been injured throughout the Second World War assembled in Stoke Mandeville for archery…
What Seams Do We Remove in Mobile-Assisted Seamless Learning? A Critical Review of the Literature
ERIC Educational Resources Information Center
Wong, Lung-Hsiang; Looi, Chee-Kit
2011-01-01
Seamless learning refers to the seamless integration of the learning experiences across various dimensions including formal and informal learning contexts, individual and social learning, and physical world and cyberspace. Inspired by the exposition by Chan et al. (2006) on the seamless learning model supported by the setting of one or more mobile…
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by ''self-organized criticality,'' a concept introduced to describe emergent complexity in many physical systems. In contrast to genetic algorithms, which operate on an entire ''gene pool'' of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called ''avalanches,'' ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
Combining local search with co-evolution in a remarkably simple way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.
2000-05-01
The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
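The selection rule the two abstracts above describe, rank elements by local "fitness" and replace the worst (or a power-law-chosen near-worst, in the τ-EO variant) with a random new value, can be sketched in a few lines. The toy problem here (an anti-ferromagnetic ring whose optimum is an alternating bit pattern), the τ value, and the step count are illustrative assumptions, not details from the papers:

```python
import random

def local_cost(state, i):
    """Number of ring neighbors sharing site i's value (0, 1, or 2)."""
    n = len(state)
    return sum(state[i] == state[(i + d) % n] for d in (-1, 1))

def extremal_optimization(n=20, steps=2000, tau=1.4, seed=0):
    """tau-EO on a toy anti-ferromagnetic ring: the global optimum is an
    alternating bit pattern with total cost 0."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n)]
    best = state[:]
    best_cost = sum(local_cost(state, i) for i in range(n))
    for _ in range(steps):
        # rank sites from worst (highest local cost) to best
        ranked = sorted(range(n), key=lambda i: -local_cost(state, i))
        # pick rank k with probability ~ (k+1)**(-tau): the tau-EO selection
        weights = [(k + 1) ** -tau for k in range(n)]
        k = rng.choices(range(n), weights=weights)[0]
        state[ranked[k]] ^= 1            # replace the unlucky element
        cost = sum(local_cost(state, i) for i in range(n))
        if cost < best_cost:             # track the best state visited
            best_cost, best = cost, state[:]
    return best_cost, best
```

Note there is no temperature schedule and no population: the single adjustable parameter is τ, and the "avalanches" are the bursts of cost increase the walk tolerates between improvements.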
The Inspiring Science Education project and the resources for HEP analysis by university students
NASA Astrophysics Data System (ADS)
Fassouliotis, Dimitris; Kourkoumelis, Christine; Vourakis, Stylianos
2016-11-01
The Inspiring Science Education outreach project has been running for more than two years, creating a large number of inquiry-based educational resources for high-school teachers and students. Its goal is the promotion of science education in schools through new methods built on inquiry-based education techniques, involving large consortia of European partners and the implementation of large-scale pilots in schools. Recent hands-on activities developing and testing these innovative applications are reviewed. In general, there is a lack of educational scenarios and laboratory courses earmarked for more advanced, namely university, students. At the University of Athens, for the last four years the HYPATIA on-line event analysis tool has been used as a lab course for fourth-year undergraduate physics students majoring in HEP. Up to now, the course was limited to visual inspection of a few tens of ATLAS events. Recently the course was enriched with additional analysis exercises involving large samples of events. Through a user-friendly interface, the students can analyse the samples and optimize the cut selection in order to search for new physics. The implementation of this analysis is described.
Vibro-Perception of Optical Bio-Inspired Fiber-Skin.
Li, Tao; Zhang, Sheng; Lu, Guo-Wei; Sunami, Yuta
2018-05-12
In this research, based on the principle of optical interferometry, Mach-Zehnder and Optical Phase-Locked Loop (OPLL) vibro-perception systems of bio-inspired fiber-skin are designed to mimic the tactile perception of human skin. The fiber-skin is made of an optical fiber embedded in a silicone elastomer. The optical fiber is an attractive sensor for tactile perception, offering high sensitivity and reliability as well as low cost and immunity to magnetic interference. The silicone elastomer serves as a substrate with high flexibility and biocompatibility, and the optical fiber core serves as the vibro-perception sensor to detect physical motions such as tapping and sliding. According to the experimental results, the designed optical fiber-skin detects tapping and sliding motions in both the Mach-Zehnder and OPLL vibro-perception systems. Under direct contact, the OPLL vibro-perception system shows better performance than the Mach-Zehnder system; under indirect contact, the Mach-Zehnder system is preferable. In summary, the fiber-skin is validated to have a light-touch character and excellent repeatability, making it highly suitable for skin-mimic sensing.
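The transduction chain behind such a sensor can be illustrated with a minimal model: a tap strains the embedded fiber, modulating the optical phase difference between the sensing and reference arms, and the interferometer converts that phase modulation into an intensity signal. The signal frequency, phase amplitude, visibility, and sampling rate below are hypothetical values chosen only for illustration, not measurements from the paper:

```python
import math

def mz_intensity(delta_phi, i0=1.0, visibility=1.0):
    """Detected intensity at one Mach-Zehnder output port for a phase
    difference delta_phi between the sensing and reference arms."""
    return 0.5 * i0 * (1.0 + visibility * math.cos(delta_phi))

def vibration_trace(freq_hz=100.0, phase_amp=0.8, n=1000, fs=10_000.0):
    """Sinusoidal vibration of the fiber-skin modulates the optical path
    length; return the resulting intensity samples at rate fs."""
    return [mz_intensity(phase_amp * math.sin(2 * math.pi * freq_hz * t / fs))
            for t in range(n)]
```

The intensity swings between 0.5·(1+cos(phase_amp)) and 1.0, so a photodetector plus bandpass filter recovers the tapping frequency directly.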
Probabilistic classifiers with high-dimensional data
Kim, Kyung In; Simon, Richard
2011-01-01
For medical classification problems, it is often desirable to have a probability associated with each class. Probabilistic classifiers have received relatively little attention for small-n, large-p classification problems despite their importance in medical decision making. In this paper, we introduce two criteria for the assessment of probabilistic classifiers, well-calibratedness and refinement, and develop corresponding evaluation measures. We evaluated several published high-dimensional probabilistic classifiers and developed two extensions of the Bayesian compound covariate classifier. Based on simulation studies and analysis of gene expression microarray data, we found that proper probabilistic classification is more difficult than deterministic classification. It is important to ensure that a probabilistic classifier is well calibrated, or at least not “anticonservative,” using the methods developed here. We provide this evaluation for several probabilistic classifiers and also evaluate their refinement as a function of sample size under weak and strong signal conditions. We also present a cross-validation method for evaluating the calibration and refinement of any probabilistic classifier on any data set. PMID:21087946
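The well-calibratedness criterion can be checked with a simple reliability table: bin the predicted class-1 probabilities and compare each bin's mean prediction to the observed class frequency. This is a generic sketch of the idea, not the specific evaluation measure developed in the paper:

```python
def calibration_table(probs, labels, n_bins=5):
    """Group predicted class-1 probabilities into equal-width bins and
    compare the mean prediction to the observed class-1 frequency per bin.
    A well-calibrated classifier gives mean_pred ~= obs_freq in each bin;
    mean_pred > obs_freq everywhere would signal anticonservatism."""
    bins = [[] for _ in range(n_bins)]
    for p, y in zip(probs, labels):
        k = min(int(p * n_bins), n_bins - 1)   # clamp p == 1.0 into last bin
        bins[k].append((p, y))
    table = []
    for k, members in enumerate(bins):
        if not members:
            continue
        mean_pred = sum(p for p, _ in members) / len(members)
        obs_freq = sum(y for _, y in members) / len(members)
        table.append((k, len(members), mean_pred, obs_freq))
    return table
```

Refinement, by contrast, rewards predictions concentrated near 0 and 1; the same binned table shows it as most of the sample mass falling in the extreme bins.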
2014-06-01
Even well-controlled laboratory testing saw a range of COV from less than 10 percent to over 500 percent for different steels (Tryon & Cruse...identified. This 6-year limit is driven by concerns about corrosion fatigue, a process initiated by water gaining access to the carbon steel of the shaft...physics and is controlled by different parameters and interactions of the many variables involved. Figure 2 depicts the corrosion fatigue sequence of
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, computations are interpreted as the outcome of interactions among elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among subsets of elements, so that the elements evolve toward an equilibrium, unstable, or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic, and it can handle a probabilistic mode of computation when we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.
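The interacting-elements model described above is in the spirit of Gamma-style multiset rewriting: elements collide at random, a rule produces or annihilates elements, and the pool settles into an equilibrium that is the result of the computation. The sketch below uses a maximum-finding rule as an illustrative example (the rule, function names, and scheduler are assumptions, not the paper's formalism); note the answer is order-independent even though interactions fire in random order:

```python
import random

def react(pool, rule, rng):
    """Pick two random elements and let them interact; the rule returns
    the multiset of products (it may create or annihilate elements)."""
    i, j = rng.sample(range(len(pool)), 2)
    a, b = pool[i], pool[j]
    for k in sorted((i, j), reverse=True):   # remove the reactants
        pool.pop(k)
    pool.extend(rule(a, b))

def gamma_max(elements, seed=0):
    """Gamma-style computation of the maximum: the rule annihilates the
    smaller of any interacting pair, so the object space evolves toward
    a single-survivor equilibrium."""
    rng = random.Random(seed)
    pool = list(elements)
    rule = lambda a, b: [max(a, b)]
    while len(pool) > 1:
        react(pool, rule, rng)
    return pool[0]
```

Because the rule is commutative and associative, any probabilistic interleaving of reactions reaches the same fixed point, which is exactly the invariant-preserving evolution the abstract describes.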
NASA Astrophysics Data System (ADS)
Frisch, Uriel
1996-01-01
Written five centuries after the first studies of Leonardo da Vinci and half a century after A.N. Kolmogorov's first attempt to predict the properties of flow, this textbook presents a modern account of turbulence, one of the greatest challenges in physics. "Fully developed turbulence" is ubiquitous in both cosmic and natural environments, in engineering applications and in everyday life. Elementary presentations of dynamical systems ideas, probabilistic methods (including the theory of large deviations) and fractal geometry make this a self-contained textbook. This is the first book on turbulence to use modern ideas from chaos and symmetry breaking. The book will appeal to first-year graduate students in mathematics, physics, astrophysics, geosciences and engineering, as well as professional scientists and engineers.
A Proposed Probabilistic Extension of the Halpern and Pearl Definition of ‘Actual Cause’
2017-01-01
ABSTRACT Joseph Halpern and Judea Pearl ([2005]) draw upon structural equation models to develop an attractive analysis of ‘actual cause’. Their analysis is designed for the case of deterministic causation. I show that their account can be naturally extended to provide an elegant treatment of probabilistic causation. 1 Introduction 2 Preemption 3 Structural Equation Models 4 The Halpern and Pearl Definition of ‘Actual Cause’ 5 Preemption Again 6 The Probabilistic Case 7 Probabilistic Causal Models 8 A Proposed Probabilistic Extension of Halpern and Pearl’s Definition 9 Twardy and Korb’s Account 10 Probabilistic Fizzling 11 Conclusion PMID:29593362
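The structural-equation machinery behind the Halpern and Pearl definition can be sketched on the standard preemption example (Suzy's rock preempting Billy's). The model and the witness-style counterfactual check below are a generic illustration of the deterministic setup, not the probabilistic extension the paper proposes:

```python
def evaluate(equations, context, interventions=None):
    """Evaluate a recursive structural equation model to a fixed point.
    `equations` maps endogenous variables to functions of the value dict,
    `context` sets the exogenous variables, and `interventions` implements
    the do-operator by clamping variables regardless of their equations."""
    interventions = dict(interventions or {})
    values = dict(context)
    values.update(interventions)
    changed = True
    while changed:
        changed = False
        for var, fn in equations.items():
            if var in interventions:
                continue
            v = fn(values)
            if values.get(var) != v:
                values[var] = v
                changed = True
    return values

# Rock-throwing preemption: Suzy's rock arrives first, so Billy's never hits.
eqs = {
    "SH": lambda v: int(v.get("ST", 0)),                         # Suzy hits iff she throws
    "BH": lambda v: int(v.get("BT", 0) and not v.get("SH", 0)),  # Billy hits only if Suzy missed
    "BS": lambda v: int(v.get("SH", 0) or v.get("BH", 0)),       # bottle shatters
}
actual = evaluate(eqs, {"ST": 1, "BT": 1})
# Plain counterfactual: the bottle still shatters without Suzy's throw ...
no_suzy = evaluate(eqs, {"ST": 0, "BT": 1})
# ... but freezing BH at its actual value (0) exposes the dependence on ST.
witness = evaluate(eqs, {"ST": 0, "BT": 1}, {"BH": actual["BH"]})
```

The clamped variable BH plays the role of the HP "witness" set: counterfactual dependence of BS on ST reappears once the preempted path is held at its actual value.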
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
The fourth year of technical developments on the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) system for Probabilistic Structural Analysis Methods is summarized. The effort focused on the continued expansion of the Probabilistic Finite Element Method (PFEM) code, the implementation of the Probabilistic Boundary Element Method (PBEM), and the implementation of the Probabilistic Approximate Methods (PAppM) code. The principal focus for the PFEM code is the addition of a multilevel structural dynamics capability. The strategy includes probabilistic loads, treatment of material, geometry uncertainty, and full probabilistic variables. Enhancements are included for the Fast Probability Integration (FPI) algorithms and the addition of Monte Carlo simulation as an alternate. Work on the expert system and boundary element developments continues. The enhanced capability in the computer codes is validated by applications to a turbine blade and to an oxidizer duct.
NASA Astrophysics Data System (ADS)
Wanchoo, S. K.; Vaishnavi, S.
2016-12-01
INSPIRE Science Internship Camps at SMVDU: a forum for engaging higher-secondary students and promoting the innovative pursuit of science. Samantha Vaishnavi (Department of Biotechnology, SMVD University, Katra, India) and Sunil K. Wanchoo* (Department of Physics, SMVD University, Katra, India). Abstract: India recognizes that, to become a knowledge economy and take advantage of its demographic dividend in the years to come, young bright minds need to be attracted to science and motivated to take up challenging research problems. In 2008 the union government launched INSPIRE (Innovation in Science Pursuit for Inspired Research), an initiative of the Department of Science & Technology, Government of India. Under the INSPIRE scheme, as part of the subcomponent called the Scheme for Early Attraction of Talent (SEATS), the aim is to attract children to science by inviting 50,000 students, the top 1% merit rank holders in the 10th-board examinations who are pursuing science in class 11, to camps organized annually by higher-education institutions throughout the country. During these camps, students interact with mentors, eminent scientists and gifted teachers, to experience the joy of innovation. As part of this initiative, we have organized 10 fully residential INSPIRE Science Internship Camps since April 2010, attended by 3,590 students from all over the State of Jammu and Kashmir. Two of these camps have been successfully held in the difficult and challenging terrain of Leh, Ladakh, with the help of the state administration and the Army. In July 2015, we had the distinction of organizing the largest camp in India, attended by 620 students. The 11th such camp, and the third at Leh, will be held in the second week of September 2016. We shall present the detailed approach adopted in organizing these camps and the feedback received from the attendees and some of the mentors. *Corresponding author (sunilkwanchoo@gmail.com)
Experimental Verification of Fully Decentralized Control Inspired by Plasmodium of True Slime Mold
NASA Astrophysics Data System (ADS)
Umedachi, Takuya; Takeda, Koichi; Nakagaki, Toshiyuki; Kobayashi, Ryo; Ishiguro, Akio
This paper presents a fully decentralized control inspired by plasmodium of true slime mold and its validity using a soft-bodied amoeboid robot. The notable features of this paper are twofold: (1) the robot has truly soft and deformable body stemming from real-time tunable springs and a balloon, the former is utilized as an outer skin of the body and the latter serves as protoplasm; and (2) a fully decentralized control using coupled oscillators with completely local sensory feedback mechanism is realized by exploiting the long-distance physical interaction between the body parts induced by the law of conservation of protoplasmic mass. Experimental results show that this robot exhibits truly supple locomotion without relying on any hierarchical structure. The results obtained are expected to shed new light on design scheme for autonomous decentralized control system.
MSSM-inspired multifield inflation
NASA Astrophysics Data System (ADS)
Dubinin, M. N.; Petrova, E. Yu.; Pozdeeva, E. O.; Sumin, M. V.; Vernov, S. Yu.
2017-12-01
Although experimentally only a single Standard Model-like Higgs boson has been discovered at the LHC with a high degree of statistical significance, extended Higgs sectors with multiple scalar fields, not excluded by combined fits of the data, are theoretically preferable for internally consistent, realistic models of particle physics. We analyze the inflationary scenarios which could be induced by the two-Higgs-doublet potential of the Minimal Supersymmetric Standard Model (MSSM), where five scalar fields have non-minimal couplings to gravity. Observables following from such MSSM-inspired multifield inflation are calculated, and a number of consistent inflationary scenarios are constructed. Cosmological evolution with different initial conditions for the multifield system leads to consequences fully compatible with observational data on the spectral index and the tensor-to-scalar ratio. It is demonstrated that the strong coupling approximation is precise enough to describe such inflationary scenarios.
Nature inspires sensors to do more with less.
Mulvaney, Shawn P; Sheehan, Paul E
2014-10-28
The world is filled with widely varying chemical, physical, and biological stimuli. Over millennia, organisms have refined their senses to cope with these diverse stimuli, becoming virtuosos in differentiating closely related antigens, handling extremes in concentration, resetting the spent sensing mechanisms, and processing the multiple data streams being generated. Nature successfully deals with both repeating and new stimuli, demonstrating great adaptability when confronted with the latter. Interestingly, nature accomplishes these feats using a fairly simple toolbox. The sensors community continues to draw inspiration from nature's example: just look at the antibodies used as biosensor capture agents or the neural networks that process multivariate data streams. Indeed, many successful sensors have been built by simply mimicking natural systems. However, some of the most exciting breakthroughs occur when the community moves beyond mimicking nature and learns to use nature's tools in innovative ways.
A grounded theory of how social support influences physical activity in adolescent girls
Fawkner, Samantha
2018-01-01
ABSTRACT Purpose: Adolescent girls are not sufficiently active to achieve health benefits. Social support from friends and family has been positively associated with physical activity in adolescent girls; however, it is unclear how social support influences physical activity behaviour. This study aimed to develop a grounded theory of how social support influences physical activity in adolescent girls. Methods: A qualitative, constructivist grounded theory approach was adopted. Individual interviews explored adolescent girls’ perspectives of how significant others influenced their physical activity through providing social support, and through modelling physical activity. Results: Participants perceived social support to influence physical activity behaviour through performance improvements, self-efficacy, enjoyment, motivation and by enabling physical activity. Improvements in performance and self-efficacy were also linked to motivation to be active. Girls perceived modelling to influence behaviour through providing opportunities for them to be physically active, and by inspiring them to be active. Conclusion: The grounded theory outlines adolescent girls’ perceptions of how significant others influence their physical activity and provides a framework for future research examining the role of social support on physical activity. PMID:29405881
A grounded theory of how social support influences physical activity in adolescent girls.
Laird, Yvonne; Fawkner, Samantha; Niven, Ailsa
2018-12-01
Adolescent girls are not sufficiently active to achieve health benefits. Social support from friends and family has been positively associated with physical activity in adolescent girls; however, it is unclear how social support influences physical activity behaviour. This study aimed to develop a grounded theory of how social support influences physical activity in adolescent girls. A qualitative, constructivist grounded theory approach was adopted. Individual interviews explored adolescent girls' perspectives of how significant others influenced their physical activity through providing social support, and through modelling physical activity. Participants perceived social support to influence physical activity behaviour through performance improvements, self-efficacy, enjoyment, motivation and by enabling physical activity. Improvements in performance and self-efficacy were also linked to motivation to be active. Girls perceived modelling to influence behaviour through providing opportunities for them to be physically active, and by inspiring them to be active. The grounded theory outlines adolescent girls' perceptions of how significant others influence their physical activity and provides a framework for future research examining the role of social support on physical activity.
Ants, eyelashes, and the 2015 Ig Nobel Prize in Physics
NASA Astrophysics Data System (ADS)
Hu, David
2016-11-01
The zoo can be a source of recreation and rich scientific investigation. In this lecture, I will give an overview of my recent research with animals at the Atlanta Zoo. We will talk about how to make ant hamburgers, how eyelashes reduce evaporation of your eyes by a factor of two, and why mammals urinate for the same duration of 21 seconds. Although animal-inspired research can sound trendy, it can lead the way toward potential future directions in fluid mechanics, including the dynamics of active materials, flow through hairy surfaces, and the physics of digestion and excretion.
Active chainmail fabrics for soft robotic applications
NASA Astrophysics Data System (ADS)
Ransley, Mark; Smitham, Peter; Miodownik, Mark
2017-08-01
This paper introduces a novel type of smart textile with electronically responsive flexibility. The chainmail-inspired fabric is modelled parametrically and simulated via a rigid-body physics framework with an embedded model of temperature-controlled actuation. Our model assumes that individual fabric linkages are rigid and deform only through their own actuation, thereby decoupling flexibility from stiffness. A physical prototype of the active fabric is constructed and it is shown that flexibility can be significantly controlled through actuator strains of ≤10%. Applications of these materials to soft robotics, such as dynamically reconfigurable orthoses and splints, are discussed.
Nunes, Ana Paula de Oliveira Barbosa; Luiz, Olinda do Carmo; Barros, Marilisa Berti Azevedo; Cesar, Chester Luis Galvão; Goldbaum, Moisés
2015-08-01
This study aimed to estimate the prevalence of physical activity in different domains and its association with schooling, using a serial cross-sectional population-based design comparing data from two editions of a health survey in the city of São Paulo, Brazil. Participants included 1,667 adults in 2003 and 2,086 in 2008. Probabilistic sampling was performed with two-stage clusters. The long version of the International Physical Activity Questionnaire (IPAQ) allowed evaluating multiple domains of physical activity. Poisson regression was used. Men were more active in their leisure time and at work, and women in the home. Schooling was associated directly with leisure-time activity (2003 and 2008) and inversely with work-related physical activity for men (2003) and with housework for women. The studies showed that Brazilians with less schooling are becoming less active, so intervention strategies should consider different educational levels. Interventions in urban space and transportation can increase the opportunities for physical activity and broaden access by the population.
Is probabilistic bias analysis approximately Bayesian?
MacLehose, Richard F.; Gustafson, Paul
2011-01-01
Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
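The iterative sampling method described above can be sketched for a single 2×2 case-control table: draw sensitivity and specificity from prior distributions, back-correct the cell counts, and summarize the distribution of corrected odds ratios. The uniform priors and cell counts below are illustrative assumptions, and this sketch deliberately omits the random-error component that a full probabilistic bias analysis would add on top of the systematic correction:

```python
import random

def bias_analysis(a, b, c, d, n_iter=5000, seed=0):
    """Probabilistic bias analysis for nondifferential exposure
    misclassification: a, b = exposed/unexposed cases; c, d = exposed/
    unexposed controls.  Each draw of (sensitivity, specificity) yields
    a back-corrected table and its odds ratio."""
    rng = random.Random(seed)
    ors = []
    for _ in range(n_iter):
        se = rng.uniform(0.80, 0.95)     # illustrative prior on sensitivity
        sp = rng.uniform(0.92, 0.99)     # illustrative prior on specificity
        a1 = (a - (1 - sp) * (a + b)) / (se + sp - 1)
        c1 = (c - (1 - sp) * (c + d)) / (se + sp - 1)
        b1, d1 = (a + b) - a1, (c + d) - c1
        if min(a1, b1, c1, d1) <= 0:
            continue                     # this draw implies an impossible table
        ors.append((a1 * d1) / (b1 * c1))
    ors.sort()
    return (ors[len(ors) // 2],          # median corrected OR
            ors[int(0.025 * len(ors))],  # 2.5th percentile
            ors[int(0.975 * len(ors))])  # 97.5th percentile
```

A fully Bayesian treatment would instead place priors on the misclassification parameters and condition on the observed table; the paper's point is that the two procedures resemble each other but are not identical.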
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
NASA Technical Reports Server (NTRS)
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified that are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
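The kind of geometric-perturbation study described for the flat plate can be sketched with a Monte Carlo loop: sample thickness and length about their means and propagate each sample through a simple beam-theory estimate of the first bending frequency. The dimensions, material constants, and tolerance levels below are illustrative assumptions, not values from the study, and the closed-form frequency stands in for the finite element response:

```python
import math
import random

def plate_frequency(h, L, E=200e9, rho=7800.0):
    """First bending frequency (Hz) of a cantilevered flat-plate strip of
    thickness h and length L (unit width), from Euler-Bernoulli theory."""
    I = h ** 3 / 12.0          # second moment of area per unit width
    A = h                      # cross-sectional area per unit width
    lam = 1.8751               # first cantilever eigenvalue
    return (lam ** 2 / (2 * math.pi)) * math.sqrt(E * I / (rho * A * L ** 4))

def perturbation_study(n=2000, seed=0):
    """Monte Carlo propagation of geometric tolerances: thickness (1% COV)
    and length (0.5% COV) vary about their means, and the spread of the
    response shows how geometry variance drives frequency variance."""
    rng = random.Random(seed)
    freqs = [plate_frequency(rng.gauss(2e-3, 2e-5), rng.gauss(0.1, 5e-4))
             for _ in range(n)]
    mean = sum(freqs) / n
    var = sum((f - mean) ** 2 for f in freqs) / (n - 1)
    return mean, math.sqrt(var) / mean   # mean frequency and its COV
```

Because f scales as h/L² here, the output COV is roughly sqrt(COV_h² + (2·COV_L)²), which is the kind of variance sensitivity the abstract reports as significant.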
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of the RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
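Step (4), testing parameter distributions for probabilistic analysis, typically relies on stratified sampling of the input distributions. A minimal Latin hypercube sampler is sketched below; the parameter names and inverse-CDF distributions you would plug in are hypothetical stand-ins, not actual RESRAD parameters:

```python
import random

def latin_hypercube(dists, n, seed=0):
    """Latin hypercube sample: for each parameter (given by its inverse
    CDF), draw exactly one point per equiprobable stratum, with the
    strata shuffled independently across parameters."""
    rng = random.Random(seed)
    samples = {name: [] for name in dists}
    for name, inv_cdf in dists.items():
        strata = list(range(n))
        rng.shuffle(strata)
        for k in strata:
            u = (k + rng.random()) / n   # uniform point inside stratum k
            samples[name].append(inv_cdf(u))
    return [{name: samples[name][i] for name in dists} for i in range(n)]
```

Each sampled dictionary would then drive one deterministic code run, and the resulting dose distribution summarizes the input uncertainty; the stratification guarantees the tails of every input distribution are represented even for modest n.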
Probabilistic structural analysis methods for select space propulsion system components
NASA Technical Reports Server (NTRS)
Millwater, H. R.; Cruse, T. A.
1989-01-01
The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
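The role of Monte Carlo simulation as a brute-force alternative to a fast probability integrator can be shown on the textbook g = R − S limit state, where the exact failure probability is available for comparison. The means and standard deviations below are illustrative, and this sketch is not the NESSUS/FPI algorithm itself:

```python
import math
import random

def failure_probability(n=200_000, seed=0):
    """Monte Carlo estimate of P(g < 0) for the limit state g = R - S
    with normal strength R and load S.  For a linear g in normal
    variables the exact answer is Phi(-beta), beta being the
    reliability index (muR - muS) / sqrt(sdR^2 + sdS^2)."""
    rng = random.Random(seed)
    muR, sdR, muS, sdS = 100.0, 10.0, 60.0, 15.0
    fails = sum(rng.gauss(muR, sdR) - rng.gauss(muS, sdS) < 0
                for _ in range(n))
    beta = (muR - muS) / math.hypot(sdR, sdS)
    exact = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)
    return fails / n, exact
```

The attraction of an FPI-style method is that it reaches the same answer from a handful of limit-state evaluations rather than hundreds of thousands, which matters when each evaluation is a nonlinear finite element run.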
The Joy of Teaching and Writing Conceptual Physics
NASA Astrophysics Data System (ADS)
Hewitt, Paul G.
2011-10-01
When I began teaching at City College of San Francisco in 1964, I fell in love with a 1960 textbook that addressed non-science students, Physics for the Inquiring Mind, written by British-born physicist Eric M. Rogers, who taught physics at Princeton University and later won the 1969 Oersted Medal of the American Association of Physics Teachers. His book was as inspirational as he was and remains a favorite of mine. My request to adopt that book for my physics class was rejected by my department chair, Art Austin, who claimed it was much too bulky and heavy for students to have to haul around. It weighed more than five pounds, with a trim size, 8 × 11 inches, that was huge for that time. To further justify the rejection, he also pointed to topics he considered important that were not covered in the book. I would have loved teaching from the Rogers book, but such was not to be.
How can we help students appreciate physics education?
NASA Astrophysics Data System (ADS)
Lin, Jia-Ling; Zaki, Eman; Schmidt, Jason; Woolston, Don
2004-03-01
Helping students appreciate physics education is a formidable task, considering that many students struggle to pass introductory physics courses. Numerous efforts have been made for this undertaking because it is an important step leading to successful learning. In an out-of-classroom academic program, the Supplemental Instruction (SI) Program, we have used the approach, INSPIRE (inquiry, network, skillfulness, perseverance, intuition, reasoning, and effort), to help more students value their experiences in these courses. The method basically includes key elements outlined by experts in physics education [1]. Student responses have been encouraging. Having undergraduates as facilitators in the program is advantageous in promoting principles of physics education. Their training emphasizes tenacity, resourcefulness, understanding, support, and teamwork, i.e. TRUST. We present the organization and focus of the SI Program, and discuss how these improve learning atmosphere and facilitate learning. [1] Edward F. Redish et al, Am J. Phys. 66(3), March 1998.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pitts, Kevin T.
This document is the program for the 6th Annual Midwest Conference for Undergraduate Women in Physics, which was held at the University of Illinois at Urbana-Champaign on January 18-20, 2013. The goals of the conference were to foster a culture in which undergraduate women are encouraged and supported to pursue, and also to succeed in, higher education in physics; to provide career information to students in physics and related fields; to give women the resources, motivation, and confidence to apply to graduate school and successfully complete a Ph.D. program in Physics; to provide information and dispel misconceptions about the application process for graduate school and the diverse employment opportunities in physics and related fields, enabling women to make more informed decisions about their goals and attain them; and to connect female physics students with successful female physicists to whom they can relate and who can act as inspirational role models and mentors.
Frontal and Parietal Contributions to Probabilistic Association Learning
Rushby, Jacqueline A.; Vercammen, Ans; Loo, Colleen; Short, Brooke
2011-01-01
Neuroimaging studies have shown both dorsolateral prefrontal (DLPFC) and inferior parietal cortex (iPARC) activation during probabilistic association learning. Whether these cortical brain regions are necessary for probabilistic association learning is presently unknown. Participants' ability to acquire probabilistic associations was assessed during disruptive 1 Hz repetitive transcranial magnetic stimulation (rTMS) of the left DLPFC, left iPARC, and sham using a crossover single-blind design. On subsequent sessions, performance improved relative to baseline except during DLPFC rTMS that disrupted the early acquisition beneficial effect of prior exposure. A second experiment examining rTMS effects on task-naive participants showed that neither DLPFC rTMS nor sham influenced naive acquisition of probabilistic associations. A third experiment examining consecutive administration of the probabilistic association learning test revealed early trial interference from previous exposure to different probability schedules. These experiments, showing disrupted acquisition of probabilistic associations by rTMS only during subsequent sessions with an intervening night's sleep, suggest that the DLPFC may facilitate early access to learned strategies or prior task-related memories via consolidation. Although neuroimaging studies implicate DLPFC and iPARC in probabilistic association learning, the present findings suggest that early acquisition of the probabilistic cue-outcome associations in task-naive participants is not dependent on either region. PMID:21216842
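A minimal simulation makes the probabilistic cue-outcome task concrete: one cue appears per trial, the outcome follows that cue's own probability schedule, and a simple delta-rule learner tracks each cue's predictiveness. The cue probabilities, learning rate, and trial counts below are generic assumptions, not the task parameters used in the study:

```python
import random

def learn_associations(n_trials=800, lr=0.1, seed=0):
    """Delta-rule learner for probabilistic cue-outcome associations: on
    each trial one of four cues appears and the outcome follows that
    cue's own probability.  Each weight is an exponential running
    estimate of its cue's outcome probability, so responding 'outcome'
    whenever w > 0.5 approaches the optimal strategy."""
    rng = random.Random(seed)
    cue_prob = [0.8, 0.6, 0.4, 0.2]   # P(outcome | cue): the hidden schedule
    w = [0.5] * 4                     # start at indifference
    n_correct = 0
    for _ in range(n_trials):
        cue = rng.randrange(4)
        outcome = 1 if rng.random() < cue_prob[cue] else 0
        choice = 1 if w[cue] > 0.5 else 0
        n_correct += (choice == outcome)
        w[cue] += lr * (outcome - w[cue])   # delta-rule update
    return n_correct / n_trials, w
```

Running the same learner twice with different schedules reproduces, in miniature, the interference-from-prior-exposure effect the third experiment examines: weights carried over from the first schedule mislead early trials of the second.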
Exploration of Advanced Probabilistic and Stochastic Design Methods
NASA Technical Reports Server (NTRS)
Mavris, Dimitri N.
2003-01-01
The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the areas of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on comparing and contrasting the three methods researched: the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step.
For both research tasks, sample files and tutorials are attached in electronic form with the enclosed CD.
Sequential Data Assimilation for Seismicity: a Proof of Concept
NASA Astrophysics Data System (ADS)
van Dinther, Y.; Fichtner, A.; Kuensch, H. R.
2015-12-01
Our physical understanding and probabilistic forecasting ability of earthquakes is significantly hampered by limited indications of the state of stress and strength on faults and their governing parameters. Using the sequential data assimilation framework developed in meteorology and oceanography (e.g., Evensen, JGR, 1994) and a seismic cycle forward model based on Navier-Stokes Partial Differential Equations (van Dinther et al., JGR, 2013), we show that such information with its uncertainties is within reach, at least for laboratory setups. We aim to provide the first, thorough proof of concept for seismicity related PDE applications via a perfect model test of seismic cycles in a simplified wedge-like subduction setup. By evaluating the performance with respect to known numerical input and output, we aim to answer whether there is any probabilistic forecast value for this laboratory-like setup, which and how many parameters can be constrained, and how much data in both space and time would be needed to do so. Thus far our implementation of an Ensemble Kalman Filter has demonstrated that probabilistic estimates of both the state of stress and strength on a megathrust fault can be obtained and utilized even when assimilating surface velocity data at a single point in time and space. An ensemble-based error covariance matrix containing velocities, stresses and pressure links surface velocity observations to fault stresses and strengths well enough to update fault coupling accordingly. Depending on what synthetic data show, coseismic events can then be triggered or inhibited.
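The single-point assimilation described here can be illustrated with a toy stochastic Ensemble Kalman Filter update, in which an ensemble covariance links an observed surface velocity to unobserved fault stress and strength. This is a minimal sketch with invented numbers, not the authors' seismic cycle model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: surface velocity is observed; fault stress and strength are not.
# Stress is built to correlate with velocity so the covariance can carry the update.
N = 200
vel = rng.normal(1.0, 0.2, N)                          # surface velocity
stress = 40.0 + 10.0 * vel + rng.normal(0.0, 1.0, N)   # fault stress (correlated)
strength = stress + rng.normal(5.0, 1.0, N)            # fault strength
ens = np.column_stack([vel, stress, strength])         # (N, 3) state ensemble

obs, obs_var = 1.3, 0.05 ** 2                          # one velocity datum
H = np.array([1.0, 0.0, 0.0])                          # observe first component only

A = ens - ens.mean(axis=0)                             # ensemble anomalies
P = A.T @ A / (N - 1)                                  # sample error covariance

K = P @ H / (H @ P @ H + obs_var)                      # Kalman gain, shape (3,)

# Perturbed-observation (stochastic) EnKF analysis step
perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), N)
ens = ens + np.outer(perturbed - ens @ H, K)

print("posterior means:", ens.mean(axis=0).round(2))
```

The point mirrored from the abstract is that the unobserved fault variables shift toward values consistent with the velocity datum purely through the ensemble covariance.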
Advanced Concepts in Quantum Mechanics
NASA Astrophysics Data System (ADS)
Esposito, Giampiero; Marmo, Giuseppe; Miele, Gennaro; Sudarshan, George
2014-11-01
Preface; 1. Introduction: the need for a quantum theory; 2. Experimental foundations of quantum theory; 3. Waves and particles; 4. Schrödinger picture, Heisenberg picture and probabilistic aspects; 5. Integrating the equations of motion; 6. Elementary applications: 1-dimensional problems; 7. Elementary applications: multidimensional problems; 8. Coherent states and related formalism; 9. Introduction to spin; 10. Symmetries in quantum mechanics; 11. Approximation methods; 12. Modern pictures of quantum mechanics; 13. Formulations of quantum mechanics and their physical implications; 14. Exam problems; Glossary of geometric concepts; References; Index.
2015-10-02
ratio or physical layout than the training sample, or new vs. old bananas. For our system, this is similar to the multimodal case mentioned above; however...different modes. Foods with multiple "types" such as green, yellow, and brown bananas are seamlessly handled as well. Secondly, with hundreds or thousands...Recognition and Classification of Food Grains, Fruits and Flowers Using Machine Vision. INTERNATIONAL JOURNAL OF FOOD ENGINEERING, 5(4), 2009. [155] T. E
ERIC Educational Resources Information Center
Thompson, Michael; Tsui, Stella; Leung, Chi Fan
2011-01-01
A sweet spot is referred to in sport as the perfect place to strike a ball with a racquet or bat. It is the point of contact between bat and ball where maximum results can be produced with minimal effort from the hand of the player. Similar physics can be applied to the less inspiring examples of door stops; the perfect position of a door stop is…
Rank the Voltage across Light Bulbs … Then Set up the Live Experiment
ERIC Educational Resources Information Center
Jacobs, Greg C.
2018-01-01
The Tasks Inspired by Physics Education Research (TIPERS) workbooks pose questions in styles quite different from the end-of-chapter problems that those of us of a certain age were assigned back in the days before Netscape. My own spin on TIPERS is not just to do them on paper, but to have students set up the situations in the laboratory to…
Biology-Inspired Autonomous Control
2011-08-31
from load sensing in a turbulent flow field with high levels of plant uncertainty and optical feedback latency. The results of this paper suggest... Mimicry of biological systems, in the form of precise mathematical or physical dynamical modeling, is yielding impressive insight into the underlying...processing and plants , the aerospace industry has been slow to accept adaptive control. In the past decade however, newer methods for design of adaptive
Statistical physics of community ecology: a cavity solution to MacArthur’s consumer resource model
NASA Astrophysics Data System (ADS)
Advani, Madhu; Bunin, Guy; Mehta, Pankaj
2018-03-01
A central question in ecology is to understand the ecological processes that shape community structure. Niche-based theories have emphasized the important role played by competition for maintaining species diversity. Many of these insights have been derived using MacArthur’s consumer resource model (MCRM) or its generalizations. Most theoretical work on the MCRM has focused on small ecosystems with a few species and resources. However, theoretical insights derived from small ecosystems may not scale up to large ecosystems with many resources and species, because large systems with many interacting components often display new emergent behaviors that cannot be understood or deduced from analyzing smaller systems. To address these shortcomings, we develop a statistical-physics-inspired cavity method to analyze the MCRM when both the number of species and the number of resources are large. Unlike previous work in this limit, our theory addresses resource dynamics and resource depletion and demonstrates that species generically and consistently perturb their environments and significantly modify available ecological niches. We show how our cavity approach naturally generalizes niche theory to large ecosystems by accounting for the effect of collective phenomena on species invasion and ecological stability. Our theory suggests that such phenomena are a generic feature of large, natural ecosystems and must be taken into account when analyzing and interpreting community structure. It also highlights the important role that statistical-physics-inspired approaches can play in furthering our understanding of ecology.
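MacArthur's consumer resource model itself, before any cavity analysis, can be sketched by direct numerical integration. The logistic resource supply with depletion below is one common form of the model; all parameter values are toy assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

S, M = 5, 8                          # species, resources (small toy sizes)
c = rng.uniform(0.0, 1.0, (S, M))    # consumption preferences c_ia
m = rng.uniform(0.3, 0.6, S)         # species maintenance costs
K = rng.uniform(1.0, 2.0, M)         # resource carrying capacities

N = np.full(S, 0.1)                  # species abundances
R = K.copy()                         # resources start at capacity

# dN_i/dt = N_i (sum_a c_ia R_a - m_i)
# dR_a/dt = R_a (K_a - R_a) - R_a sum_i N_i c_ia   (logistic supply, depletion)
dt = 0.01
for _ in range(20_000):
    growth = c @ R - m
    dN = N * growth
    dR = R * (K - R) - R * (N @ c)
    N = np.clip(N + dt * dN, 0.0, None)
    R = np.clip(R + dt * dR, 0.0, None)

print("surviving species:", int((N > 1e-3).sum()), "of", S)
print("depleted resources (R << K):", int((R < 0.5 * K).sum()), "of", M)
```

Even this small simulation shows the abstract's qualitative point: the surviving community depletes resources well below their carrying capacities, reshaping the niches available to invaders.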
Probabilistic Ontology Architecture for a Terrorist Identification Decision Support System
2014-06-01
in real-world problems requires probabilistic ontologies, which integrate the inferential reasoning power of probabilistic representations with the... inferential reasoning power of probabilistic representations with the first-order expressivity of ontologies. The Reference Architecture for...ontology, terrorism, inferential reasoning, architecture I. INTRODUCTION A. Background Whether by nature or design, the personas of terrorists are
A model for the development of university curricula in nanoelectronics
NASA Astrophysics Data System (ADS)
Bruun, E.; Nielsen, I.
2010-12-01
Nanotechnology is having an increasing impact on university curricula in electrical engineering and in physics. Major influencers affecting developments in university programmes related to nanoelectronics are discussed and a model for university programme development is described. The model takes into account that nanotechnology affects not only physics but also electrical engineering and computer engineering because of the advent of new nanoelectronics devices. The model suggests that curriculum development tends to follow one of three major tracks: physics; electrical engineering; computer engineering. Examples of European curricula following this framework are identified and described. These examples may serve as sources of inspiration for future developments and the model presented may provide guidelines for a systematic selection of topics in the university programmes.
Introducing Filters and Amplifiers Using a Two-Channel Light Organ
NASA Astrophysics Data System (ADS)
Zavrel, Erik; Sharpsteen, Eric
2015-11-01
In an era when many students carry iPods, iPhones, and iPads, physics teachers are realizing that in order to continue to inspire and convey the amazing things made possible by a few fundamental principles, they must expand laboratory coverage of electricity and circuits beyond the conventional staples of constructing series and parallel arrangements of light bulbs and confirming Kirchhoff's laws. Indeed, physics teachers are already incorporating smartphones into their laboratory activities in an effort to convey concepts in a more contemporary and relatable manner. As part of Cornell's Learning Initiative in Medicine and Bioengineering (CLIMB), we set out to design and implement an engaging curriculum to introduce high school physics students to filters and amplifiers.
Physical terms and leisure time activities
NASA Astrophysics Data System (ADS)
Valovičová, Ľubomíra; Siptáková, Mária; Štubňa, Martin
2017-01-01
People need to learn not only in school but also outside it. One way to acquire new knowledge is through leisure activities such as hobby groups or camps. Leisure activities increasingly appear to be an appropriate setting for informal learning of physics concepts. Within leisure activities, pupils have the possibility to acquire new concepts in unusual and interesting ways. It is possible to awaken their intrinsic motivation for the subject matter or phenomenon, which is the aim of all teachers. This article describes and offers insights on the acquisition of the concepts of uniform and non-uniform rectilinear motion during a physics camp where pupils had the opportunity to use modern technologies which, despite the modernization of education, are still unconventional teaching tools in our schools.
Probabilistic Evaluation of Blade Impact Damage
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Abumeri, G. H.
2003-01-01
The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on probabilistically quantifying the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluation are given in terms of cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCF's) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCF's, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCF's in three different composite laminates. Simulated results match experimental data for both probability density and cumulative distribution functions. The sensitivity factors indicate that the SCF's are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
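The general approach — propagating input scatter through a response model to obtain an SCF distribution and sensitivities — can be sketched with a Monte Carlo toy. The closed-form finite-width plate-with-hole approximation and the input statistics below are illustrative assumptions, not the paper's coupled composite mechanics / finite element method:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Illustrative uncertain geometry (means and scatter are invented)
d = rng.normal(6.0, 0.15, n)    # hole diameter, mm
w = rng.normal(30.0, 0.60, n)   # laminate strip width, mm

r = d / w
# A standard engineering polynomial approximation for the net-section SCF
# of a finite-width plate with a central hole (stands in for the FE model)
Kt = 3.0 - 3.14 * r + 3.667 * r**2 - 1.527 * r**3

# Empirical CDF at a few levels, plus simple correlation-based sensitivities
for level in (2.40, 2.45, 2.50):
    print(f"P(Kt <= {level}) = {np.mean(Kt <= level):.3f}")
print("corr(Kt, d) =", round(np.corrcoef(Kt, d)[0, 1], 2))
print("corr(Kt, w) =", round(np.corrcoef(Kt, w)[0, 1], 2))
```

The signs of the correlations play the role of the paper's sensitivity factors: here a wider strip raises the SCF toward the infinite-plate value of 3, while a larger hole lowers the net-section factor.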
NASA Astrophysics Data System (ADS)
Bydlon, S. A.; Beroza, G. C.
2015-12-01
Recent debate on the efficacy of Probabilistic Seismic Hazard Analysis (PSHA), and the utility of hazard maps (i.e. Stein et al., 2011; Hanks et al., 2012), has prompted a need for validation of such maps using recorded strong ground motion data. Unfortunately, strong motion records are limited spatially and temporally relative to the area and time windows hazard maps encompass. We develop a framework to test the predictive powers of PSHA maps that is flexible with respect to a map's specified probability of exceedance and time window, and the strong motion receiver coverage. Using a combination of recorded and interpolated strong motion records produced through the ShakeMap environment, we compile a record of ground motion intensity measures for California from 2002-present. We use this information to perform an area-based test of California PSHA maps inspired by the work of Ward (1995). Though this framework is flexible in that it can be applied to seismically active areas where ShakeMap-like ground shaking interpolations have or can be produced, this testing procedure is limited by the relatively short lifetime of strong motion recordings and by the desire to only test with data collected after the development of the PSHA map under scrutiny. To account for this, we use the assumption that PSHA maps are time independent to adapt the testing procedure for periods of recorded data shorter than the lifetime of a map. We note that accuracy of this testing procedure will only improve as more data is collected, or as the time-horizon of interest is reduced, as has been proposed for maps of areas experiencing induced seismicity. We believe that this procedure can be used to determine whether PSHA maps are accurately portraying seismic hazard and whether discrepancies are localized or systemic.
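The area-based test described here can be sketched numerically: a map with exceedance probability p over its time window predicts that roughly a fraction p of mapped sites should have experienced shaking above the mapped level during that window. The synthetic data and the independence assumption in the binomial check are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic example: a 10%-in-50-years map over 2,000 cells
p_nominal = 0.10
cells = 2000
map_pga = rng.uniform(0.1, 0.6, cells)       # mapped PGA thresholds (g)

# Pretend observed 50-year maxima, drawn so ~10% of cells exceed the map
obs_pga = map_pga * rng.uniform(0.5, 1.06, cells)

frac_exceeded = np.mean(obs_pga > map_pga)

# Naive binomial check (assumes independent cells, which real shaking violates)
se = np.sqrt(p_nominal * (1.0 - p_nominal) / cells)
z = (frac_exceeded - p_nominal) / se
print(f"exceeded fraction = {frac_exceeded:.3f}, z = {z:.2f}")
```

A large |z| would flag a map as over- or under-predicting hazard in aggregate; the spatial correlation of real ground motion makes the true uncertainty wider than this naive standard error, which is part of why such tests need long records or many regions.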
NASA Astrophysics Data System (ADS)
Escalante, George
2017-05-01
Weak Value Measurements (WVMs) with pre- and post-selected quantum mechanical ensembles were proposed by Aharonov, Albert, and Vaidman in 1988 and have found numerous applications in both theoretical and applied physics. In the field of precision metrology, WVM techniques have been demonstrated and proven valuable as a means to shift, amplify, and detect signals and to make precise measurements of small effects in both quantum and classical systems, including: particle spin, the Spin-Hall effect of light, optical beam deflections, frequency shifts, field gradients, and many others. In principle, WVM amplification techniques are also possible in radar and could be a valuable tool for precision measurements. However, relatively limited research has been done in this area. This article presents a quantum-inspired model of radar range and range-rate measurements of arbitrary strength, including standard and pre- and post-selected measurements. The model is used to extend WVM amplification theory to radar, with the receive filter performing the post-selection role. It is shown that the description of range and range-rate measurements based on the quantum-mechanical measurement model and formalism produces the same results as the conventional approach used in radar based on signal processing and filtering of the reflected signal at the radar receiver. Numerical simulation results using simple point scatterer configurations are presented, applying the quantum-inspired model of radar range and range-rate measurements that occur in the weak measurement regime. Potential applications and benefits of the quantum-inspired approach to radar measurements are presented, including improved range and Doppler measurement resolution.
Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David
2015-07-01
Decision-making is the process of identifying and choosing alternatives where each alternative offers a different approach or path to move from a given state or condition to a desired state or condition. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where it is within the challenge state, its trajectory, and margin within the controllable domain using utility functions to evaluate current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc.
The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulics analyses also validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies.
The Role of Humor in Learning Physics: a Study of Undergraduate Students
NASA Astrophysics Data System (ADS)
Berge, Maria
2017-04-01
We all know that they do it, but what do students laugh about when learning science together? Although research has shown that students do use humor when they learn science, the role of humor in science education has received little attention. In this study, undergraduate students' laughter during collaborative work in physics has been investigated. In order to do this, a framework inspired by conversation analysis has been used. Empirical data were drawn from two video-recorded sessions in which first-year engineering students solved physics problems together. The analysis revealed that the students' use of humor was almost exclusively related to physics. Five identified themes summarize the role of humor in the group discussions: Something is obvious, Something is difficult, Something said might be wrong, Something is absurd, and Something said is not within informal norms.
Teaching the Physics of Energy While Traveling by Train
NASA Astrophysics Data System (ADS)
Hay, Katrina
2013-02-01
Pacific Lutheran University (Tacoma, WA) is renowned for the number of its courses that offer international and study-away opportunities. Inspired by the theme of sustainability, and my growing concern about the environmental impact of conventional fuels, I offered a course, Physics of Energy, for the first time during PLU's January 2011 term (a one-month semester). The two-week travel portion of the course took students on the Amtrak Coast Starlight train route from Tacoma, WA, to Los Angeles to study various forms of energy production. Students studied physics topics in the classroom, through hands-on activities, by traveling to energy research and production facilities, and while working on their interdisciplinary term projects. The course, which utilized local talent and focused on sustainability, provided a positive physics learning experience for majors and non-majors alike.
NASA Astrophysics Data System (ADS)
Fatimah, F.; Rosadi, D.; Hakim, R. B. F.
2018-03-01
In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
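One plausible reading of a decision procedure over probabilistic and dual probabilistic soft sets can be sketched as follows. The scoring rule (sum of satisfaction probabilities over desirable parameters minus the sum over undesirable ones) and all names and numbers are assumptions for illustration, not necessarily the algorithm proposed in the paper:

```python
# Alternatives to choose among (hypothetical example)
alternatives = ["house_A", "house_B", "house_C"]

# Probabilistic soft set: P(alternative satisfies parameter) per parameter
positive = {  # desirable (positive) parameters
    "cheap":    {"house_A": 0.9, "house_B": 0.4, "house_C": 0.5},
    "spacious": {"house_A": 0.3, "house_B": 0.8, "house_C": 0.7},
}
negative = {  # undesirable (negative) parameters, the "dual" side
    "noisy":    {"house_A": 0.2, "house_B": 0.7, "house_C": 0.3},
}

def score(x):
    """Assumed rule: positive-parameter probabilities minus negative ones."""
    pos = sum(p[x] for p in positive.values())
    neg = sum(p[x] for p in negative.values())
    return pos - neg

best = max(alternatives, key=score)
print({x: round(score(x), 2) for x in alternatives}, "->", best)
```

The flexibility claimed in the abstract corresponds here to how easily parameters, their polarity, and the aggregation rule can be swapped without changing the surrounding procedure.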
Learning Probabilistic Logic Models from Probabilistic Examples
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2009-01-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
Information fusion methods based on physical laws.
Rao, Nageswara S V; Reister, David B; Barhen, Jacob
2005-01-01
We consider systems whose parameters satisfy certain easily computable physical laws. Each parameter is directly measured by a number of sensors, or estimated using measurements, or both. The measurement process may introduce both systematic and random errors which may then propagate into the estimates. Furthermore, the actual parameter values are not known since every parameter is measured or estimated, which makes the existing sample-based fusion methods inapplicable. We propose a fusion method for combining the measurements and estimators based on the least violation of physical laws that relate the parameters. Under fairly general smoothness and nonsmoothness conditions on the physical laws, we show the asymptotic convergence of our method and also derive distribution-free performance bounds based on finite samples. For suitable choices of the fuser classes, we show that for each parameter the fused estimate is probabilistically at least as good as its best measurement as well as best estimate. We illustrate the effectiveness of this method for a practical problem of fusing well-log data in methane hydrate exploration.
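The core idea — fusing noisy measurements by minimally violating a physical law that relates the parameters — can be sketched as a constrained weighted least-squares projection. The linear conservation law and the variances below are invented for illustration; the paper's method covers more general (including nonsmooth) laws:

```python
import numpy as np

# Measurements of three parameters that physically must satisfy x1 + x2 = x3
# (e.g., a conservation law). Variances reflect assumed sensor quality.
m = np.array([2.1, 3.3, 5.0])       # raw measurements
var = np.array([0.04, 0.09, 0.01])  # per-sensor variances (assumed known)

# Minimize sum((x_i - m_i)^2 / var_i) subject to a @ x = 0,
# where a encodes the law x1 + x2 - x3 = 0. Closed-form Lagrange solution.
a = np.array([1.0, 1.0, -1.0])
residual = a @ m                     # how badly the raw data violate the law
lam = residual / (a @ (var * a))     # Lagrange multiplier
x = m - var * a * lam                # fused, law-consistent estimates

print("fused:", x.round(4), "| law residual:", round(a @ x, 9))
```

Note how the correction is distributed in proportion to each sensor's variance: the precise third sensor (variance 0.01) is barely adjusted, while the noisy second one absorbs most of the law violation.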
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
Eclipse 2017: Partnering with NASA MSFC to Inspire Students
NASA Technical Reports Server (NTRS)
Fry, Craig "Ghee"
2017-01-01
NASA's Marshall Space Flight Center (MSFC) is partnering with the U.S. Space and Rocket Center (USSRC), and Austin Peay State University (APSU) to engage citizen scientists, engineers, and students in science investigations during the 2017 American Solar Eclipse. Investigations will support the Citizen Continental America Telescopic Eclipse (CATE), Ham Radio Science Citizen Investigation (HamSCI), and Interactive NASA Space Physics Ionosphere Radio Experiments (INSPIRE). All planned activities will engage Space Campers and local high school students in the application of the scientific method as they seek to explore a wide range of observations during the eclipse. Where planned experiments touch on current scientific questions, the camper/students will be acting as citizen scientists, participating with researchers from APSU and MSFC. Participants will test their expectations and after the eclipse, share their results, experiences, and conclusions to younger Space Campers at the US Space & Rocket Center.
3D-printing and mechanics of bio-inspired articulated and multi-material structures.
Porter, Michael M; Ravikumar, Nakul; Barthelat, Francois; Martini, Roberto
2017-09-01
3D-printing technologies allow researchers to build simplified physical models of complex biological systems to more easily investigate their mechanics. In recent years, a number of 3D-printed structures inspired by the dermal armors of various fishes have been developed to study their multiple mechanical functionalities, including flexible protection, improved hydrodynamics, body support, or tail prehensility. Natural fish armors are generally classified according to their shape, material and structural properties as elasmoid scales, ganoid scales, placoid scales, carapace scutes, or bony plates. Each type of dermal armor forms distinct articulation patterns that facilitate different functional advantages. In this paper, we highlight recent studies that developed 3D-printed structures not only to inform the design and application of some articulated and multi-material structures, but also to explain the mechanics of the natural biological systems they mimic.
Spicing up Science: Mini Undergraduate Research Projects in Physics and Chemistry
NASA Astrophysics Data System (ADS)
Devendorf, George
2008-10-01
Individual student research projects are often small pieces of a larger research program and may or may not provide an interesting and satisfying research experience for a student researcher who is engaged in the project for only a limited time. This researcher describes a variety of research activities conducted with advanced high school students in a high school setting. These research projects are limited by the academic experience of the student, facilities, resources, and available time. Such limitations, however, have shaped some of the research projects into ``mini-projects'' that pose interesting scientific questions which can be addressed within a semester- or year-long project. Several of these research ideas have been inspired by teaching introductory courses and, though they may not further a continuing research program or spawn significant publications, they do provide an avenue for teaching and inspiring scientific inquiry in the minds of young potential scientists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadjidakis, Cynthia
2002-12-17
This report presents exclusive rho0 meson electroproduction on the nucleon at intermediate squared momentum transfers Q2 (1.5 < Q2 < 3 GeV2) and above the resonance region. The experiment took place at the Jefferson Laboratory with the CLAS detector, with a 4.2 GeV beam energy on a hydrogen target, in the February-March 1999 period. They present the results and in particular the L/T separated cross sections. This experimentally unexplored domain is at the intersection between traditional ''soft'' hadronic physics models (VDM and Regge-inspired models) and ''hard'' pQCD-inspired approaches (the recently introduced Generalized Parton Distributions). They discuss both approaches and their domains of validity.
Nurzaman, Surya G.
2016-01-01
Sensor morphology, the morphology of a sensing mechanism that shapes the response to physical stimuli from the surroundings into signals usable as sensory information, is one of the key common aspects of sensing processes. This paper presents a structured review of research on bio-inspired sensor morphology implemented in robotic systems, and discusses the fundamental design principles. Based on the literature review, we propose two key arguments: first, owing to its synthetic nature, the biologically inspired robotics approach is a unique and powerful methodology for understanding the role of sensor morphology and how it can evolve and adapt to its task and environment. Second, a consideration of an integrative view of perception by looking into multidisciplinary and overarching mechanisms of sensor morphology adaptation across biology and engineering enables us to extract relevant design principles that are important to extend our understanding of the unfinished concepts in sensing and perception. PMID:27499843
A biomimetic accelerometer inspired by the cricket's clavate hair
Droogendijk, H.; de Boer, M. J.; Sanders, R. G. P.; Krijnen, G. J. M.
2014-01-01
Crickets use so-called clavate hairs to sense (gravitational) acceleration to obtain information on their orientation. Inspired by this clavate hair system, a one-axis biomimetic accelerometer has been developed and fabricated using surface micromachining and SU-8 lithography. An analytical model is presented for the design of the accelerometer, and guidelines are derived to reduce responsivity due to flow-induced contributions to the accelerometer's output. Measurements show that this microelectromechanical systems (MEMS) hair-based accelerometer has a resonance frequency of 320 Hz, a detection threshold of 0.10 ms−2 and a dynamic range of more than 35 dB. The accelerometer exhibits a clear directional response to external accelerations and a low responsivity to airflow. Further, the accelerometer's physical limits with respect to noise levels are addressed and the possibility for short-term adaptation of the sensor to the environment is discussed. PMID:24920115
Bio-inspired Murray materials for mass transfer and activity
NASA Astrophysics Data System (ADS)
Zheng, Xianfeng; Shen, Guofang; Wang, Chao; Li, Yu; Dunphy, Darren; Hasan, Tawfique; Brinker, C. Jeffrey; Su, Bao-Lian
2017-04-01
Both plants and animals possess analogous tissues containing hierarchical networks of pores, with pore size ratios that have evolved to maximize mass transport and rates of reactions. The underlying physical principles of this optimized hierarchical design are embodied in Murray's law. However, we are yet to realize the benefit of mimicking nature's Murray networks in synthetic materials due to the challenges in fabricating vascularized structures. Here we emulate optimum natural systems following Murray's law using a bottom-up approach. Such bio-inspired materials, whose pore sizes decrease across multiple scales and finally terminate in size-invariant units like plant stems, leaf veins and vascular and respiratory systems provide hierarchical branching and precise diameter ratios for connecting multi-scale pores from macro to micro levels. Our Murray material mimics enable highly enhanced mass exchange and transfer in liquid-solid, gas-solid and electrochemical reactions and exhibit enhanced performance in photocatalysis, gas sensing and as Li-ion battery electrodes.
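Murray's law, which the abstract invokes, can be stated quantitatively: for optimal transport, the cube of a parent channel's diameter equals the sum of the cubes of its daughter channels' diameters. A minimal sketch of this relation, with illustrative function names and values not taken from the paper:

```python
def murray_daughter_diameter(d_parent: float, n_daughters: int) -> float:
    """Diameter of each of n equal daughter channels satisfying Murray's law:
    d_parent**3 == n_daughters * d_daughter**3."""
    return d_parent * (1.0 / n_daughters) ** (1.0 / 3.0)

# Cascade three levels of binary branching from an arbitrary parent pore.
d = 100.0  # parent pore diameter (arbitrary units)
for level in range(1, 4):
    d = murray_daughter_diameter(d, 2)
    print(f"level {level}: daughter diameter = {d:.2f}")
```

At each level the total cubed diameter is conserved, which is the hierarchical pore-size scaling the "Murray materials" above emulate from macro to micro scales.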
Learning from nature: binary cooperative complementary nanomaterials.
Su, Bin; Guo, Wei; Jiang, Lei
2015-03-01
In this Review, nature-inspired binary cooperative complementary nanomaterials (BCCNMs), consisting of two components with entirely opposite physicochemical properties at the nanoscale, are presented as a novel concept for the building of promising materials. Once the distance between the two nanoscopic components is comparable to the characteristic length of some physical interactions, the cooperation between these complementary building blocks becomes dominant and endows the macroscopic materials with novel and superior properties. The first implementation of the BCCNMs is the design of bio-inspired smart materials with superwettability and their reversible switching between different wetting states in response to various kinds of external stimuli. Coincidentally, recent studies on other types of functional nanomaterials contribute more examples to support the idea of BCCNMs, which suggests a potential yet comprehensive range of future applications in both materials science and engineering. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sun, Shengtong; Mao, Li-Bo; Lei, Zhouyue; Yu, Shu-Hong; Cölfen, Helmut
2016-09-19
Given increasing environmental issues due to the large usage of non-biodegradable plastics based on petroleum, new plastic materials that are economical, environmentally friendly, and recyclable are in high demand. One feasible strategy is the bio-inspired synthesis of mineral-based hybrid materials. Herein we report a facile route to an amorphous CaCO3 (ACC)-based hydrogel consisting of very small ACC nanoparticles physically cross-linked by poly(acrylic acid). The hydrogel is shapeable, stretchable, and self-healable. Upon drying, the hydrogel forms free-standing, rigid, and transparent objects with remarkable mechanical performance. By swelling in water, the material can completely recover the initial hydrogel state. As a matrix, thermochromism can also be easily introduced. The present hybrid hydrogel may represent a new class of plastic materials, the "mineral plastics". © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Christian Andreas Doppler: A legendary man inspired by the dazzling light of the stars
Katsi, V; Felekos, I; Kallikazaros, I
2013-01-01
Christian Andreas Doppler is renowned primarily for his revolutionary theory of the Doppler effect, which has deeply influenced many areas of modern science and technology, including medicine. His work laid the foundations for modern ultrasonography and his ideas are still inspiring discoveries more than a hundred years after his death. Doppler may well earn the title of Homo Universalis for his broad knowledge of physics, mathematics and astronomy, and most of all for his indefatigable search for new ideas and his ingenious mind. According to Bolzano: “It is hard to believe how fruitful a genius Austria has in this man”. His legacy of scientific achievement has seen Doppler honoured in later years on coinage and money, in the names of streets, educational institutions, rock groups, even of a lunar crater; while the ultimate tribute to his work is the countless references to the homonymous medical eponym. PMID:24376313
Sa, Thiago Herick; Salvador, Emanuel Péricles; Florindo, Alex Antonio
2013-08-01
Physical inactivity in transportation is negatively related to many health outcomes. However, little is known about the correlates of this condition among people living in regions of low socioeconomic level. This cross-sectional study aimed to assess factors associated with physical inactivity in transportation among adults in the Eastern Zone of São Paulo, Brazil. Home-based interviews were conducted between May 2007 and January 2008 on a probabilistic sample of the adult population (≥18 years), totaling 368 men and 522 women. Factors associated with physical inactivity in transportation (less than 10 minutes per week of walking or cycling) were assessed using multivariate Poisson regression with hierarchical selection of variables. Physical inactivity in transportation was associated with the presence of vehicles in the household for men (PR = 2.96) and women (PR = 2.42), with a linear trend for both sexes (P < .001 and P = .004, respectively), even after adjusting for age, schooling level and chronic diseases (this last factor only among women). The presence of vehicles in the household was positively associated with physical inactivity in transportation, both for men and for women. This should be taken into consideration in drawing up public policies for promoting physical activity.
A Figure of Merit: Quantifying the Probability of a Nuclear Reactor Accident.
Wellock, Thomas R
In recent decades, probabilistic risk assessment (PRA) has become an essential tool in risk analysis and management in many industries and government agencies. The origins of PRA date to the 1975 publication of the U.S. Nuclear Regulatory Commission's (NRC) Reactor Safety Study led by MIT professor Norman Rasmussen. The "Rasmussen Report" inspired considerable political and scholarly disputes over the motives behind it and the value of its methods and numerical estimates of risk. The Report's controversies have overshadowed the deeper technical origins of risk assessment. Nuclear experts had long sought to express risk in a "figure of merit" to verify the safety of weapons and, later, civilian reactors. By the 1970s, technical advances in PRA gave the methodology the potential to serve political ends, too. The Report, it was hoped, would prove nuclear power's safety to a growing chorus of critics. Subsequent attacks on the Report's methods and numerical estimates damaged the NRC's credibility. PRA's fortunes revived when the 1979 Three Mile Island accident demonstrated PRA's potential for improving the safety of nuclear power and other technical systems. Nevertheless, the Report's controversies endure in mistrust of PRA and its experts.
Lévy flight artificial bee colony algorithm
NASA Astrophysics Data System (ADS)
Sharma, Harish; Bansal, Jagdish Chand; Arya, K. V.; Yang, Xin-She
2016-08-01
The artificial bee colony (ABC) optimisation algorithm is a relatively simple and recent population-based probabilistic approach for global optimisation. The solution search equation of ABC is significantly influenced by a random quantity which helps exploration at the cost of exploitation of the search space. In ABC, there is a high chance of skipping the true solution due to large step sizes. In order to balance diversity and convergence in ABC, a Lévy flight inspired search strategy is proposed and integrated with ABC. The proposed strategy, named Lévy Flight ABC (LFABC), has both local and global search capability simultaneously, achieved by tuning the Lévy flight parameters and thus automatically tuning the step sizes. In LFABC, new solutions are generated around the best solution, which helps to enhance the exploitation capability of ABC. Furthermore, to improve the exploration capability, the number of scout bees is increased. Experiments on 20 test problems of different complexities and five real-world engineering optimisation problems show that the proposed strategy outperforms the basic ABC and recent variants of ABC, namely Gbest-guided ABC, best-so-far ABC and modified ABC, in most of the experiments.
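The core ingredient of such a strategy is a heavy-tailed Lévy-distributed step, commonly drawn via Mantegna's algorithm. A minimal sketch of a Lévy step and a best-solution-guided update (the function names, scale factor, and update form are illustrative assumptions, not the paper's exact equations):

```python
import math
import random

def levy_step(beta: float = 1.5) -> float:
    """One Lévy-flight step length via Mantegna's algorithm (1 < beta <= 2).
    Small steps dominate, but occasional large jumps aid global exploration."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma_u)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def lfabc_update(x, best, beta: float = 1.5, scale: float = 0.01):
    """Candidate solution generated around the best solution with a
    Lévy-distributed step size (hypothetical illustrative update rule)."""
    return [xi + scale * levy_step(beta) * (bi - xi) for xi, bi in zip(x, best)]
```

Lowering `beta` fattens the tail of the step distribution, so the same update rule can shift between local refinement and long exploratory jumps, which is the step-size tuning the abstract describes.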
Caban-Martinez, Alberto J; Santiago, Katerina M; Stillman, Jordan; Moore, Kevin J; Sierra, Danielle A; Chalmers, Juanita; Baniak, Melissa; Jordan, Melissa M
2018-04-01
We characterize and compare the self-reported physical exposures, work tasks, and OSHA-10 training in a non-probabilistic sample of temporary and payroll construction workers. In June 2016, a total of 250 payroll and temporary general laborers employed at Florida construction sites completed a survey at the job site as part of the falls reported among minority employees (FRAME) study. Workers employed through temp agencies (57.1%) were significantly more likely to report moving or lifting materials more than 100 pounds than payroll workers (38.5%; P < 0.01). Temporary construction workers with 10-hour OSHA training (22.2%) spent significantly less time with intense hand use/awkward hand posture than temporary workers without 10-hour OSHA training (46.9%; P = 0.048). Temp construction workers with OSHA 10-hour training reported less hazardous physical postures than workers without the same training.
Damage and Loss Estimation for Natural Gas Networks: The Case of Istanbul
NASA Astrophysics Data System (ADS)
Çaktı, Eser; Hancılar, Ufuk; Şeşetyan, Karin; Bıyıkoǧlu, Hikmet; Şafak, Erdal
2017-04-01
Natural gas networks are one of the major lifeline systems to support human, urban and industrial activities. The continuity of gas supply is critical for almost all functions of modern life. Under natural phenomena such as earthquakes and landslides the damages to the system elements may lead to explosions and fires compromising human life and damaging physical environment. Furthermore, the disruption in the gas supply puts human activities at risk and also results in economical losses. This study is concerned with the performance of one of the largest natural gas distribution systems in the world. Physical damages to Istanbul's natural gas network are estimated under the most recent probabilistic earthquake hazard models available, as well as under simulated ground motions from physics based models. Several vulnerability functions are used in modelling damages to system elements. A first-order assessment of monetary losses to Istanbul's natural gas distribution network is also attempted.
An undergraduate course, and new textbook, on ``Physical Models of Living Systems''
NASA Astrophysics Data System (ADS)
Nelson, Philip
2015-03-01
I'll describe an intermediate-level course on ``Physical Models of Living Systems.'' The only prerequisite is first-year university physics and calculus. The course is a response to rapidly growing interest among undergraduates in several science and engineering departments. Students acquire several research skills that are often not addressed in traditional courses, including: basic modeling skills, probabilistic modeling skills, data analysis methods, computer programming using a general-purpose platform like MATLAB or Python, dynamical systems, particularly feedback control. These basic skills, which are relevant to nearly any field of science or engineering, are presented in the context of case studies from living systems, including: virus dynamics; bacterial genetics and evolution of drug resistance; statistical inference; superresolution microscopy; synthetic biology; naturally evolved cellular circuits. Publication of a new textbook by WH Freeman and Co. is scheduled for December 2014. Supported in part by EF-0928048 and DMR-0832802.
NASA Astrophysics Data System (ADS)
Tanabashi, M.
Shoichi Sakata and his Nagoya School made many important achievements at the predawn of the particle physics revolution. The ``two-meson'' theory (introduction of the second-generation leptons), the ``C-meson theory'' (a theory which inspired Tomonaga's renormalization theory), the ``Sakata model'' (a precursor to the quark model), and the ``Maki-Nakagawa-Sakata'' theory of neutrino mixing are among them. These outputs are now regarded as essential ingredients of modern particle physics. Sakata also took the lead in setting up a democratic administration system in his theoretical particle physics group (E-ken). It was this democratic atmosphere in which many excellent physicists were brought up as Sakata's disciples. In this talk, I introduce Sakata and his achievements in physics, showing various materials archived in the Sakata Memorial Archival Library (SMAL), an archival repository of primary material documenting Sakata's activities. These SMAL documents vividly show Sakata's way of thinking in his approach to new physics.
Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.
2016-04-01
Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of maps performances is an essential step also in scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics based models is recognized as major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have been already developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assess performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveforms modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. 
In addition, the flexibility of NDSHA allows for generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic waves propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most of the regions worldwide) with a satisfactory completeness level for M>5, which warrants the results of analysis. By analysing in some detail seismicity in the Vrancea region, we show that well constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.
Learning Sparse Feature Representations using Probabilistic Quadtrees and Deep Belief Nets
2015-04-24
Learning sparse feature representations is a useful instrument for solving an... novel framework for the classification of handwritten digits that learns sparse representations using probabilistic quadtrees and Deep Belief Nets.
Bio-inspired nano-sensor-enhanced CNN visual computer.
Porod, Wolfgang; Werblin, Frank; Chua, Leon O; Roska, Tamas; Rodriguez-Vazquez, Angel; Roska, Botond; Fay, Patrick; Bernstein, Gary H; Huang, Yih-Fang; Csurgay, Arpad I
2004-05-01
Nanotechnology opens new ways to utilize recent discoveries in biological image processing by translating the underlying functional concepts into the design of CNN (cellular neural/nonlinear network)-based systems incorporating nanoelectronic devices. There is a natural intersection joining studies of retinal processing, spatio-temporal nonlinear dynamics embodied in CNN, and the possibility of miniaturizing the technology through nanotechnology. This intersection serves as the springboard for our multidisciplinary project. Biological feature and motion detectors map directly into the spatio-temporal dynamics of CNN for target recognition, image stabilization, and tracking. The neural interactions underlying color processing will drive the development of nanoscale multispectral sensor arrays for image fusion. Implementing such nanoscale sensors on a CNN platform will allow the implementation of device feedback control, a hallmark of biological sensory systems. These biologically inspired CNN subroutines are incorporated into the new world of analog-and-logic algorithms and software, containing also many other active-wave computing mechanisms, including nature-inspired (physics and chemistry) as well as PDE-based sophisticated spatio-temporal algorithms. Our goal is to design and develop several miniature prototype devices for target detection, navigation, tracking, and robotics. This paper presents an example illustrating the synergies emerging from the convergence of nanotechnology, biotechnology, and information and cognitive science.
Cross hole GPR traveltime inversion using a fast and accurate neural network as a forward model
NASA Astrophysics Data System (ADS)
Mejer Hansen, Thomas
2017-04-01
Probabilistic formulated inverse problems can be solved using Monte Carlo based sampling methods. In principle both advanced prior information, such as based on geostatistics, and complex non-linear forward physical models can be considered. However, in practice these methods can be associated with huge computational costs that in practice limit their application. This is not least due to the computational requirements related to solving the forward problem, where the physical response of some earth model has to be evaluated. Here, it is suggested to replace a numerical complex evaluation of the forward problem, with a trained neural network that can be evaluated very fast. This will introduce a modeling error, that is quantified probabilistically such that it can be accounted for during inversion. This allows a very fast and efficient Monte Carlo sampling of the solution to an inverse problem. We demonstrate the methodology for first arrival travel time inversion of cross hole ground-penetrating radar (GPR) data. An accurate forward model, based on 2D full-waveform modeling followed by automatic travel time picking, is replaced by a fast neural network. This provides a sampling algorithm three orders of magnitude faster than using the full forward model, and considerably faster, and more accurate, than commonly used approximate forward models. The methodology has the potential to dramatically change the complexity of the types of inverse problems that can be solved using non-linear Monte Carlo sampling techniques.
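The key idea above, replacing an expensive forward model with a fast surrogate and folding the surrogate's modeling error into the likelihood, can be sketched with a toy Metropolis sampler. This is a generic illustration under stated assumptions (an independence proposal from the prior, Gaussian errors, and a caller-supplied `surrogate` function), not the authors' specific GPR inversion:

```python
import math
import random

def metropolis_surrogate(d_obs, surrogate, prior_sample, n_iter=1000,
                         sigma_data=1.0, sigma_model=0.5):
    """Metropolis sampling of an inverse problem where the physical forward
    model is replaced by a fast surrogate. The surrogate's modeling error
    (sigma_model) is added to the data variance, so the posterior accounts
    for the approximation probabilistically."""
    sigma2 = sigma_data ** 2 + sigma_model ** 2  # combined error variance

    def loglike(m):
        d_pred = surrogate(m)  # fast stand-in for the full forward solver
        return -0.5 * sum((o - p) ** 2 for o, p in zip(d_obs, d_pred)) / sigma2

    m = prior_sample()
    ll = loglike(m)
    samples = []
    for _ in range(n_iter):
        m_prop = prior_sample()          # independence proposal from the prior
        ll_prop = loglike(m_prop)
        if math.log(random.random()) < ll_prop - ll:  # Metropolis accept step
            m, ll = m_prop, ll_prop
        samples.append(m)
    return samples
```

Because each likelihood evaluation costs only one surrogate call, the chain can take orders of magnitude more steps than with a full-waveform forward solver, which is the speedup the abstract reports.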
The role of probabilities in physics.
Le Bellac, Michel
2012-09-01
Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.
Trinh, T; Ishida, K; Kavvas, M L; Ercan, A; Carr, K
2017-05-15
Along with socioeconomic development and population increase, natural disasters around the world have recently raised awareness of the harmful impacts they cause. Among natural disasters, drought is of great interest to scientists due to the extraordinary diversity of its severity and duration. Motivated by the development of a potential approach to investigate possible future droughts in a probabilistic framework based on climate change projections, a methodology that considers thirteen future climate projections, based on four emission scenarios, to characterize droughts is presented. The proposed approach uses a regional climate model coupled with a physically based hydrology model (Watershed Environmental Hydrology Hydro-Climate Model; WEHY-HCM) to generate thirteen equally likely future water supply projections. The water supply projections were compared to the current water demand for the detection of drought events and estimation of drought properties. The procedure was applied to the Shasta Dam watershed to analyze drought conditions at the watershed outlet, Shasta Dam. The results suggest increasing water scarcity at Shasta Dam, with more severe and longer drought events in some future scenarios. An important advantage of the proposed approach to the probabilistic analysis of future droughts is that it provides the drought properties of the 100-year and 200-year return periods without resorting to any extrapolation of the frequency curve. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mylott, Elliot; Kutschera, Ellynne; Dunlap, Justin C.; Christensen, Warren; Widenhorn, Ralf
2016-04-01
We will describe a one-quarter pilot algebra-based introductory physics course for pre-health and life science majors. The course features videos with biomedical experts and cogent biomedically inspired physics content. The materials were used in a flipped classroom as well as an all-online environment where students interacted with multimedia materials online prior to engaging in classroom activities. Pre-lecture questions on both the medical content covered in the videos and the physics concepts in the written material were designed to engage students and probe their understanding of physics. The course featured group discussion and peer-led instruction. Following in-class instruction, students engaged with homework assignments that explore the connections between physics and the medical field in a quantitative manner. Course surveys showed a positive response from the vast majority of students. Students largely indicated that the course helped them to make a connection between physics and the biomedical field. The biomedical focus and different course format were seen as an improvement over previous traditional physics instruction.
Bell's Inequality: Revolution in Quantum Physics or Just AN Inadequate Mathematical Model?
NASA Astrophysics Data System (ADS)
Khrennikov, Andrei
The main aim of this review is to stress the role of mathematical models in physics. The Bell inequality (BI) is often called the "most famous inequality of the 20th century." It is commonly accepted that its violation in corresponding experiments induced a revolution in quantum physics. Unlike "old quantum mechanics" (of Einstein, Schrödinger, Bohr, Heisenberg, Pauli, Landau, Fock), "modern quantum mechanics" (of Bell, Aspect, Zeilinger, Shimony, Greenberger, Gisin, Mermin) takes seriously so-called quantum non-locality. We will show that the conclusion that one has to give up realism (i.e., the possibility to assign results of measurements to physical systems) or locality (i.e., to assume action at a distance) is heavily based on one special mathematical model. This model was invented by A. N. Kolmogorov in 1933. One should pay serious attention to the role of mathematical models in physics. The problems of realism and locality induced by Bell's argument can be solved by using non-Kolmogorovian probabilistic models. We compare this situation with non-Euclidean geometric models in relativity theory.
Using Game Development to Engage Students in Science and Technology
NASA Technical Reports Server (NTRS)
Wiacek, John
2011-01-01
Game design workshops, camps and activities engage K-12 students in STEM disciplines through game engines and development tools. Game development has students create games and simulations that will inspire them to love technology while learning math, physics, and logic. By using tools such as GameMaker, Alice, Unity, GameSalad and others, students gain a sense of confidence and accomplishment by creating games and simulations.
Brownian motion and gambling: from ratchets to paradoxical games
NASA Astrophysics Data System (ADS)
Parrondo, J. M. R.; Dinís, Luis
2004-02-01
Two losing gambling games, when alternated in a periodic or random fashion, can produce a winning game. This paradox has been inspired by certain physical systems capable of rectifying fluctuations: the so-called Brownian ratchets. In this paper we review this paradox, from Brownian ratchets to the most recent studies on collective games, providing some intuitive explanations of the unexpected phenomena that we will find along the way.
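The paradox sketched above is easy to reproduce numerically. The sketch below uses the standard textbook parameter values for Parrondo's games (not values taken from this paper): game A is a slightly unfair coin, game B's odds depend on the current capital modulo 3, each game alone loses on average, yet a random alternation tends to win:

```python
import random

EPS = 0.005  # small bias making each game individually losing on average

def play_A(capital: int) -> int:
    """Game A: a slightly unfair coin flip."""
    return capital + (1 if random.random() < 0.5 - EPS else -1)

def play_B(capital: int) -> int:
    """Game B: win probability depends on whether capital is a multiple of 3."""
    p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return capital + (1 if random.random() < p else -1)

def simulate(choose, n_turns: int = 100_000) -> int:
    """Play n_turns rounds; choose() returns 'A' or 'B' for each turn."""
    capital = 0
    for _ in range(n_turns):
        capital = play_A(capital) if choose() == 'A' else play_B(capital)
    return capital
```

Comparing `simulate(lambda: 'A')`, `simulate(lambda: 'B')`, and the random mixture `simulate(lambda: random.choice('AB'))` over many turns typically shows the first two drifting downward while the mixture drifts upward, the ratchet-like rectification of fluctuations the abstract describes.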
Agent-Based Study of Surprise Attacks: Roles of Surveillance, Prompt Reaction and Intelligence
NASA Astrophysics Data System (ADS)
Shanahan, Linda; Sen, Surajit
Defending a confined territory from a surprise attack is seldom possible. We use molecular dynamics and statistical-physics-inspired agent-based simulations to explore the evolution and outcome of such attacks. The study suggests robust emergent behavior, emphasizes the importance of accurate surveillance, automated and powerful attack response, and building layout, and sheds light on the role of communication restrictions in defending such territories.
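The abstract does not specify the simulation rules, so the following is a purely hypothetical toy sketch of the surveillance effect, not the authors' model: attackers drift across a grid toward the far side while a central defender intercepts any that come within its detection radius. Interception rates rise sharply with surveillance range:

```python
import random

def interception_rate(detect_radius, trials=2000, grid=21, seed=7):
    """Fraction of attackers spotted by a central defender while crossing the grid."""
    rng = random.Random(seed)
    cx = cy = grid // 2  # defender posted at the center of the territory
    caught = 0
    for _ in range(trials):
        y = rng.randrange(grid)  # attacker enters at a random row on the left edge
        for x in range(grid):
            # Advance one column, drifting randomly up or down.
            y = min(grid - 1, max(0, y + rng.choice((-1, 0, 1))))
            if abs(x - cx) + abs(y - cy) <= detect_radius:
                caught += 1
                break
    return caught / trials

# Wider surveillance -> markedly higher interception.
print(interception_rate(1), interception_rate(6))
```

Even this crude model shows the qualitative point the study makes: without sufficient detection range, most surprise attackers cross unchallenged.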
"Governmentality" in the Origins of European Female PE and Sport: The Spanish Case Study (1883-1936)
ERIC Educational Resources Information Center
Garcia, Raul Sanchez; Herraiz, Antonio Rivero
2013-01-01
The purpose of the paper is twofold: (1) to contribute to the analysis of the origins of modern European female PE and sports from a power perspective, inspired by Foucault's work; and (2) to present a detailed analysis of female PE and sport in Spain (1883-1936) as a specific European case study. It is argued that these physical activities could…
Cutting the Stovepipes: An Improved Staff Model for the Modern Unified Commander
2001-04-01
quick to point out, their profit was the result of creating an environment that liberated people’s creativity, nurtured their commitment, and inspired...production networks, and telecommuting.” These businesses operate in an electronic world (vice physical) and are facilitated by an array of information...achieve the efficiencies and creativity businesses have gained in the virtual and reengineered environments, while at the same time retaining the
A soft body as a reservoir: case studies in a dynamic model of octopus-inspired soft robotic arm.
Nakajima, Kohei; Hauser, Helmut; Kang, Rongjie; Guglielmino, Emanuele; Caldwell, Darwin G; Pfeifer, Rolf
2013-01-01
The behaviors of animals and embodied agents are characterized by the dynamic coupling between the brain, the body, and the environment. This implies that control, which is conventionally thought to be handled by the brain or a controller, can partially be outsourced to the physical body and the interaction with the environment. This idea has been demonstrated in a number of recently constructed robots, in particular from the field of "soft robotics". Soft robots are made of soft materials that introduce high dimensionality, non-linearity, and elasticity, which often make the robots difficult to control. Biological systems such as the octopus master their complex bodies in highly sophisticated ways by capitalizing on their body dynamics. We will demonstrate that the structure of the octopus arm can be exploited not only for generating behavior but also, in a sense, as a computational resource. Using a soft robotic arm inspired by the octopus, we show in a number of experiments how control is partially incorporated into the physical arm's dynamics and how the arm's dynamics can be exploited to approximate non-linear dynamical systems and to embed non-linear limit cycles. Future application scenarios, as well as the implications of the results for octopus biology, are also discussed.
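Treating the body as a computational resource is an instance of reservoir computing. In the sketch below (none of which is from the paper), a fixed random recurrent network stands in for the physical arm's rich dynamics, and only a linear readout is trained, here by ridge regression, to approximate a non-linear function of the input history:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random "reservoir" standing in for the arm's body dynamics.
N = 200
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1
W_in = rng.normal(size=N)

# Drive the reservoir with a random input stream and record its states.
T = 2200
u = rng.uniform(0, 0.5, T)
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Non-linear target requiring memory: product of the two previous inputs.
y = np.zeros(T)
y[2:] = u[1:-1] * u[:-2]

# Train a linear readout by ridge regression (200-step washout, last 500 steps held out).
train, held_out = slice(200, 1700), slice(1700, None)
X = states[train]
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ y[train])

pred = states[held_out] @ w_out
mse = np.mean((pred - y[held_out]) ** 2)
print(mse < np.var(y[held_out]))  # readout beats the constant-mean predictor
```

The key design point matching the abstract: the "body" (here `W`) is never trained; all adaptation is pushed into the cheap linear readout, just as the robotic arm's dynamics are exploited rather than controlled in detail.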
Linear dimensions and volumes of human lungs
Hickman, David P.
2012-03-30
Total Lung Capacity is defined as “the inspiratory capacity plus the functional residual capacity; the volume of air contained in the lungs at the end of a maximal inspiration; also equals vital capacity plus residual volume” (from MediLexicon.com). Within the Results and Discussion section of their April 2012 Health Physics paper, Kramer et al. briefly noted that the lungs of their experimental subjects were “not fully inflated.” By definition, having failed to obtain maximal inspiration, Kramer et al. did not measure Total Lung Capacity (TLC). The TLC equation generated from this work will tend to underestimate TLC and does not improve or update the total lung capacity data provided by the ICRP and others. Likewise, the five linear measurements performed by Kramer et al. are only representative of the conditions of the measurement (i.e., not at-rest volume, but not fully inflated either). While significant work was performed and the data are interesting, the data do not represent a maximal, a minimal, or an at-rest situation. Moreover, the linear data generated by this study are limited by the conditions of the experiment and may not be fully comparable with other lung or inspiratory parameters, measures, or physical dimensions.
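The two defining identities quoted above can be checked against illustrative textbook values for a healthy adult (roughly 6 L total; the specific numbers below are typical reference values, not data from the paper under discussion):

```python
# Illustrative adult lung volumes, in liters (typical reference values).
IC = 3.6   # inspiratory capacity
FRC = 2.4  # functional residual capacity
VC = 4.8   # vital capacity
RV = 1.2   # residual volume

tlc_1 = IC + FRC  # TLC = inspiratory capacity + functional residual capacity
tlc_2 = VC + RV   # TLC = vital capacity + residual volume
print(tlc_1, tlc_2)  # both give 6.0 L
```

The comment's point applies directly: a subject who is "not fully inflated" contributes an IC (and hence VC) below maximum, so any TLC regression built on such data inherits a systematic underestimate.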
Locomotor Sub-functions for Control of Assistive Wearable Robots.
Sharbafi, Maziar A; Seyfarth, Andre; Zhao, Guoping
2017-01-01
A primary goal of comparative biomechanics is to understand the fundamental physics of locomotion within an evolutionary context. Such an understanding of legged locomotion enables a transition from copying nature to borrowing strategies for interacting with the physical world in the design and control of bio-inspired legged robots and robotic assistive devices. Inspired by nature, legged locomotion can be decomposed into three intrinsically interrelated locomotor sub-functions: Stance: redirecting the center of mass by exerting forces on the ground. Swing: cycling the legs between ground contacts. Balance: maintaining body posture. With these three sub-functions, one can understand, design, and control legged locomotory systems by formulating them as simpler, separate tasks. Coordinating the locomotor sub-functions in a harmonized manner then appears as an additional problem in legged locomotion. However, biological locomotion shows that appropriate design and control of each sub-function simplifies coordination: only a limited exchange of sensory information between the different locomotor sub-function controllers is required, enabling the envisioned modular architecture of the locomotion control system. In this paper, we present studies on implementing locomotor sub-function controllers on models, robots, and an exoskeleton, in addition to demonstrating their ability to explain humans' control strategies.
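The modular architecture described above, three controllers sharing only a minimal sensory state, can be sketched generically. The control laws and gains below are placeholder assumptions of mine, not the paper's controllers:

```python
from dataclasses import dataclass

@dataclass
class GaitState:
    """Minimal shared sensory information exchanged between sub-function controllers."""
    in_stance: bool
    leg_angle: float    # rad, measured from vertical
    trunk_pitch: float  # rad
    leg_force: float    # N, ground reaction along the leg

class StanceController:
    def torque(self, s: GaitState) -> float:
        # Redirect the center of mass: push against the ground only during stance.
        return 0.08 * s.leg_force if s.in_stance else 0.0

class SwingController:
    def torque(self, s: GaitState) -> float:
        # Cycle the leg: drive it toward a forward target angle during swing.
        return 0.0 if s.in_stance else 12.0 * (0.3 - s.leg_angle)

class BalanceController:
    def torque(self, s: GaitState) -> float:
        # Maintain posture: hip torque proportional to trunk pitch, active in both phases.
        return -25.0 * s.trunk_pitch

def total_hip_torque(s: GaitState) -> float:
    # Each controller reads only the shared state; their outputs simply sum.
    controllers = (StanceController(), SwingController(), BalanceController())
    return sum(c.torque(s) for c in controllers)

print(total_hip_torque(GaitState(False, 0.1, 0.05, 0.0)))
```

The design choice mirrors the claim in the abstract: because each sub-function controller depends only on the small shared `GaitState`, coordination reduces to summing their contributions rather than to a monolithic whole-body controller.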