Ultimate Realities: Deterministic and Evolutionary
Moxley, Roy A
2007-01-01
References to ultimate reality commonly turn up in the behavioral literature as references to determinism. However, this determinism is often difficult to interpret. There are different kinds of determinisms as well as different kinds of ultimate realities for a behaviorist to consider. To clarify some of the issues involved, the views of ultimate realities are treated as falling along a continuum, with extreme views of complete indeterminism and complete determinism at either end and various mixes in between. Doing so brings into play evolutionary realities and the movement from indeterminism to determinism, as in Peirce's evolutionary cosmology. In addition, this framework helps to show how the views of determinism by B. F. Skinner and other behaviorists have shifted over time. PMID:22478489
Structural Deterministic Safety Factors Selection Criteria and Verification
NASA Technical Reports Server (NTRS)
Verderaime, V.
1992-01-01
Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratios are rooted in resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard-deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index was derived from the combined safety factors, and the corresponding reliability showed that the deterministic method is not reliability-sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
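To make the combination of the three factors concrete, here is a minimal numerical sketch; all stress values, the standard-deviation multiplier, the basis knockdown, and the safety-index formula are illustrative textbook assumptions, not values or derivations from the report.

```python
import math

# Hypothetical inputs for illustration (not values from the report).
mu_applied, sigma_applied = 300.0, 20.0  # MPa: applied stress distribution
k_applied = 3.0                          # standard-deviation multiplier on applied stress
ultimate, yield_stress = 480.0, 400.0    # MPa: material allowables
k_basis = 0.9                            # assumed A-/B-basis knockdown on ultimate stress

design_applied = mu_applied + k_applied * sigma_applied  # factored applied stress
allowable = k_basis * ultimate                           # basis-reduced allowable
sf_conventional = ultimate / yield_stress                # ultimate-to-yield ratio

margin = allowable / (sf_conventional * design_applied) - 1.0
print(f"conventional SF = {sf_conventional:.2f}, margin of safety = {margin:+.3f}")

# A textbook safety index (assumed form, not necessarily the report's derivation):
# beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), resistive (R) vs applied (S).
mu_resist, sigma_resist = allowable, 15.0
beta = (mu_resist - mu_applied) / math.sqrt(sigma_resist**2 + sigma_applied**2)
print(f"safety index beta = {beta:.2f}")
```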
Palmer, Tim N.; O’Shea, Michael
2015-01-01
How is the brain configured for creativity? What is the computational substrate for ‘eureka’ moments of insight? Here we argue that creative thinking arises ultimately from a synergy between low-energy stochastic and energy-intensive deterministic processing, and is a by-product of a nervous system whose signal-processing capability per unit of available energy has become highly energy optimised. We suggest that the stochastic component has its origin in thermal (ultimately quantum decoherent) noise affecting the activity of neurons. Without this component, deterministic computational models of the brain are incomplete. PMID:26528173
Virtual Reality in Schools: The Ultimate Educational Technology.
ERIC Educational Resources Information Center
Reid, Robert D.; Sykes, Wylmarie
1999-01-01
Discusses the use of virtual reality as an educational tool. Highlights include examples of virtual reality in public schools that lead to a more active learning process, simulated environments, integrating virtual reality into any curriculum, benefits to teachers and students, and overcoming barriers to implementation. (LRW)
Confronting an Augmented Reality
ERIC Educational Resources Information Center
Munnerley, Danny; Bacon, Matt; Wilson, Anna; Steele, James; Hedberg, John; Fitzgerald, Robert
2012-01-01
How can educators make use of augmented reality technologies and practices to enhance learning and why would we want to embrace such technologies anyway? How can an augmented reality help a learner confront, interpret and ultimately comprehend reality itself ? In this article, we seek to initiate a discussion that focuses on these questions, and…
Non Kolmogorov Probability Models Outside Quantum Mechanics
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2009-03-01
This paper is devoted to the analysis of the main conceptual problems in the interpretation of QM: reality, locality, determinism, physical state, the Heisenberg principle, "deterministic" and "exact" theories, laws of chance, the notion of event, statistical invariants, adaptive realism, EPR correlations and, finally, the EPR-chameleon experiment.
Intervention-Based Stochastic Disease Eradication
NASA Astrophysics Data System (ADS)
Billings, Lora; Mier-Y-Teran-Romero, Luis; Lindley, Brandon; Schwartz, Ira
2013-03-01
Disease control is of paramount importance in public health with infectious disease extinction as the ultimate goal. Intervention controls, such as vaccination of susceptible individuals and/or treatment of infectives, are typically based on a deterministic schedule, such as periodically vaccinating susceptible children based on school calendars. In reality, however, such policies are administered as a random process, while still possessing a mean period. Here, we consider the effect of randomly distributed intervention as disease control on large finite populations. We show explicitly how intervention control, based on mean period and treatment fraction, modulates the average extinction times as a function of population size and the speed of infection. In particular, our results show an exponential improvement in extinction times even though the controls are implemented using a random Poisson distribution. Finally, we discover those parameter regimes where random treatment yields an exponential improvement in extinction times over the application of strictly periodic intervention. The implication of our results is discussed in light of the availability of limited resources for control. Supported by the National Institute of General Medical Sciences Award No. R01GM090204
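The setup is easy to reproduce in miniature. The sketch below, with hypothetical parameters and a deliberately small population, runs a Gillespie-style SIS epidemic in which a fraction of infectives is treated at pulses whose spacing is either strictly periodic or exponentially distributed with the same mean period, and compares mean extinction times; it illustrates the kind of comparison described above, not the authors' model.

```python
import random

def extinction_time(N=100, beta=1.2, gamma=1.0, frac=0.4, period=1.0,
                    poisson=True, tmax=1000.0):
    """Gillespie SIS epidemic; a fraction `frac` of infectives is treated at each
    intervention pulse, with pulse spacing either exponential (Poisson-timed)
    or strictly periodic. Returns the disease extinction time."""
    I, t = 10, 0.0
    next_pulse = random.expovariate(1.0 / period) if poisson else period
    while I > 0 and t < tmax:
        rate_inf = beta * I * (N - I) / N
        rate_rec = gamma * I
        dt = random.expovariate(rate_inf + rate_rec)
        if t + dt >= next_pulse:                 # intervention fires before next reaction
            t = next_pulse
            I -= sum(random.random() < frac for _ in range(I))
            next_pulse += random.expovariate(1.0 / period) if poisson else period
        else:
            t += dt
            I += 1 if random.random() < rate_inf / (rate_inf + rate_rec) else -1
    return t

random.seed(1)
for flag, label in ((True, "Poisson-timed"), (False, "periodic")):
    mean_t = sum(extinction_time(poisson=flag) for _ in range(200)) / 200
    print(f"{label:>13s} intervention: mean extinction time ~ {mean_t:.1f}")
```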
Epistemic Freedom and Education
ERIC Educational Resources Information Center
Hinchliffe, Geoffrey
2018-01-01
First of all, I define the concept of epistemic freedom in the light of the changing nature of educational practices that prioritise over-prescriptive conceptions of learning. I defend the 'reality' of this freedom against possible determinist-related criticisms. I do this by stressing the concept of agency as characterised by 'becoming'. I also…
The Non-Signalling theorem in generalizations of Bell's theorem
NASA Astrophysics Data System (ADS)
Walleczek, J.; Grössing, G.
2014-04-01
Does "epistemic non-signalling" ensure the peaceful coexistence of special relativity and quantum nonlocality? The possibility of an affirmative answer is of great importance to deterministic approaches to quantum mechanics given recent developments towards generalizations of Bell's theorem. By generalizations of Bell's theorem we here mean efforts that seek to demonstrate the impossibility of any deterministic theories to obey the predictions of Bell's theorem, including not only local hidden-variables theories (LHVTs) but, critically, of nonlocal hidden-variables theories (NHVTs) also, such as de Broglie-Bohm theory. Naturally, in light of the well-established experimental findings from quantum physics, whether or not a deterministic approach to quantum mechanics, including an emergent quantum mechanics, is logically possible, depends on compatibility with the predictions of Bell's theorem. With respect to deterministic NHVTs, recent attempts to generalize Bell's theorem have claimed the impossibility of any such approaches to quantum mechanics. The present work offers arguments showing why such efforts towards generalization may fall short of their stated goal. In particular, we challenge the validity of the use of the non-signalling theorem as a conclusive argument in favor of the existence of free randomness, and therefore reject the use of the non-signalling theorem as an argument against the logical possibility of deterministic approaches. We here offer two distinct counter-arguments in support of the possibility of deterministic NHVTs: one argument exposes the circularity of the reasoning which is employed in recent claims, and a second argument is based on the inconclusive metaphysical status of the non-signalling theorem itself. We proceed by presenting an entirely informal treatment of key physical and metaphysical assumptions, and of their interrelationship, in attempts seeking to generalize Bell's theorem on the basis of an ontic, foundational interpretation of the non-signalling theorem. We here argue that the non-signalling theorem must instead be viewed as an epistemic, operational theorem i.e. one that refers exclusively to what epistemic agents can, or rather cannot, do. That is, we emphasize that the non-signalling theorem is a theorem about the operational inability of epistemic agents to signal information. In other words, as a proper principle, the non-signalling theorem may only be employed as an epistemic, phenomenological, or operational principle. Critically, our argument emphasizes that the non-signalling principle must not be used as an ontic principle about physical reality as such, i.e. as a theorem about the nature of physical reality independently of epistemic agents e.g. human observers. One major reason in favor of our conclusion is that any definition of signalling or of non-signalling invariably requires a reference to epistemic agents, and what these agents can actually measure and report. Otherwise, the non-signalling theorem would equal a general "no-influence" theorem. In conclusion, under the assumption that the non-signalling theorem is epistemic (i.e. "epistemic non-signalling"), the search for deterministic approaches to quantum mechanics, including NHVTs and an emergent quantum mechanics, continues to be a viable research program towards disclosing the foundations of physical reality at its smallest dimensions.
Salgia, Ravi; Mambetsariev, Isa; Hewelt, Blake; Achuthan, Srisairam; Li, Haiqing; Poroyko, Valeriy; Wang, Yingyu; Sattler, Martin
2018-05-25
Mathematical cancer models are immensely powerful tools that are based in part on the fractal nature of biological structures, such as the geometry of the lung. Cancers of the lung provide an opportune model to develop and apply algorithms that capture changes and disease phenotypes. We reviewed mathematical models that have been developed for the biological sciences and applied them in the context of small cell lung cancer (SCLC) growth, mutational heterogeneity, and mechanisms of metastasis. The ultimate goal is to capture the stochastic and deterministic nature of this disease, to link this comprehensive set of tools back to its fractal geometry, and to provide a platform for accurate biomarker development. These techniques may be particularly useful in the context of drug development research, for example in combination with existing omics approaches. The integration of these tools will be important to further understand the biology of SCLC and ultimately develop novel therapeutics.
Probing the Limits of Reality: The Metaphysics in Science Fiction.
ERIC Educational Resources Information Center
Taylor, John L.
2003-01-01
Addresses metaphysical questions concerning the ultimate structure of reality and discusses whether such questions are of a scientific nature. Suggests that the world cannot afford to neglect the role of conceptual analysis in thinking critically about the possibilities that science fiction claims to describe. (Author/KHR)
Reach for the Stars: A Constellational Approach to Ethnographies of Elite Schools
ERIC Educational Resources Information Center
Prosser, Howard
2014-01-01
This paper offers a method for examining elite schools in a global setting by appropriating Theodor Adorno's constellational approach. I contend that arranging ideas and themes in a non-deterministic fashion can illuminate the social reality of elite schools. Drawing on my own fieldwork at an elite school in Argentina, I suggest that local and…
NASA Astrophysics Data System (ADS)
Han, Jiang; Chen, Ye-Hwa; Zhao, Xiaomin; Dong, Fangfang
2018-04-01
A novel fuzzy dynamical system approach to the control design of flexible joint manipulators with mismatched uncertainty is proposed. Uncertainties of the system are assumed to lie within prescribed fuzzy sets. The desired system performance includes a deterministic phase and a fuzzy phase. First, by creatively implanting a fictitious control, a robust control scheme is constructed to render the system uniformly bounded and uniformly ultimately bounded. Both the manipulator model and the control scheme are deterministic and are not based on heuristic IF-THEN rules. Next, a fuzzy-based performance index is proposed. An optimal design problem for a control design parameter is formulated as a constrained optimisation problem. The global solution to this problem can be obtained by solving two quartic equations. The fuzzy dynamical system approach is systematic and is able to assure the deterministic performance as well as to minimise the fuzzy performance index.
Regarding Reality: Some Consequences of Two Incapacities
Edelman, Shimon
2011-01-01
By what empirical means can a person determine whether he or she is presently awake or dreaming? Any conceivable test addressing this question, which is a special case of the classical metaphysical doubting of reality, must be statistical (for the same reason that empirical science is, as noted by Hume). Subjecting the experienced reality to any kind of statistical test (for instance, a test for bizarreness) requires, however, that a set of baseline measurements be available. In a dream, or in a simulation, any such baseline data would be vulnerable to tampering by the same processes that give rise to the experienced reality, making the outcome of a reality test impossible to trust. Moreover, standard cryptographic defenses against such tampering cannot be relied upon, because of the potentially unlimited reach of reality modification within a dream, which may range from the integrity of the verification keys to the declared outcome of the entire process. In the face of this double predicament, the rational course of action is to take reality at face value. The predicament also has some intriguing corollaries. In particular, even the most revealing insight that a person may gain into the ultimate nature of reality (for instance, by attaining enlightenment in the Buddhist sense) is ultimately unreliable, for the reasons just mentioned. At the same time, to adhere to this principle, one has to be aware of it, which may not be possible in various states of reduced or altered cognitive function such as dreaming or religious experience. Thus, a subjectively enlightened person may still lack the one truly important piece of the puzzle concerning his or her existence. PMID:21716920
On the usefulness of the concept of presence in virtual reality applications
NASA Astrophysics Data System (ADS)
Mestre, Daniel R.
2015-03-01
Virtual Reality (VR) leads to realistic experimental situations, while enabling researchers to have deterministic control over these situations and to precisely measure participants' behavior. However, because more realistic and complex situations can be implemented, important questions arise concerning the validity and representativeness of the observed behavior, with reference to a real situation. One example is the investigation of a critical (virtually dangerous) situation, in which the participant knows that no actual threat is present in the simulated situation, and might thus exhibit a behavioral response that is far from reality. This poses serious problems, for instance in training situations, in terms of transfer of learning to a real situation. Facing this difficult question, it seems necessary to study the relationships between three factors: immersion (physical realism), presence (psychological realism) and behavior. We propose a conceptual framework, in which presence is a necessary condition for the emergence of a behavior that is representative of what is observed in real conditions. Presence itself depends not only on physical immersive characteristics of the Virtual Reality setup, but also on contextual and psychological factors.
Virtual Reality and Augmented Reality in Plastic Surgery: A Review.
Kim, Youngjun; Kim, Hannah; Kim, Yong Oock
2017-05-01
Recently, virtual reality (VR) and augmented reality (AR) have received increasing attention, with the development of VR/AR devices such as head-mounted displays, haptic devices, and AR glasses. Medicine is considered to be one of the most effective applications of VR/AR. In this article, we describe a systematic literature review conducted to investigate the state-of-the-art VR/AR technology relevant to plastic surgery. The 35 studies that were ultimately selected were categorized into 3 representative topics: VR/AR-based preoperative planning, navigation, and training. In addition, future trends of VR/AR technology associated with plastic surgery and related fields are discussed.
2011-04-30
involvement before and after the training can have a significant impact on whether trainees use their newly developed skills” (Bassi & Russ-Eft, 1997). Other...motivation, cultural realities, learning self-efficacy, age, etc., that make a deterministic forecast more difficult (Bassi & Russ-Eft, 1997). Other...personnel can tap into freely. Give personnel easy access to key information sources of expertise. It
Modeling Reality - How Computers Mirror Life
NASA Astrophysics Data System (ADS)
Bialynicki-Birula, Iwo; Bialynicka-Birula, Iwona
2005-01-01
The book Modeling Reality covers a wide range of fascinating subjects, accessible to anyone who wants to learn about the use of computer modeling to solve a diverse range of problems, but who does not possess specialized training in mathematics or computer science. The material presented is pitched at the level of high-school graduates, even though it covers some advanced topics (cellular automata, Shannon's measure of information, deterministic chaos, fractals, game theory, neural networks, genetic algorithms, and Turing machines). These advanced topics are explained in terms of well-known simple concepts: cellular automata - Game of Life, Shannon's formula - Game of twenty questions, game theory - Television quiz, etc. The book is unique in explaining in a straightforward, yet complete, fashion many important ideas related to various models of reality and their applications. Twenty-five programs, written especially for this book, are provided on an accompanying CD. They greatly enhance its pedagogical value and make learning even the more complex topics enjoyable.
Epistemic companions: shared reality development in close relationships.
Rossignac-Milon, Maya; Higgins, E Tory
2018-01-11
We propose a framework outlining the development of shared reality in close relationships. In this framework, we attempt to integrate disparate close relationship phenomena under the conceptual umbrella of shared reality. We argue that jointly satisfying epistemic needs (making sense of the world together) plays an important but under-appreciated role in establishing and maintaining close relationships. Specifically, we propose that dyads progress through four cumulative phases in which new forms of shared reality emerge. Relationships are often initiated when people discover Shared Feelings, which then facilitate the co-construction of dyad-specific Shared Practices. Partners then form an interdependent web of Shared Coordination and ultimately develop a Shared Identity. Each emergent form of shared reality continues to evolve throughout subsequent phases, and, if neglected, can engender relationship dissolution. Copyright © 2018 Elsevier Ltd. All rights reserved.
The RoboCup Mixed Reality League - A Case Study
NASA Astrophysics Data System (ADS)
Gerndt, Reinhard; Bohnen, Matthias; da Silva Guerra, Rodrigo; Asada, Minoru
In typical mixed reality systems there is only a one-way interaction from real to virtual. A human user or the physics of a real object may influence the behavior of virtual objects, but real objects usually cannot be influenced by the virtual world. By introducing real robots into the mixed reality system, we allow a true two-way interaction between virtual and real worlds. Our system has been used since 2007 to implement the RoboCup mixed reality soccer games and other applications for research and edutainment. Our framework system is freely programmable to generate any virtual environment, which may then be further supplemented with virtual and real objects. The system allows for control of any real object based on differential drive robots. The robots may be adapted for different applications, e.g., with markers for identification or with covers to change shape and appearance. They may also be “equipped” with virtual tools. In this chapter we present the hardware and software architecture of our system and some applications. The authors believe this can be seen as a first implementation of Ivan Sutherland’s 1965 idea of the ultimate display: “The ultimate display would, of course, be a room within which the computer can control the existence of matter …” (Sutherland, 1965, Proceedings of IFIPS Congress 2:506-508).
Whiteheadian Actual Entities and String Theory
NASA Astrophysics Data System (ADS)
Bracken, Joseph A.
2012-06-01
In the philosophy of Alfred North Whitehead, the ultimate units of reality are actual entities, momentary self-constituting subjects of experience which are too small to be sensibly perceived. Their combination into "societies" with a "common element of form" produces the organisms and inanimate things of ordinary sense experience. According to the proponents of string theory, tiny vibrating strings are the ultimate constituents of physical reality which in harmonious combination yield perceptible entities at the macroscopic level of physical reality. Given that the number of Whiteheadian actual entities and of individual strings within string theory are beyond reckoning at any given moment, could they be two ways to describe the same non-verifiable foundational reality? For example, if one could establish that the "superject" or objective pattern of self-constitution of an actual entity vibrates at a specific frequency, its affinity with the individual strings of string theory would be striking. Likewise, if one were to claim that the size and complexity of Whiteheadian "societies" require different space-time parameters for the dynamic interrelationship of constituent actual entities, would that at least partially account for the assumption of 10 or even 26 instead of just 3 dimensions within string theory? The overall conclusion of this article is that, if a suitably revised understanding of Whiteheadian metaphysics were seen as compatible with the philosophical implications of string theory, their combination into a single world view would strengthen the plausibility of both schemes taken separately. Key words: actual entities, subject/superjects, vibrating strings, structured fields of activity, multi-dimensional physical reality.
The seventh servant: the implications of a truth drive in Bion's theory of 'O'.
Grotstein, James S
2004-10-01
Drawing upon Bion's published works on the subjects of truth, dreaming, alpha-function and transformations in 'O', the author independently postulates that there exists a 'truth instinctual drive' that subserves a truth principle, the latter of which is associated with the reality principle. Further, he suggests, following Bion's postulation, that 'alpha-function' and dreaming/phantasying constitute unconscious thinking processes and that they mediate the activity of this 'truth drive' (quest, pulsion), which the author hypothesizes constitutes another aspect of a larger entity that also includes the epistemophilic component drive. It purportedly seeks and transmits as well as includes what Bion (1965, pp. 147-9) calls 'O', the 'Absolute Truth, Ultimate Reality, O' (also associated with infinity, noumena or things-in-themselves, and 'godhead') (1970, p. 26). It is further hypothesized that the truth drive functions in collaboration with an 'unconscious consciousness' that is associated with the faculty of 'attention', which is also known as 'intuition'. It is responsive to internal psychical reality and constitutes Bion's 'seventh servant'. O, the ultimate landscape of psychoanalysis, has many dimensions, but the one that seems to interest Bion is that of the emotional experience of the analysand's and the analyst's 'evolving O' respectively (1970, p. 52) during the analytic session. The author thus hypothesizes that a sense of truth presents itself to the subject as a quest for truth which has the quality and force of an instinctual drive and constitutes the counterpart to the epistemophilic drive. This 'truth quest' or 'drive' is hypothesized to be the source of the generation of the emotional truth of one's ongoing experiences, both conscious and unconscious. It is proposed that emotions are beacons of truth in regard to the acceptance of reality. The concepts of an emotional truth drive and a truth principle would help us understand why analysands are able to accept analysts' interpretations that favor the operation of the reality principle over the pleasure principle--because of what is postulated as their overriding adaptive need for truth. Ultimately, it would seem that Bion's legacy of truth aims at integrating finite man with infinite man.
Virtually the ultimate research lab.
Kulik, Alexander
2018-04-26
Virtual reality (VR) can serve as a viable platform for psychological research. The real world, with its many uncontrolled variables, can be masked to immerse participants in complex interactive environments that are under full experimental control. However, as in any other laboratory setting, these simulations are not perceived in the same way as reality, and they also afford different behaviour. We need a better understanding of these differences, which are often related to parameters of the technical setup, to support valid interpretations of experimental results. © 2018 The British Psychological Society.
Probing the limits of reality: the metaphysics in science fiction
NASA Astrophysics Data System (ADS)
Taylor, John L.
2003-01-01
Science fiction provides a genre in which metaphysical questions concerning the ultimate structure of reality regularly arise. In addressing these questions, contemporary scientists tend to assume that the questions are of a scientific nature and should be handled solely by reference to our best theories. In this paper, it is argued that we cannot afford to neglect the role of conceptual analysis - a distinctively philosophical task - in thinking critically about the possibilities that science fiction claims to describe.
ERIC Educational Resources Information Center
Beeken, Paul
2014-01-01
Graphing is an essential skill that forms the foundation of any physical science. Understanding the relationships between measurements ultimately determines which modeling equations are successful in predicting observations. Over the years, science and math teachers have approached teaching this skill with a variety of techniques. For secondary…
Virtual reality gaming in the rehabilitation of the upper extremities post-stroke.
Yates, Michael; Kelemen, Arpad; Sik Lanyi, Cecilia
2016-01-01
Occurrences of strokes often result in unilateral upper limb dysfunction. Dysfunctions of this nature frequently persist and can present chronic limitations to activities of daily living. Research into applying virtual reality gaming systems to provide rehabilitation therapy has seen a resurgence. Themes explored in stroke rehab for paretic limbs are action observation and imitation, versatility, intensity and repetition, and preservation of gains. Fifteen articles were ultimately selected for review. The purpose of this literature review is to compare the various virtual reality gaming modalities in the current literature and ascertain their efficacy. The literature supports the use of virtual reality gaming rehab therapy as equivalent to traditional therapies or as successful augmentation to those therapies. While some degree of rigor was displayed in the literature, small sample sizes, variation in study lengths and therapy durations, and unequal controls reduce generalizability and comparability. Future studies should incorporate larger sample sizes and post-intervention follow-up measures.
Applying Virtual Reality to commercial Edutainment
NASA Technical Reports Server (NTRS)
Grissom, F.; Goza, Sharon P.; Goza, S. Michael
1994-01-01
Virtual reality (VR), when defined as a computer generated, immersive, three dimensional graphics environment which provides varying degrees of interactivity, remains an expensive, highly specialized application that has yet to find its way into the school, home, or business. As a novel approach to a theme park-type attraction, though, its use can be justified. This paper describes how a virtual reality 'tour of the human digestive system' was created for the Omniplex Science Museum of Oklahoma City, Oklahoma. The customer's main objectives were: (1) to educate; (2) to entertain; (3) to draw visitors; and (4) to generate revenue. The 'Edutainment' system ultimately delivered met these goals. As more such systems come into existence, the resulting library of licensable programs will greatly reduce development costs for individual institutions.
Schoeman, Rogier M; Kemna, Evelien W M; Wolbers, Floor; van den Berg, Albert
2014-02-01
In this article, we present a microfluidic device capable of successive high-yield single-cell encapsulation in droplets, with additional droplet pairing, fusion, and shrinkage. Deterministic single-cell encapsulation is realized using Dean-coupled inertial ordering of cells in a Yin-Yang-shaped curved microchannel using a double T-junction, with a frequency over 2000 Hz, followed by controlled droplet pairing with a 100% success rate. Subsequently, droplet fusion is realized using electrical actuation resulting in electro-coalescence of two droplets, each containing a single HL60 cell, with 95% efficiency. Finally, volume reduction of the fused droplet up to 75% is achieved by a triple pitchfork structure. This droplet volume reduction is necessary to obtain close cell-cell membrane contact necessary for final cell electrofusion, leading to hybridoma formation, which is the ultimate aim of this research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Unifying Complexity and Information
NASA Astrophysics Data System (ADS)
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a fundamental role like that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deep causes of the conflict and of this adaptability are not yet clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random processes, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves all major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics, and real-world living organisms.
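For reference, here is the Shannon entropy H that the abstract takes as its benchmark, in a minimal sketch; the block-entropy contrast between a random and a periodic binary string (my own illustrative example, not the paper's construction) shows why single-symbol H fails as a complexity measure and why data-generating models enter the picture.

```python
import math
import random
from collections import Counter

def shannon_entropy(symbols):
    """H = -sum_i p_i * log2(p_i) over empirical symbol frequencies."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def block_entropy(seq, k):
    """Entropy per symbol of overlapping length-k blocks."""
    blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    return shannon_entropy(blocks) / k

random.seed(0)
random_str = ''.join(random.choice('01') for _ in range(1000))
periodic_str = '01' * 500

# Both strings have maximal single-symbol entropy (~1 bit), but block statistics
# expose the deterministic structure of the periodic string.
for name, s in (("random", random_str), ("periodic", periodic_str)):
    print(f"{name:>8s}: H1 = {shannon_entropy(s):.3f}, "
          f"H4/4 = {block_entropy(s, 4):.3f} bits/symbol")
```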
The One Human Problem, Its Solution, and Its Relation to UFO Phenomena
1977-01-01
spirit, stages of evolution, technological species, telepathy, time, tobiscope, tulpa, ultraviolet, unidentified flying objects. ...manner undreamed of in his present wildest imagination. And indeed there are stranger things in ultimate reality than are dreamed of in our
When Depressive Cognitions Reflect Negative Realities.
ERIC Educational Resources Information Center
Krantz, Susan E.
The cognitive model of depression postulates that the depressed individual's cognitions are not only negative, but erroneous and impervious to information from the environment. However, the valence of that information ultimately determines whether those cognitions are impervious or merely receptive. The actual life circumstances of the depressed…
NASA Astrophysics Data System (ADS)
Pavese, Christian; Tibaldi, Carlo; Larsen, Torben J.; Kim, Taeseong; Thomsen, Kenneth
2016-09-01
The aim is to provide a fast and reliable approach to estimate ultimate blade loads for a multidisciplinary design optimization (MDO) framework. For blade design purposes, the standards require a large number of computationally expensive simulations, which cannot be efficiently run at each cost-function evaluation of an MDO process. This work describes a method that allows the calculation of the blade load envelopes to be integrated inside an MDO loop. Ultimate blade load envelopes are calculated for a baseline design and for a design obtained after an iteration of an MDO. These envelopes are computed for a full standard design load basis (DLB) and for a deterministic reduced DLB. Ultimate loads extracted from the two DLBs with each of the two blade designs are compared and analyzed. Although the reduced DLB supplies ultimate loads of different magnitude, the shapes of the estimated envelopes are similar to those computed using the full DLB. This observation is used to propose a scheme that is computationally cheap and that can be integrated inside an MDO framework, providing a sufficiently reliable estimation of the blade ultimate loading. The latter aspect is of key importance when design variables implementing passive control methodologies are included in the formulation of the optimization problem. An MDO of a 10 MW wind turbine blade is presented as an applied case study to show the efficacy of the reduced DLB concept.
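The envelope comparison lends itself to a compact illustration. The sketch below builds crude ultimate-load envelopes from synthetic bending-moment samples standing in for the full and reduced DLB simulations (all numbers, channel meanings, and the sectoring scheme are invented for illustration, not the paper's DLB), and shows the shape-preserving, magnitude-shifted relationship that such a scheme can exploit.

```python
import math
import random

def load_cases(n_seeds, scale):
    """Synthetic pairs of flapwise/edgewise blade-root bending moments,
    a stand-in for aeroelastic time series (not actual DLB output)."""
    points = []
    for _ in range(n_seeds):
        for _ in range(500):
            theta = random.uniform(0.0, 2.0 * math.pi)
            r = abs(random.gauss(5.0, 1.0)) * scale
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

def envelope(points, n_sectors=12):
    """Maximum load magnitude per direction sector: a crude load envelope."""
    env = [0.0] * n_sectors
    for x, y in points:
        i = int((math.atan2(y, x) % (2.0 * math.pi)) / (2.0 * math.pi) * n_sectors)
        i = min(i, n_sectors - 1)
        env[i] = max(env[i], math.hypot(x, y))
    return env

random.seed(0)
full = envelope(load_cases(n_seeds=12, scale=1.2))    # expensive "full DLB"
reduced = envelope(load_cases(n_seeds=2, scale=1.0))  # cheap deterministic subset

ratios = [f / r for f, r in zip(full, reduced)]
print("sector-wise full/reduced ratios:", ", ".join(f"{x:.2f}" for x in ratios))
print(f"mean ratio: {sum(ratios) / len(ratios):.2f} (similar shape, scaled magnitude)")
```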
Global Warming - Myth or Reality?, The Erring Ways of Climatology
NASA Astrophysics Data System (ADS)
Leroux, Marcel
In the global-warming debate, definitive answers to questions about ultimate causes and effects remain elusive. In Global Warming: Myth or Reality? Marcel Leroux seeks to separate fact from fiction in this critical debate from a climatological perspective. Beginning with a review of the dire hypotheses for climate trends, the author describes the history of the Intergovernmental Panel on Climate Change (IPCC), established in 1988, and many subsequent conferences. He discusses the main conclusions of the three IPCC reports and the predicted impact on global temperatures, rainfall, weather and climate, while highlighting the mounting confusion and sensationalism of reports in the media.
Quantum Locality, Rings a Bell?: Bell's Inequality Meets Local Reality and True Determinism
NASA Astrophysics Data System (ADS)
Sánchez-Kuntz, Natalia; Nahmad-Achar, Eduardo
2018-01-01
By assuming a deterministic evolution of quantum systems and taking realism into account, we carefully build a hidden variable theory for Quantum Mechanics (QM) based on the notion of ontological states proposed by 't Hooft (The cellular automaton interpretation of quantum mechanics, arXiv:1405.1548v3, 2015; Springer Open 185, https://doi.org/10.1007/978-3-319-41285-6, 2016). We view these ontological states as the ones embedded with realism and compare them to the (usual) quantum states that represent superpositions, viewing the latter as mere information of the system they describe. Such a deterministic model puts forward conditions for the applicability of Bell's inequality: the usual inequality cannot be applied to the usual experiments. We build a Bell-like inequality that can be applied to the EPR scenario and show that this inequality is always satisfied by QM. In this way we show that QM can indeed have a local interpretation, and thus meet with the causal structure imposed by the Theory of Special Relativity in a satisfying way.
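For orientation, the sketch below reproduces the standard CHSH arithmetic that any such hidden-variable discussion engages with: local deterministic assignments bound |S| by 2, while the quantum singlet correlation E(a,b) = -cos(a-b) reaches 2*sqrt(2). This is textbook material, not the ontological-states construction of the paper.

```python
import math

def E(a, b):
    """Singlet-state correlation for spin measurements along angles a and b (radians)."""
    return -math.cos(a - b)

# Standard CHSH measurement settings.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"quantum CHSH value |S| = {abs(S):.4f}")   # ~2.8284 = 2*sqrt(2)
print(f"local deterministic bound = 2, Tsirelson bound = {2 * math.sqrt(2):.4f}")
```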
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is a growing interest in the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires a technical expertise that severely limits the development of new contents of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible with AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR contents for bioinformatics and cheminformatics applications. Availability and implementation: http://www.realityconvert.com. Contact: dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Šendula-Jengić, Vesna; Šendula-Pavelić, Martina; Hodak, Jelena
2016-06-01
In terms of health and healthcare, cyberspace and virtual reality can be used in different ways and for different purposes, and consequently create different outcomes. The three main areas discussed here are: 1) cyberspace as a provider of health information and self-help resources, since the anonymity cyberspace provides is particularly important in the highly stigmatized field of psychiatry, where a large number of people never seek professional help, which in turn negatively affects not only the person in question but also the family and ultimately the society (work efficiency, disability-adjusted life years - DALY, etc.); 2) cyberspace and virtual reality (VR) as causes of psychopathology, ranging from violent behaviour to addictive behaviour and beyond; and 3) cyberspace and VR as providers of efficient professional therapy in the field of psychiatry.
Finally making sense of the double-slit experiment.
Aharonov, Yakir; Cohen, Eliahu; Colombo, Fabrizio; Landsberger, Tomer; Sabadini, Irene; Struppa, Daniele C; Tollaksen, Jeff
2017-06-20
Feynman stated that the double-slit experiment "…has in it the heart of quantum mechanics. In reality, it contains the only mystery" and that "nobody can give you a deeper explanation of this phenomenon than I have given; that is, a description of it" [Feynman R, Leighton R, Sands M (1965) The Feynman Lectures on Physics ]. We rise to the challenge with an alternative to the wave function-centered interpretations: instead of a quantum wave passing through both slits, we have a localized particle with nonlocal interactions with the other slit. Key to this explanation is dynamical nonlocality, which naturally appears in the Heisenberg picture as nonlocal equations of motion. This insight led us to develop an approach to quantum mechanics which relies on pre- and postselection, weak measurements, deterministic, and modular variables. We consider those properties of a single particle that are deterministic to be primal. The Heisenberg picture allows us to specify the most complete enumeration of such deterministic properties in contrast to the Schrödinger wave function, which remains an ensemble property. We exercise this approach by analyzing a version of the double-slit experiment augmented with postselection, showing that only it and not the wave function approach can be accommodated within a time-symmetric interpretation, where interference appears even when the particle is localized. Although the Heisenberg and Schrödinger pictures are equivalent formulations, nevertheless, the framework presented here has led to insights, intuitions, and experiments that were missed from the old perspective.
Mixed reality ultrasound guidance system: a case study in system development and a cautionary tale.
Ameri, Golafsoun; Baxter, John S H; Bainbridge, Daniel; Peters, Terry M; Chen, Elvis C S
2018-04-01
Real-time ultrasound has become a crucial aspect of several image-guided interventions. One of the main constraints of such an approach is the difficulty of interpreting the limited field of view of the image, a problem that has recently been addressed using mixed reality, such as augmented reality and augmented virtuality. The growing popularity and maturity of mixed reality have led to a series of informal guidelines to direct development of new systems and to facilitate regulatory approval. However, the goals of mixed reality image guidance systems and the guidelines for their development have not been thoroughly discussed. The purpose of this paper is to identify and critically examine development guidelines in the context of a mixed reality ultrasound guidance system through a case study. A mixed reality ultrasound guidance system tailored to central line insertions was developed in close collaboration with an expert user. This system outperformed ultrasound-only guidance in a novice user study and has obtained clearance for clinical use in humans. A phantom study with 25 experienced physicians was carried out to compare the performance of the mixed reality ultrasound system against conventional ultrasound-only guidance. Despite the previous promising results, there was no statistically significant difference between the two systems. Guidelines for developing mixed reality image guidance systems cannot be applied indiscriminately. Each design decision, no matter how well justified, should be the subject of scientific and technical investigation. Iterative and small-scale evaluation can readily unearth issues and previously unknown or implicit system requirements. We recommend a wary eye in the development of mixed reality ultrasound image guidance systems, emphasizing small-scale iterative evaluation alongside system development. Ultimately, we recommend that the image-guided intervention community furthers and deepens this discussion into best practices in developing image-guided interventions.
NASA Astrophysics Data System (ADS)
Rodríguez, Clara Rojas; Fernández Calvo, Gabriel; Ramis-Conde, Ignacio; Belmonte-Beitia, Juan
2017-08-01
Tumor-normal cell interplay defines the course of a neoplastic malignancy. The outcome of this dual relation is the ultimate prevailing of one of the cell types and the death or retreat of the other. In this paper we study the mathematical principles that underlie one important scenario: that of slow-progressing cancers. For this, we develop, within a stochastic framework, a mathematical model to account for tumor-normal cell interaction in such a clinically relevant situation and derive a number of deterministic approximations from the stochastic model. We consider in detail the existence and uniqueness of the solutions of the deterministic model and analyze their stability. We then focus our model on the specific case of low grade gliomas, where we introduce an optimal control problem for different objective functionals under the administration of chemotherapy. We derive the conditions under which singular and bang-bang controls exist and calculate the optimal control and states.
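As a toy illustration of the kind of scheduling question posed above, the following deterministic sketch integrates a logistic tumour-growth equation under two dose profiles with the same total dose budget, one continuous and one bang-bang; every parameter is hypothetical, and the authors' actual model is stochastic and far richer.

```python
def simulate(dose, T0=0.1, r=0.05, K=1.0, k=0.5, tmax=100.0, dt=0.01):
    """Forward-Euler integration of dT/dt = r*T*(1 - T/K) - k*u(t)*T."""
    T, t = T0, 0.0
    while t < tmax:
        T += dt * (r * T * (1.0 - T / K) - k * dose(t) * T)
        t += dt
    return T

constant = lambda t: 0.1                                 # low continuous dose
bang_bang = lambda t: 1.0 if (t % 10.0) < 1.0 else 0.0   # full dose 10% of the time

# Both schedules deliver the same total dose (10 units over tmax = 100).
print(f"final tumour burden, constant dose : {simulate(constant):.4f}")
print(f"final tumour burden, bang-bang dose: {simulate(bang_bang):.4f}")
```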
Understanding resistant effect of mosquito on fumigation strategy in dengue control program
NASA Astrophysics Data System (ADS)
Aldila, D.; Situngkir, N.; Nareswari, K.
2018-01-01
A mathematical model of dengue disease transmission will be introduced in this talk, with a fumigation intervention acting on the mosquito population. The worsening effect of uncontrolled fumigation, in the form of mosquito resistance to the fumigation chemicals, will also be included in the model to capture the reality in the field. A deterministic approach based on a nine-dimensional system of ordinary differential equations will be used. Analytical results on the existence and local stability of the equilibrium points, together with the basic reproduction number, will be discussed. Numerical results will be presented for several scenarios to give a better interpretation of the analytical results.
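The resistance mechanism can be caricatured in two compartments. The sketch below (hypothetical rates, far simpler than the talk's nine-dimensional host-vector system) tracks fumigation-sensitive and resistant mosquitoes: fumigation at rate u removes only the sensitive class, so sustained spraying selects for resistance and the population rebounds resistant-dominated.

```python
def simulate(u, q=0.02, r=0.5, K=1000.0, mu=0.1, tmax=200.0, dt=0.01):
    """Forward-Euler integration of a two-class vector model under fumigation rate u.
    A fraction q of births joins the resistant class; fumigation kills only Vs."""
    Vs, Vr, t = 990.0, 10.0, 0.0
    while t < tmax:
        V = Vs + Vr
        births = r * V * (1.0 - V / K)                      # logistic recruitment
        dVs = (1.0 - q) * births * Vs / V - mu * Vs - u * Vs
        dVr = births * (Vr + q * Vs) / V - mu * Vr
        Vs += dt * dVs
        Vr += dt * dVr
        t += dt
    return Vs, Vr

for u in (0.0, 0.3):
    Vs, Vr = simulate(u)
    print(f"fumigation u = {u:.1f}: sensitive = {Vs:8.1f}, resistant = {Vr:8.1f}")
```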
Web 2.0 for the Online Graduate Student: Technology Immersion for Both Curriculum and Residency
ERIC Educational Resources Information Center
Hewitt, Anne M.; Spencer, Susan S.
2012-01-01
Technology Integration has emerged as the ultimate critical educational challenge for the twenty-first century. Although many universities tout technology immersion in strategic plans, reality suggests that faculty often serve as the key change agents. As online programs increase exponentially, technology best practices become essential for fully…
Radical Constructivism, and the Sin of Relativism
ERIC Educational Resources Information Center
Quale, Andreas
2007-01-01
The epistemology of "relativism" that is featured by the theory of radical constructivism is addressed. In particular, I examine several objections, all based on this epistemic position of relativism, that are often raised by critics of the theory: the charge of "reality denial" (which, it is often claimed, must lead ultimately to the…
The Common Core "State" Standards: The Arts and Education Reform
ERIC Educational Resources Information Center
Wexler, Alice
2014-01-01
In this commentary, Alice Wexler notes that as the Common Core State Standards (CCSS) become reality, teachers have reason for concern. She contends that this reform to public education has consequently marginalized the arts and exacerbated the inequities of people in poverty and those with disabilities. Teachers, principals and, ultimately,…
ERIC Educational Resources Information Center
Vaughn, Margaret
2014-01-01
Theory suggests that effective teachers should possess a vision for teaching, but the reality of teaching within the current educational policy climate raises questions about teachers' autonomy over their instructional decisions and ultimately their personal convictions for teaching. This exploratory study examines 11 teachers' visions of…
Reality, Contextuality, and Probability in Quantum Theory and Beyond
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
This chapter explores the relationships among reality, contextuality, and probability, especially in quantum theory and, briefly and by extension, in other fields where these concepts, in their quantum-like versions, may play key roles. The chapter contends, following Derrida's argument, that while no meaning or event could be determined apart from its context, no context ultimately permits saturation, that is, could ever be determined with certainty. Any such determination is ultimately provisional. However, because of its mathematical-experimental character, physics allows one, in classical physics and relativity, to disregard the role of the context of observation in describing the physical systems considered, and in quantum mechanics, where the context of observation cannot be so disregarded, to determine such a context sufficiently. While, however, classical physics or relativity and quantum mechanics can do so sufficiently for their disciplinary functioning and practice, they cannot do so entirely. Moreover, a given concept of this functioning, especially as concerns what is considered its proper functioning, still depends on a broader contextual field that defies saturation or guaranteed determination.
Death: the ultimate social construction of reality.
Brabant, Sarah
Using Berger and Luckmann's thesis (1966) on the social construction of reality as rationale, this research analyzes the death drawings of 946 university students enrolled in a Death and Dying course between 1985 and 2004 to investigate the basic constructs elicited by the word "death": dying, moment of death, after death, after life, and bereavement. Consistent with earlier research, gender, race, religion, and religiosity proved to be significant factors. As expected, personal experience with grief was strongly correlated with drawings focused on bereavement. In contrast to earlier studies, fear of death was not significantly related to a particular construct. Implications for research, education, and counseling are discussed.
Camp, Christopher L
2018-05-01
Although we have come a long way, the rapidly expanding field of virtual reality simulation for arthroscopic surgical skills acquisition is supported by only a limited amount of evidence. That said, the good news is that the evidence suggests that simulator experience translates into improved performance in the operating room. If proving this relation is our ultimate goal, more work is certainly needed. In this commentary, a "Task List" is proposed for surgeons and educators interested in using simulators and better defining their role in resident education. Copyright © 2018 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
Having a Daughter with a Disability: Is It Different for Girls?
ERIC Educational Resources Information Center
Horne, Richard, Ed.
1990-01-01
This guide focuses on some of the realities parents must face in helping their daughters with disabilities to become more self-reliant and, ultimately, independent. The degree to which daughters with a disability are encouraged to strive for an independent life may be critically less than for sons. These differences have far-reaching implications…
The "Research-Teaching Nexus" and the Learning-Teaching Relationship: Who's in Charge?
ERIC Educational Resources Information Center
Rowe, Christopher; Okell, Eleanor
2009-01-01
This article engages, from the point of view of the higher education (HE) department and practitioner, with the realities, and explores the rhetoric, of the "research-teaching nexus" with reference to the role of research and research skills, in the context of the student experience in higher education. The ultimate questions are: How…
Developing the E-Scape Software System
ERIC Educational Resources Information Center
Derrick, Karim
2012-01-01
Most innovations have contextual pre-cursors that prompt new ways of thinking and in their turn help to give form to the new reality. This was the case with the e-scape software development process. The origins of the system existed in software components and ideas that we had developed through previous projects, but the ultimate direction we took…
Rock and Roll! Using Classic Rock as a Guide to Fantasy-Theme Analysis
ERIC Educational Resources Information Center
Waite, Lisa
2008-01-01
This article presents an activity which makes use of Don McLean's song "American Pie," to engage students in fantasy-theme analysis. This discussion ultimately demonstrates how reality is constructed to satisfy the views shared by groups and individuals. Fantasy-theme analysis argues that audiences frequently shape their own connotation of an…
Entrepreneurs, chance, and the deterministic concentration of wealth.
Fargione, Joseph E; Lehman, Clarence; Polasky, Stephen
2011-01-01
In many economies, wealth is strikingly concentrated. Entrepreneurs--individuals with ownership in for-profit enterprises--comprise a large portion of the wealthiest individuals, and their behavior may help explain patterns in the national distribution of wealth. Entrepreneurs are less diversified and more heavily invested in their own companies than is commonly assumed in economic models. We present an intentionally simplified individual-based model of wealth generation among entrepreneurs to assess the role of chance and determinism in the distribution of wealth. We demonstrate that chance alone, combined with the deterministic effects of compounding returns, can lead to unlimited concentration of wealth, such that the percentage of all wealth owned by a few entrepreneurs eventually approaches 100%. Specifically, concentration of wealth results when the rate of return on investment varies by entrepreneur and by time. This result is robust to inclusion of realities such as differing skill among entrepreneurs. The most likely overall growth rate of the economy decreases as businesses become less diverse, suggesting that high concentrations of wealth may adversely affect a country's economic growth. We show that a tax on large inherited fortunes, applied to a small portion of the most fortunate in the population, can efficiently arrest the concentration of wealth at intermediate levels.
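The core mechanism is simple enough to replicate in a few lines. The sketch below (my own illustrative parameters, not the paper's calibration) compounds each entrepreneur's wealth with a return that varies by individual and by year; chance plus compounding alone drives the top share upward as the horizon grows.

```python
import random

random.seed(42)
n, years = 1000, 100
wealth = [1.0] * n                      # everyone starts with equal wealth

for _ in range(years):
    # Each entrepreneur draws an independent lognormal return each year.
    wealth = [w * random.lognormvariate(0.0, 0.3) for w in wealth]

wealth.sort(reverse=True)
total = sum(wealth)
for top in (1, 10):
    share = sum(wealth[: n * top // 100]) / total
    print(f"share held by top {top:2d}%: {share:.1%}")
```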
Virtual reality based surgery simulation for endoscopic gynaecology.
Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G
1999-01-01
Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches to the current limits of realism and ultimately seek to extend these based on our description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR-training in gynaecologic laparoscopy.
Castiel, L D
1999-01-01
The author analyzes the underlying theoretical aspects in the construction of the molecular watershed of epidemiology and the concept of genetic risk, focusing on issues raised by contemporary reality: new technologies, globalization, proliferation of communications strategies, and the dilution of identity matrices. He discusses problems pertaining to the establishment of such new interdisciplinary fields as molecular epidemiology and molecular genetics. Finally, he analyzes the repercussions of the social communication of genetic content, especially as related to predictive genetic tests and cloning of animals, based on triumphal, deterministic metaphors sustaining beliefs relating to the existence and supremacy of concepts such as 'purity', 'essence', and 'unification' of rational, integrated 'I's/egos'.
Ambient Intelligence in Multimeda and Virtual Reality Environments for the rehabilitation
NASA Astrophysics Data System (ADS)
Benko, Attila; Cecilia, Sik Lanyi
This chapter presents a general overview of the use of multimedia and virtual reality in rehabilitation and in assistive and preventive healthcare. It deals with AI-based multimedia and virtual reality applications intended for use by medical doctors, nurses, special-education teachers and other interested persons, and describes how multimedia and virtual reality can assist their work, including the ways they can help patients' everyday life and rehabilitation. In the second part of the chapter we present the Virtual Therapy Room (VTR), a realized application for aphasic patients that was created for practicing communication and expressing emotions in a group therapy setting. The VTR shows a room that contains a virtual therapist and four virtual patients (avatars). The avatars use their knowledge base to answer the questions of the user, providing an AI environment for rehabilitation. The user of the VTR is the aphasic patient, who has to solve the exercises. The picture relevant to the current task appears on the virtual blackboard. The patient answers questions posed by the virtual therapist; the questions concern pictures describing an activity or an object at different levels of difficulty. The patient can ask an avatar for the answer. If the avatar knows the answer, its emotion changes to happy instead of sad. The avatar expresses its emotions in several dimensions: its behavior, facial expression, voice tone and response all change. The emotion system can be described as a deterministic finite automaton whose states are emotion-states and whose transition function is derived from the input-response reaction of an avatar. Natural language processing techniques were also implemented to establish high-quality human-computer interface windows for each of the avatars. Aphasic patients are able to interact with the avatars via these interfaces. At the end of the chapter we outline possible future research directions.
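As a sketch of the emotion automaton described above (the state names and transition table here are illustrative stand-ins, not the VTR's actual design):

```python
# Deterministic finite automaton for an avatar's emotion: states are emotions and
# the transition function is driven by the outcome of the question put to the avatar.
TRANSITIONS = {
    ("neutral", "knows_answer"): "happy",
    ("neutral", "lacks_answer"): "sad",
    ("sad", "knows_answer"): "happy",
    ("happy", "lacks_answer"): "sad",
}

def next_emotion(state, event):
    """Apply one deterministic transition; undefined pairs leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "neutral"
for event in ("knows_answer", "lacks_answer", "knows_answer"):
    state = next_emotion(state, event)
    print(f"{event} -> {state}")
```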
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lizcano, D., E-mail: david.lizcano@udima.es, E-mail: mariaaurora.martinez@udima.es; Martínez, A. María, E-mail: david.lizcano@udima.es, E-mail: mariaaurora.martinez@udima.es
Edward Fredkin was an enthusiastic advocate of information-based theoretical physics, who, in the early 1980s, proposed a new theory of physics based on the idea that the universe is ultimately composed of software. According to Fredkin, reality should be considered as being composed not of particles, matter and forces or energy but of bits of data or information modified according to computational rules. Fredkin went on to demonstrate that, while energy is necessary for storing and retrieving information, the energy needed to carry out any particular instance of information processing can be arbitrarily reduced, with no lower bound. This implies that it is information, rather than matter or energy, that should be considered as the ultimate fundamental constituent of reality. This possibility had already been suggested by other scientists. Norbert Wiener heralded a fundamental shift from energy to information and suggested that the universe was founded essentially on the transformation of information, not energy. However, Konrad Zuse was the first, back in 1967, to defend the idea that a digital computer is computing the universe. Richard P. Feynman showed this possibility in a similar light in his reflections on how information relates to matter and energy. Other pioneering research on the theory of digital physics was published by Kantor in 1977 and more recently by Stephen Wolfram in 2002, who thereby joined the host of voices upholding that it is patterns of information, not matter and energy, that constitute the cornerstones of reality. In this paper, we introduce the use of knowledge management tools for the purpose of analysing this topic.
Deterministic reshaping of single-photon spectra using cross-phase modulation.
Matsuda, Nobuyuki
2016-03-01
The frequency conversion of light has proved to be a crucial technology for communication, spectroscopy, imaging, and signal processing. In the quantum regime, it also offers great potential for realizing quantum networks incorporating disparate physical systems and quantum-enhanced information processing over a large computational space. The frequency conversion of quantum light, such as single photons, has been extensively investigated for the last two decades using all-optical frequency mixing, with the ultimate goal of realizing lossless and noiseless conversion. I demonstrate another route to this target using frequency conversion induced by cross-phase modulation in a dispersion-managed photonic crystal fiber. Owing to the deterministic and all-optical nature of the process, the lossless and low-noise spectral reshaping of a single-photon wave packet in the telecommunication band has been readily achieved with a modulation bandwidth as large as 0.4 THz. I further demonstrate that the scheme is applicable to manipulations of a nonclassical frequency correlation, wave packet interference, and entanglement between two photons. This approach presents a new coherent frequency interface for photons for quantum information processing.
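The mechanism can be summarized in one textbook relation, quoted here for orientation (with convention-dependent factors) rather than reproduced from the abstract: for a single photon co-propagating with a strong pump of power profile P(t) through a fiber of nonlinear coefficient γ and length L, cross-phase modulation imprints the phase φ(t) ≈ 2γL P(t), and the instantaneous frequency shift is

\[
\Delta\omega(t) = -\frac{\partial \phi}{\partial t} \approx -2\gamma L\,\frac{dP}{dt},
\]

so the sign and magnitude of the spectral shift are set by the slope of the pump pulse at the photon's temporal position.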
Deterministic reshaping of single-photon spectra using cross-phase modulation
Matsuda, Nobuyuki
2016-01-01
The frequency conversion of light has proved to be a crucial technology for communication, spectroscopy, imaging, and signal processing. In the quantum regime, it also offers great potential for realizing quantum networks incorporating disparate physical systems and quantum-enhanced information processing over a large computational space. The frequency conversion of quantum light, such as single photons, has been extensively investigated for the last two decades using all-optical frequency mixing, with the ultimate goal of realizing lossless and noiseless conversion. I demonstrate another route to this target using frequency conversion induced by cross-phase modulation in a dispersion-managed photonic crystal fiber. Owing to the deterministic and all-optical nature of the process, the lossless and low-noise spectral reshaping of a single-photon wave packet in the telecommunication band has been readily achieved with a modulation bandwidth as large as 0.4 THz. I further demonstrate that the scheme is applicable to manipulations of a nonclassical frequency correlation, wave packet interference, and entanglement between two photons. This approach presents a new coherent frequency interface for photons for quantum information processing. PMID:27051862
Health safety nets can break cycles of poverty and disease: a stochastic ecological model.
Plucinski, Mateusz M; Ngonghala, Calistus N; Bonds, Matthew H
2011-12-07
The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a 'safety net', defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium.
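The qualitative contrast between the deterministic and stochastic pictures can be reproduced with a one-dimensional toy model. Everything below (the double-well drift, noise level, and safety-net value) is an assumption for illustration; it is not the authors' coupled disease-economy system.

import numpy as np

rng = np.random.default_rng(0)

def simulate(x0, safety_net=None, sigma=0.05, dt=0.01, steps=100_000):
    """Euler-Maruyama integration of dx = f(x) dt + sigma dW with the bistable
    drift f(x) = -(x - 0.2)(x - 0.5)(x - 0.9): stable states at 0.2 (poverty
    trap) and 0.9 (development), with an unstable threshold at 0.5."""
    x = x0
    for _ in range(steps):
        drift = -(x - 0.2) * (x - 0.5) * (x - 0.9)
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        if safety_net is not None:
            x = max(x, safety_net)  # externally enforced minimum conditions
    return x

print(simulate(0.25))                   # noise alone: escape is possible, not certain
print(simulate(0.25, safety_net=0.35))  # a floor below the threshold still makes
                                        # eventual escape certain in the long run

In the deterministic limit (sigma = 0) the same initial condition converges to the trap and stays there, which is exactly the contrast the abstract draws.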
See-through 3D technology for augmented reality
NASA Astrophysics Data System (ADS)
Lee, Byoungho; Lee, Seungjae; Li, Gang; Jang, Changwon; Hong, Jong-Young
2017-06-01
Augmented reality has recently been attracting a great deal of attention as one of the most spotlighted next-generation technologies. In order to move toward the realization of ideal augmented reality, we need to integrate 3D virtual information into the real world. This integration should go unnoticed by users, blurring the boundary between the virtual and real worlds. Thus, the ultimate device for augmented reality would reconstruct and superimpose 3D virtual information on the real world so that the two are not distinguishable, which is referred to as see-through 3D technology. Here, we introduce our previous research combining see-through displays and 3D technologies using emerging optical combiners: holographic optical elements and index-matched optical elements. Holographic optical elements are volume gratings that have angular and wavelength selectivity. Index-matched optical elements are partially reflective elements using a compensation element for index matching. Using these optical combiners, we could implement see-through 3D displays based on typical methodologies including integral imaging, digital holographic displays, multi-layer displays, and retinal projection. Some of these methods are expected to be optimized and customized for head-mounted or wearable displays. We conclude with demonstration and analysis of fundamental research on head-mounted see-through 3D displays.
2011-01-01
Background: Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology generalizable to interactive stroke rehabilitation, is presently unavailable. Results: This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. The resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach; Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions: The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They can furthermore be applied to rehabilitation needs beyond stroke. PMID:21875441
Dunnington, Renee M
2014-01-01
Simulation technology is increasingly being used in nursing education. Previously used primarily for teaching procedural, instrumental, or critical incident types of skills, simulation is now being applied to training related to more dynamic, complex, and interpersonal human contexts. While high fidelity human patient simulators have significantly increased in authenticity, human responses have greater complexity and are qualitatively different than current technology represents. This paper examines the texture of representation by simulation. Through a tracing of historical and contemporary philosophical perspectives on simulation, the nature and limits of the reality of human health responses represented by high fidelity human patient simulation (HF-HPS) are explored. Issues concerning nursing education are raised around the nature of reality represented in HF-HPS. Drawing on Waks, a framework for guiding pedagogical considerations around simulation in nursing education is presented for the ultimate purpose of promoting an educative experience with simulation. © 2013 John Wiley & Sons Ltd.
A story: nursing-damaged lives.
Fenwick, J
1999-12-01
This paper presents a story that captures forever a 'difficult', 'horrible' but in many respects totally 'normal' nursing moment. It is a short story of only one person's reality. On that fateful night in which many lives were changed forever, there were, of course, many realities, all of which hold their own truth. This tale is offered in the spirit of sharing and in the hope that others may find it useful. I believe that 'story telling' allows us to revisit and review our practice. In doing so, stories facilitate the discovery of nursing knowledge and the self. Ultimately this contributes to the development of expert practice. Nursing stories, then, become an excellent medium for nursing inquiry, from both an academic and a clinical perspective.
Liu, Xiujuan; Tao, Haiquan; Xiao, Xigang; Guo, Binbin; Xu, Shangcai; Sun, Na; Li, Maotong; Xie, Li; Wu, Changjun
2018-07-01
This study aimed to compare the diagnostic performance of the stereoscopic virtual reality display system with the conventional computed tomography (CT) workstation and three-dimensional rotational angiography (3DRA) for intracranial aneurysm detection and characterization, with a focus on small aneurysms and those near the bone. First, 42 patients with suspected intracranial aneurysms underwent both 256-row CT angiography (CTA) and 3DRA. Volume rendering (VR) images were captured using the conventional CT workstation. Next, VR images were transferred to the stereoscopic virtual reality display system. Two radiologists independently assessed the results that were obtained using the conventional CT workstation and stereoscopic virtual reality display system. The 3DRA results were considered the ultimate reference standard. Based on 3DRA images, 38 aneurysms were confirmed in 42 patients. Two cases were misdiagnosed and 1 was missed when the traditional CT workstation was used. The sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy of the conventional CT workstation were 94.7%, 85.7%, 97.3%, 75%, and 99.3%, respectively, on a per-aneurysm basis. The stereoscopic virtual reality display system missed a case. The sensitivity, specificity, PPV, NPV, and accuracy of the stereoscopic virtual reality display system were 100%, 85.7%, 97.4%, 100%, and 97.8%, respectively. No difference was observed in the accuracy of the traditional CT workstation, stereoscopic virtual reality display system, and 3DRA in detecting aneurysms. The stereoscopic virtual reality display system has some advantages in detecting small aneurysms and those near the bone. The virtual reality stereoscopic vision obtained through the system was found to be a useful tool in intracranial aneurysm diagnosis and pre-operative 3D imaging. Copyright © 2018 Elsevier B.V. All rights reserved.
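For reference, the measures reported above follow the standard definitions in terms of true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN); these are textbook formulas, not taken from the paper:

\[
\mathrm{Sensitivity}=\frac{TP}{TP+FN},\quad
\mathrm{Specificity}=\frac{TN}{TN+FP},\quad
\mathrm{PPV}=\frac{TP}{TP+FP},\quad
\mathrm{NPV}=\frac{TN}{TN+FN},\quad
\mathrm{Accuracy}=\frac{TP+TN}{TP+TN+FP+FN}.
\]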
Augmented Reality at the Tactical and Operational Levels of War
2015-10-24
benefits and challenges their personnel will experience once AR systems are fully adopted. This paper will explain these benefits and challenges as...develop, procure, and integrate systems it believes will benefit its tactical combat units and operational leaders. Ultimately, as the capabilities of...friendly forces, can also help to prevent collateral damage and civilian casualties. Beyond the immediate life-and-death benefits at the tactical
What Lies between the Religious and the Secular?: Education beyond the Human
ERIC Educational Resources Information Center
Seo, Yong-Seok
2014-01-01
The current age is characterised by many as secular, and a source of such a characterisation can be found in the Nietzschean claim that thoughts about there being some ultimate reality have to be jettisoned, and human existence and the world need to be embraced as they are. That claim is renewed by some secular thinkers who insist that education…
Entrepreneurs, Chance, and the Deterministic Concentration of Wealth
Fargione, Joseph E.; Lehman, Clarence; Polasky, Stephen
2011-01-01
In many economies, wealth is strikingly concentrated. Entrepreneurs (individuals with ownership in for-profit enterprises) comprise a large portion of the wealthiest individuals, and their behavior may help explain patterns in the national distribution of wealth. Entrepreneurs are less diversified and more heavily invested in their own companies than is commonly assumed in economic models. We present an intentionally simplified individual-based model of wealth generation among entrepreneurs to assess the role of chance and determinism in the distribution of wealth. We demonstrate that chance alone, combined with the deterministic effects of compounding returns, can lead to unlimited concentration of wealth, such that the percentage of all wealth owned by a few entrepreneurs eventually approaches 100%. Specifically, concentration of wealth results when the rate of return on investment varies by entrepreneur and by time. This result is robust to inclusion of realities such as differing skill among entrepreneurs. The most likely overall growth rate of the economy decreases as businesses become less diverse, suggesting that high concentrations of wealth may adversely affect a country's economic growth. We show that a tax on large inherited fortunes, applied to a small portion of the most fortunate in the population, can efficiently arrest the concentration of wealth at intermediate levels. PMID:21814540
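The core mechanism, returns that vary by entrepreneur and by time compounding multiplicatively, fits in a few lines. This is a minimal illustration with assumed parameters, not the authors' full individual-based model.

import numpy as np

rng = np.random.default_rng(1)

n_entrepreneurs, n_years = 1_000, 200
wealth = np.ones(n_entrepreneurs)  # everyone starts with identical wealth

for _ in range(n_years):
    # Rate of return varies by individual and by year: chance alone.
    returns = rng.normal(loc=1.05, scale=0.30, size=n_entrepreneurs)
    wealth *= np.clip(returns, 0.0, None)  # compounding, wealth floored at zero

wealth.sort()
print(f"Top 1% share of all wealth: {wealth[-10:].sum() / wealth.sum():.1%}")

Running this repeatedly shows the share held by the top few growing toward one as n_years increases, with no skill differences anywhere in the model.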
On the strong influence of molecular interactions over large distances
NASA Astrophysics Data System (ADS)
Pfennig, Andreas
2018-03-01
Molecular-dynamics simulations of liquid water show deterministic chaos: an intentionally introduced positional shift of an individual molecule grows exponentially, by a factor of 10 every 0.23 ps. This is a Lyapunov instability. As soon as the shift reaches molecular scale, the direction of the resulting change in molecular motions is unpredictable. The influence of any individual distant particle on an observed molecule will be minute, but the effect will quickly grow to molecular scale and beyond owing to this exponential growth. Consequently, any individual particle in the universe will affect the behavior of any molecule within at most 33 ps after its interaction reaches it. A larger distance to the faraway particle does not diminish its influence on an observed molecule; the effect merely reaches molecular scale a few ps later. Thus, in evaluating the interactions, nearby and faraway molecules have to be accounted for equally. The consequences of this quickly reacting network of interactions on a universal scale are fundamental. Even in a strictly deterministic view, molecular behavior is in principle unpredictable and thus has to be regarded as random. Corresponding statements apply to any interacting particles. This result leads to a fundamental rethinking of the structure of interactions of molecules and particles, as well as of the behavior of reality.
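The 33 ps figure follows directly from the quoted growth rate. Writing the perturbation as an exponential with a tenfold gain every 0.23 ps (our reading of the abstract's numbers; the ratio of roughly 143 orders of magnitude is inferred from the 33 ps, not stated explicitly):

\[
\delta(t)=\delta_0\,10^{\,t/(0.23\,\mathrm{ps})}
\;\Rightarrow\;
t=0.23\,\mathrm{ps}\cdot\log_{10}\frac{\delta_{\mathrm{mol}}}{\delta_0},
\qquad
\frac{\delta_{\mathrm{mol}}}{\delta_0}\sim 10^{143}
\;\Rightarrow\;
t\approx 33\,\mathrm{ps}.
\]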
Ten reasons why a thermalized system cannot be described by a many-particle wave function
NASA Astrophysics Data System (ADS)
Drossel, Barbara
2017-05-01
It is widely believed that the underlying reality behind statistical mechanics is a deterministic and unitary time evolution of a many-particle wave function, even though this is in conflict with the irreversible, stochastic nature of statistical mechanics. The usual attempts to resolve this conflict, for instance by appealing to decoherence or eigenstate thermalization, are riddled with problems. This paper considers theoretical physics of thermalized systems as it is done in practice and shows that all approaches to thermalized systems presuppose in some form limits to linear superposition and deterministic time evolution. These considerations include, among others, the classical limit, extensivity, the concepts of entropy and equilibrium, and symmetry breaking in phase transitions and quantum measurement. As a conclusion, the paper suggests that the irreversibility and stochasticity of statistical mechanics should be taken as a real property of nature. It follows that a gas of a macroscopic number N of atoms in thermal equilibrium is best represented by a collection of N wave packets of a size of the order of the thermal de Broglie wavelength, which behave quantum mechanically below this scale but classically sufficiently far beyond this scale. In particular, these wave packets must localize again after scattering events, which requires stochasticity and indicates a connection to the measurement process.
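For orientation, the length scale invoked here is the standard thermal de Broglie wavelength, a textbook definition rather than anything specific to this paper:

\[
\lambda_{\mathrm{th}}=\frac{h}{\sqrt{2\pi m k_B T}},
\]

which is of the order of 0.02 nm for argon atoms at room temperature, far smaller than the typical interatomic spacing in a gas.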
Kin, Taichi; Nakatomi, Hirofumi; Shono, Naoyuki; Nomura, Seiji; Saito, Toki; Oyama, Hiroshi; Saito, Nobuhito
2017-10-15
Simulation and planning of surgery using virtual reality models is becoming common with advances in computer technology. In this study, we conducted a literature search to identify trends in virtual simulation of surgery for brain tumors. A MEDLINE search for "neurosurgery AND (simulation OR virtual reality)" retrieved a total of 1,298 articles published in the past 10 years. After eliminating studies designed solely for education and training purposes, 28 articles about clinical application remained. The finding that the vast majority of the articles were about education and training rather than clinical applications suggests that several issues need to be addressed before surgical simulation can be fully applied in the clinic. In addition, 10 of the 28 articles were from Japanese groups. In general, the 28 articles demonstrated clinical benefits of virtual surgical simulation. Simulation was particularly useful for better understanding complicated spatial relations of anatomical landmarks and for examining surgical approaches. In some studies, virtual reality models were used either with surgical navigation systems or with augmented reality technology, which projects virtual reality images onto the operating field. Reported problems were difficulties in standardized, objective evaluation of surgical simulation systems; inability to respond to tissue deformation caused by surgical maneuvers; absence of system functionality reflecting tissue features (e.g., hardness and adhesion); and many problems with image processing. Descriptions of image processing tended to be insufficient, indicating that the level of evidence, risk of bias, precision, and reproducibility need to be addressed for further advances and ultimately for full clinical application.
The Evolution of Consciousness in the Novel in English
NASA Astrophysics Data System (ADS)
Gojkovic, Zorica
This dissertation examines how the novel in English reflects the evolution of human consciousness. Characters in novels express a level of consciousness through their world view, which reflects the level of consciousness of the author and his/her period. Over time the world view evolves from a perception of physical reality as ultimate reality, to physical reality as illusion, in contrast to primary reality, which is spirit, or energy, or God, or the holistic frequency realm. Great mystics and sages all over the world, and throughout history, have had this understanding about the nature of reality. What is new is that different investigative currents are coming together and sharing this new vision of reality. The underlying unity, or enfolded order, is a broader realm where fragmentation is united by a deeper truth. This oneness is analogized to a hologram, where each part is in the whole and the whole in each part. The process of the evolution of consciousness in the novel is examined in three parts. In part one, Chapter One, connections are established between some of the pertinent developments in quantum physics, mysticism and Erich Neumann's theory of the evolution of consciousness. This information sets the stage for the exploration of the evolutionary process in the novel. Part two, chapters two to seven, explore various themes that demonstrate the evolutionary process in the novel. Novels that most effectively demonstrate the evolution are used. Part three, Chapter Eight, summarizes the evolutionary process and demonstrates the way in which wholeness is achieved from the initial separateness. Part three also explores some implications for the novel in light of this analysis.
Complexity theory and physical unification: From microscopic to macroscopic level
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Karakatsanis, L. P.; Tsoutsouras, V. G.; Pavlos, E. G.
During the last two decades, low dimensional chaotic or self-organized criticality (SOC) processes have been observed by our group in many different physical systems such as space plasmas, the solar or the magnetospheric dynamics, the atmosphere, earthquakes, the brain activity as well as in informational systems. All these systems are complex systems living far from equilibrium with strong self-organization and phase transition character. The theoretical interpretation of these natural phenomena needs a deeper insight into the fundamentals of complexity theory. In this study, we try to give a synoptic description of complexity theory both at the microscopic and at the macroscopic level of the physical reality. Also, we propose that the self-organization observed macroscopically is a phenomenon that reveals the strong unifying character of the complex dynamics which includes thermodynamical and dynamical characteristics in all levels of the physical reality. From this point of view, macroscopic deterministic and stochastic processes are closely related to the microscopic chaos and self-organization. In this study the scientific work of scientists such as Wilson, Nicolis, Prigogine, Hooft, Nottale, El Naschie, Castro, Tsallis, Chang and others is used for the development of a unified physical comprehension of complex dynamics from the microscopic to the macroscopic level.
Virtual reality simulation for construction safety promotion.
Zhao, Dong; Lucas, Jason
2015-01-01
Safety is a critical issue for the construction industry. The literature argues that human error contributes to more than half of occupational incidents and can be directly reduced by effective training programs. This paper reviews the current status of safety training in the US construction industry. Results from the review reveal the gap between that status and industry expectations for safety. To narrow this gap, this paper demonstrates the development and utilisation of a training program based on virtual reality (VR) simulation. The VR-based safety training program offers a safe working environment where users can effectively rehearse tasks involving electrical hazards and ultimately improve their ability to recognize and respond to them. Its visualisation and simulation can also remove the training barriers caused by electricity's invisibility and dangerousness.
Iannone, Maria; Ventre, Maurizio; Formisano, Lucia; Casalino, Laura; Patriarca, Eduardo J; Netti, Paolo A
2015-03-11
The initial conditions for morphogenesis trigger a cascade of events that ultimately dictate structure and functions of tissues and organs. Here we report that surface nanopatterning can control the initial assembly of focal adhesions, hence guiding human mesenchymal stem cells (hMSCs) through the process of self-organization and differentiation. This process self-sustains, leading to the development of macroscopic tissues with molecular profiles and microarchitecture reminiscent of embryonic tendons. Therefore, material surfaces can be in principle engineered to set off the hMSC program toward tissuegenesis in a deterministic manner by providing adequate sets of initial environmental conditions.
Probabilistic metrology or how some measurement outcomes render ultra-precise estimates
NASA Astrophysics Data System (ADS)
Calsamiglia, J.; Gendra, B.; Muñoz-Tapia, R.; Bagan, E.
2016-10-01
We show on theoretical grounds that, even in the presence of noise, probabilistic measurement strategies (which have a certain probability of failure or abstention) can provide, upon a heralded successful outcome, estimates with a precision that exceeds the deterministic bounds for the average precision. This establishes a new ultimate bound on the phase estimation precision of particular measurement outcomes (or sequence of outcomes). For probe systems subject to local dephasing, we quantify such precision limit as a function of the probability of failure that can be tolerated. Our results show that the possibility of abstaining can set back the detrimental effects of noise.
New figuring model based on surface slope profile for grazing-incidence reflective optics
Zhou, Lin; Huang, Lei; Bouet, Nathalie; ...
2016-08-09
Surface slope profile is widely used in the metrology of grazing-incidence reflective optics instead of surface height profile. Nevertheless, the theoretical and experimental model currently used in deterministic optical figuring processes is based on surface height, not on surface slope. This means that the raw slope profile data from metrology need to be converted to height profile to perform the current height-based figuring processes. The inevitable measurement noise in the raw slope data will introduce significant cumulative error in the resultant height profiles. As a consequence, this conversion will degrade the determinism of the figuring processes, and will have an impact on the ultimate surface figuring results. To overcome this problem, an innovative figuring model is proposed, which directly uses the raw slope profile data instead of the usual height data as input for the deterministic process. In this article, first the influence of the measurement noise on the resultant height profile is analyzed, and then a new model is presented; finally a demonstration experiment is carried out using a one-dimensional ion beam figuring process to demonstrate the validity of our approach.
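The cumulative-error argument is easy to make concrete: converting slope to height is a running sum, so uncorrelated slope noise accumulates like a random walk in the height profile. The demonstration below uses synthetic data with assumed noise levels; it illustrates the motivation, not the authors' figuring model.

import numpy as np

rng = np.random.default_rng(2)

n, dx = 1_000, 1e-3                       # 1000 samples at 1 mm spacing (assumed)
true_slope = np.zeros(n)                  # a perfectly flat mirror, for simplicity
measured = true_slope + rng.normal(0.0, 1e-7, size=n)  # 0.1 urad noise (assumed)

# Slope-to-height conversion integrates the noise sample by sample...
height = np.cumsum(measured) * dx

# ...so the height error drifts like a random walk instead of averaging out.
print(f"RMS slope noise:  {measured.std():.2e} rad")
print(f"RMS height error: {height.std():.2e} m")

Working directly on the slope data sidesteps this integration step entirely, which is the point of the proposed model.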
Health safety nets can break cycles of poverty and disease: a stochastic ecological model
Pluciński, Mateusz M.; Ngonghala, Calistus N.; Bonds, Matthew H.
2011-01-01
The persistence of extreme poverty is increasingly attributed to dynamic interactions between biophysical processes and economics, though there remains a dearth of integrated theoretical frameworks that can inform policy. Here, we present a stochastic model of disease-driven poverty traps. Whereas deterministic models can result in poverty traps that can only be broken by substantial external changes to the initial conditions, in the stochastic model there is always some probability that a population will leave or enter a poverty trap. We show that a ‘safety net’, defined as an externally enforced minimum level of health or economic conditions, can guarantee ultimate escape from a poverty trap, even if the safety net is set within the basin of attraction of the poverty trap, and even if the safety net is only in the form of a public health measure. Whereas the deterministic model implies that small improvements in initial conditions near the poverty-trap equilibrium are futile, the stochastic model suggests that the impact of changes in the location of the safety net on the rate of development may be strongest near the poverty-trap equilibrium. PMID:21593026
Can China Defend A Core Interest In The South China Sea?
2011-01-01
must amass the wherewithal to defeat outsiders' efforts to make today's status quo a permanent political reality. Beijing ultimately needs sufficient...to launch a cross-strait invasion. To date, the PLA Navy has exhibited curious myopia toward such capabilities and systems. Constant strain on the...standard of fielding enough naval power to meet the largest fleet likely to be arrayed against it. ASBMs might provide full-time virtual presence, but they
Virtual reality for mine safety training.
Filigenzi, M T; Orr, T J; Ruff, T M
2000-06-01
Mining has long remained one of America's most hazardous occupations. Researchers believe that by developing realistic, affordable VR training software, miners will be able to receive accurate training in hazard recognition and avoidance. In addition, the VR software will allow miners to follow mine evacuation routes and safe procedures without exposing themselves to danger. This VR software may ultimately be tailored to provide training in other industries, such as the construction, agricultural, and petroleum industries.
Transcending matter: physics and ultimate meaning.
Paulson, Steve; Frank, Adam; Kaiser, David; Maudlin, Tim; Natarajan, Priyamvada
2015-12-01
From the discovery of new galaxies and nearly undetectable dark energy to the quantum entanglement of particles across the universe, new findings in physics naturally elicit a sense of awe and wonder. For the founders of modern physics, from Einstein and Bohr to Heisenberg, Pauli, and Bohm, a fascination with deeper questions of meaning and ultimate reality led some of them to explore esoteric traditions and metaphysics. More recently, however, physicists have largely shunned such philosophical and spiritual associations. What can contemporary physics offer us in the quest to understand our place in the universe? Has physics in some ways become a religion unto itself that rejects the search for existential meaning? Discussion of these and related questions is presented in this paper. © 2015 New York Academy of Sciences.
Cellular intelligence: Microphenomenology and the realities of being.
Ford, Brian J
2017-12-01
Traditions of Eastern thought conceptualised life in a holistic sense, emphasising the processes of maintaining health and conquering sickness as manifestations of an essentially spiritual principle that was of overriding importance in the conduct of living. Western science, which drove the overriding and partial eclipse of Eastern traditions, became founded on a reductionist quest for ultimate realities which, in the modern scientific world, has embraced the notion that every living process can be successfully modelled by a digital computer system. It is argued here that the essential processes of cognition, response and decision-making inherent in living cells transcend conventional modelling, and microscopic studies of organisms like the shell-building amoebae and the rhodophyte alga Antithamnion reveal a level of cellular intelligence that is unrecognized by science and is not amenable to computer analysis. Copyright © 2017. Published by Elsevier Ltd.
Development of a Computer-Controlled Polishing Process for X-Ray Optics
NASA Technical Reports Server (NTRS)
Khan, Gufran S.; Gubarev, Mikhail; Arnold, William; Ramsey, Brian
2009-01-01
Future X-ray observatory missions require grazing-incidence x-ray optics with angular resolution of < 5 arcsec half-power diameter. The achievable resolution depends ultimately on the quality of the polished mandrels from which the shells are replicated. With the aim of fabricating better shells and reducing the cost and time of mandrel production, a computer-controlled polishing machine has been developed for deterministic and localized polishing of mandrels. Cylindrical polishing software has also been developed that predicts the surface residual errors under a given set of operating parameters and lap configuration. Design considerations for the polishing lap are discussed, and the effects of nonconformance between the lap and the mandrel are presented.
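Deterministic figuring of this kind is commonly posed as a convolution: the predicted removal is the tool influence function convolved with the dwell-time map (a Preston-type formulation). The one-dimensional sketch below shows that standard structure; the Gaussian influence function and all numbers are assumptions, not this machine's calibrated values.

import numpy as np

x = np.linspace(-50.0, 50.0, 501)              # position along the mandrel, mm (assumed)
surface_error = 5e-9 * np.exp(-(x / 20.0)**2)  # a 5 nm bump to polish away (assumed)

# Tool influence function: removal per second of dwell at each offset (assumed).
tif = 1e-10 * np.exp(-(x / 5.0)**2)

# Naive dwell map proportional to the local error; a real process solves a
# deconvolution problem, but this shows the forward prediction.
dwell = surface_error / tif.sum()
predicted_removal = np.convolve(dwell, tif, mode="same")
residual = surface_error - predicted_removal

print(f"RMS error before: {surface_error.std():.2e} m")
print(f"RMS residual:     {residual.std():.2e} m")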
Quantum structures: An attempt to explain the origin of their appearance in nature
NASA Astrophysics Data System (ADS)
Aerts, Diederik
1995-08-01
We explain quantum structure as due to two effects: (a) a real change of state of the entity under the influence of the measurement and (b) a lack of knowledge about a deeper deterministic reality of the measurement process. We present a quantum machine, with which we can illustrate in a simple way how the quantum structure arises as a consequence of the two mentioned effects. We introduce a parameter ɛ that measures the size of the lack of knowledge of the measurement process, and by varying this parameter, we describe a continuous evolution from a quantum structure (maximal lack of knowledge) to a classical structure (zero lack of knowledge). We show that for intermediate values of ɛ we find a new type of structure that is neither quantum nor classical. We apply the model to situations of lack of knowledge about the measurement process appearing in other aspects of reality. Specifically, we investigate the quantumlike structures that appear in the situation of psychological decision processes, where the subject is influenced during the testing and forms some opinions during the testing process. Our conclusion is that in the light of this explanation, the quantum probabilities are epistemic and not ontological, which means that quantum mechanics is compatible with a determinism of the whole.
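A common way to present the quantum machine is a closed-form outcome probability that interpolates with ε. The formula below is our paraphrase of that sphere-model construction and should be checked against the paper: ε = 1 reproduces the quantum probability (1 + cos θ)/2, while ε → 0 gives a deterministic classical rule.

import numpy as np

def p_up(theta: float, eps: float) -> float:
    """Probability of the 'up' outcome for angle theta between state and
    measurement axis; eps measures the lack of knowledge about the measurement."""
    c = np.cos(theta)
    if c >= eps:
        return 1.0                       # deterministic region
    if c <= -eps:
        return 0.0                       # deterministic region
    return (eps + c) / (2.0 * eps)       # intermediate: neither quantum nor classical

theta = np.pi / 3
for eps in (1.0, 0.5, 1e-9):
    print(f"eps = {eps:<5} p_up = {p_up(theta, eps):.3f}")
# eps = 1.0 -> 0.750 (quantum); eps -> 0 -> 1.000 (classical, since cos(theta) > 0)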
Optical architecture of HoloLens mixed reality headset
NASA Astrophysics Data System (ADS)
Kress, Bernard C.; Cummings, William J.
2017-06-01
HoloLens by Microsoft Corp. is the world's first untethered Mixed Reality (MR) Head Mounted Display (HMD) system, released to developers in March 2016 as a Development Kit. In this paper we review the various display requirements and the subsequent optical hardware choices made for HoloLens. Its main achievements concern performance and comfort for the user: it is the first fully untethered MR headset, with the highest angular resolution and the industry's largest eyebox. It has the first inside-out global sensor-fusion system, including precise head tracking and 3D mapping, all controlled by a fully custom on-board GPU. On the strength of these achievements, HoloLens stands as the most advanced MR system today. Additional features may be implemented in next-generation MR headsets, leading to the ultimate experience for the user and securing the upcoming fabulous AR/MR market predicted by most analysts.
NASA Astrophysics Data System (ADS)
Thubaasini, P.; Rusnida, R.; Rohani, S. M.
This paper describes Linux, an open source platform used to develop and run a virtual architectural walkthrough application. It offers some qualitative reflections and observations on the nature of Linux in the context of Virtual Reality (VR) and on the most popular and important claims associated with the open source approach. The ultimate goal of this paper is to measure and evaluate the performance of Linux used to build the virtual architectural walkthrough and to develop a proof of concept based on the results obtained through this project. Besides that, this study reveals the benefits of using Linux in the field of virtual reality and presents a basic comparison and evaluation between Windows- and Linux-based operating systems. The Windows platform is used as a baseline against which the performance of Linux is evaluated, measured on three main criteria: frame rate, image quality, and mouse motion.
The Case for Durative Actions: A Commentary on PDDL2.1
NASA Technical Reports Server (NTRS)
Smith, David E.
2003-01-01
The addition of durative actions to PDDL2.1 sparked some controversy. Fox and Long argued that actions should be considered instantaneous but able to start and stop processes. Ultimately, a limited notion of durative actions was incorporated into the language. I argue that this notion is impoverished, and that the underlying philosophical position of regarding durative actions as shorthand for a start action, a process, and a stop action ignores the realities of modelling and execution for complex systems.
Future Of Visual Entertainment
NASA Astrophysics Data System (ADS)
Dryer, Ivan
1983-10-01
The development of new visual entertainment forms has had, and will continue to have, a powerful impact on the direction of our society. Foremost among these new forms will be the Holo's--moving holographic images of anything imaginable, projected in mid air (a room, a dome) and so lifelike they are virtually indistinguishable from "reality". The Holo's and space development will ultimately transform entertainment and, in the process, humanity too. Meanwhile, the seeds of these changes are now being planted in entertainment trends and innovations whose implications are just beginning to emerge.
Nuclear power and probabilistic safety assessment (PSA): past through future applications
NASA Astrophysics Data System (ADS)
Stamatelatos, M. G.; Moieni, P.; Everline, C. J.
1995-03-01
Nuclear power reactor safety in the United States is about to enter a new era -- an era of risk-based management and risk-based regulation. First, there was the age of 'prescribed safety assessment,' during which a series of design-basis accidents in eight categories of severity, or classes, were postulated and analyzed. Toward the end of that era, it was recognized that 'Class 9,' or 'beyond design basis,' accidents would need special attention because of the potentially severe health and financial consequences of these accidents. The accident at Three Mile Island showed that sequences of low-consequence, high-frequency events and human errors can be much more risk dominant than the Class 9 accidents. A different form of safety assessment, PSA, emerged and began to gain ground against the deterministic safety establishment. Eventually, this led to the current regulatory requirements for individual plant examinations (IPEs). The IPEs can serve as a basis for risk-based regulation and management, a concept that may ultimately transform the U.S. regulatory process from its traditional deterministic foundations to a process predicated upon PSA. Beyond the possibility of a regulatory environment predicated upon PSA lies the possibility of using PSA as the foundation for managing daily nuclear power plant operations.
Deshmukh, Vinod D.
2006-01-01
Dhyana-Yoga is a Sanskrit word for the ancient discipline of meditation, as a means to Samadhi or enlightenment. Samadhi is a self-absorptive, adaptive state with realization of one's being in harmony with reality. It is unitive, undifferentiated reality-consciousness, an essential being, which can only be experienced by spontaneous intuition and self-understanding. Modern neuroscience can help us to better understand Dhyana-Yoga. This article discusses topics including brain-mind-reality, consciousness, attention, emotional intelligence, sense of self, the meditative mind, and the meditative brain. A new hypothesis is proposed for a better understanding of the meditative mind. Meditation is an art of being serene and alert in the present moment, instead of constantly struggling to change or to become. It is an art of efficient management of attentional energy with total engagement (poornata, presence, mindfulness) or disengagement (shunyata, silence, emptiness). In both states, there is an experience of spontaneous unity with no sense of situational interactive self or personal time. It is a simultaneous, participatory consciousness rather than a dualistic, sequential attentiveness. There is a natural sense of well-being with self-understanding, spontaneous joy, serenity, freedom, and self-fulfillment. It is where the ultimate pursuit of happiness and the search for the meaning of life resolve. One realizes the truth of one's harmonious being in nature and nature in oneself. It is being alive at its fullest, when each conscious moment becomes a dynamic process of discovery and continuous learning of the ever-new unfolding reality. PMID:17370019
Biomimetics and the Development of Humanlike Robots as the Ultimate Challenge
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2011-01-01
Evolution has led to effective solutions to nature's challenges, refined over millions of years. Humans have always made efforts to use nature as a model for innovation and problem solving. These efforts have intensified in recent years, with systematic studies of nature being made toward better understanding and applying more sophisticated capabilities. Making humanlike robots, including their appearance, functions, and intelligence, poses the ultimate challenge to biomimetics. For many years, making such robots was considered science fiction, but as a result of significant advances in biologically inspired technologies, such robots are increasingly becoming an engineering reality. There are already humanlike robots that walk, talk, interpret speech, make eye contact, and produce facial expressions, as well as perform many other humanlike functions. In this paper, the state of the art of humanlike robots, potential applications, and issues of concern are reviewed.
NASA Astrophysics Data System (ADS)
Pavlos, George P.
2017-12-01
In this study, we present the highlights of complexity theory (Part I) and significant experimental verifications (Part II) and we try to give a synoptic description of complexity theory both at the microscopic and at the macroscopic level of the physical reality. Also, we propose that the self-organization observed macroscopically is a phenomenon that reveals the strong unifying character of the complex dynamics which includes thermodynamical and dynamical characteristics in all levels of the physical reality. From this point of view, macroscopical deterministic and stochastic processes are closely related to the microscopical chaos and self-organization. The scientific work of scientists such as Wilson, Nicolis, Prigogine, Hooft, Nottale, El Naschie, Castro, Tsallis, Chang and others is used for the development of a unified physical comprehension of complex dynamics from the microscopic to the macroscopic level. Finally, we provide a comprehensive description of the novel concepts included in the complexity theory from microscopic to macroscopic level. Some of the modern concepts that can be used for a unified description of complex systems and for the understanding of modern complexity theory, as it is manifested at the macroscopic and the microscopic level, are the fractal geometry and fractal space-time, scale invariance and scale relativity, phase transition and self-organization, path integral amplitudes, renormalization group theory, stochastic and chaotic quantization and E-infinity theory, etc.
Yamaguchi, Satoshi; Yamada, Yuya; Yoshida, Yoshinori; Noborio, Hiroshi; Imazato, Satoshi
2012-01-01
The virtual reality (VR) simulator is a useful tool for developing dental hand skill. However, VR simulations that include patient reactions have limited computational time in which to reproduce a face model. Our aim was to develop a patient face model that enables real-time collision detection and cutting operations by using stereolithography (STL) and deterministic finite automaton (DFA) data files. We evaluated the dependence of computational cost on the combination of STL and DFA data files, constructed the patient face model using the optimal combination, and assessed the computational costs of four operations: doing nothing, collision, cutting, and combined collision and cutting. The face model was successfully constructed, with low computational costs of 11.3, 18.3, 30.3, and 33.5 ms for doing nothing, collision, cutting, and combined collision and cutting, respectively. The patient face model could be useful for developing dental hand skill with VR.
Boucher, Philip
2011-09-01
This article builds upon previous discussion of social and technical determinisms as implicit positions in the biofuel debate. To ensure these debates are balanced, it has been suggested that they should be designed to contain a variety of deterministic positions. Whilst it is agreed that determinism does not feature strongly in contemporary academic literatures, it is found that deterministic positions have generally been superseded by an absence of any substantive conceptualisation of how the social shaping of technology may be related to, or occur alongside, an objective or autonomous reality. The problem of determinism emerges at an ontological level and must be resolved in situ. A critical realist approach to technology is presented which may provide a more appropriate framework for debate. In dialogue with previous discussion, the distribution of responsibility is revisited with reference to the role of scientists and engineers.
Virtual reality in radiology: virtual intervention
NASA Astrophysics Data System (ADS)
Harreld, Michael R.; Valentino, Daniel J.; Duckwiler, Gary R.; Lufkin, Robert B.; Karplus, Walter J.
1995-04-01
Intracranial aneurysms are the primary cause of non-traumatic subarachnoid hemorrhage. Morbidity and mortality remain high even with current endovascular intervention techniques. It is presently impossible to identify which aneurysms will grow and rupture; however, hemodynamics are thought to play an important role in aneurysm development. With this in mind, we have simulated blood flow in laboratory animals using three-dimensional computational fluid dynamics software. The data output from these simulations is three-dimensional, complex, and transient. Visualization of 3D flow structures with a standard 2D display is cumbersome, and may be better performed using a virtual reality system. We are developing a VR-based system for visualization of the computed blood flow and stress fields. This paper presents the progress to date and future plans for our clinical VR-based intervention simulator. The ultimate goal is to develop a software system that will be able to accurately model an aneurysm detected on clinical angiography, visualize this model in virtual reality, predict its future behavior, and give insight into the type of treatment necessary. An associated database will give historical and outcome information on prior aneurysms (including dynamic, structural, and categorical data) that will be matched to any current case, and assist in treatment planning (e.g., natural history vs. treatment risk, surgical vs. endovascular treatment risks, cure prediction, complication rates).
A brief review of augmented reality science learning
NASA Astrophysics Data System (ADS)
Gopalan, Valarmathie; Bakar, Juliana Aida Abu; Zulkifli, Abdul Nasir
2017-10-01
This paper reviews several studies concerning theories and models that could be applied to motivate science learning for upper secondary school learners (16-17 years old), in order to make the learning experience more engaging and useful. Embedding AR in science learning could bring an awe-inspiring transformation in learners' viewpoints toward the respective subject matter. Augmented Reality is able to present real and virtual learning experiences through the addition of multiple media without replacing the real environment. Owing to this unique feature, AR has attracted broad attention from researchers seeking to implement it in science learning. This impressive technology offers learners the ultimate visualization, and provides an astonishing and transparent learning experience by bringing to light otherwise unseen perspectives of the learning content. This paper will interest researchers in the related fields as well as academicians in the related disciplines. It aims to propose several related theoretical guidelines that could be applied in science motivation to transform learning in an effective way.
Hysteroscopic simulator for training and educational purposes.
Lim, Fabian; Brown, Ian; McColl, Ryan; Seligman, Cory; Alsaraira, Amer
2006-01-01
Hysteroscopy is an extensively popular option in evaluating and treating women with infertility. The procedure utilizes an endoscope, inserted through the vagina and cervix, to examine the intra-uterine cavity via a monitor. The difficulty of hysteroscopy from the surgeon's perspective is the visual-spatial task of interpreting 3D images on a 2D monitor, and the associated psychomotor skills needed to overcome the fulcrum effect. Despite the widespread use of this procedure, current qualified hysteroscopy surgeons have not been taught the fundamentals through an organized curriculum. The emergence of virtual reality as an educational tool for this procedure, and for other endoscopic procedures, has undoubtedly raised interest. The ultimate objective is the inclusion of virtual reality training as a mandatory component of gynecological endoscopic training. Part of this process involves the design of a simulator encompassing the technical difficulties and complications associated with the procedure. The proposed research examines fundamental hysteroscopic factors as well as current training and accreditation norms, and proposes a hysteroscopic simulator design that is suitable for educating and training.
Prostatitis: myths and realities.
Nickel, J C
1998-03-01
To explore the myths surrounding the enigmatic syndrome that the urologic community has labeled as prostatitis and to determine the actual realities associated with this disease. A critical evaluation of the syndrome of prostatitis based on examination of the recent world literature, undisputed scientific facts, solid hypotheses, common sense, and the author's personal opinion. The most common myths surrounding the importance, etiology, diagnosis, classification, and treatment of prostatitis are in fact merely myths. Recent research has led to a new awareness of the importance of prostatitis, new insights into its pathogenesis, improved disease classification and symptom assessment, and will ultimately lead to more rational diagnostic and treatment strategies. The introduction of a new more rational classification system, the development and validation of reliable symptom assessment instruments, new funding initiatives by granting agencies and the pharmaceutical industry, and an awakening appeal for intellectual examination of this common prostate disease by academic urologists guarantees that prostatitis will find an important place on the urologic agenda as we enter the next millennium.
A visual graphic/haptic rendering model for hysteroscopic procedures.
Lim, Fabian; Brown, Ian; McColl, Ryan; Seligman, Cory; Alsaraira, Amer
2006-03-01
Hysteroscopy is an extensively popular option in evaluating and treating women with infertility. The procedure utilises an endoscope, inserted through the vagina and cervix, to examine the intra-uterine cavity via a monitor. The difficulty of hysteroscopy from the surgeon's perspective is the visual-spatial task of interpreting 3D images on a 2D monitor, and the associated psychomotor skills needed to overcome the fulcrum effect. Despite the widespread use of this procedure, current qualified hysteroscopy surgeons have not been taught the fundamentals through an organised curriculum. The emergence of virtual reality as an educational tool for this procedure, and for other endoscopic procedures, has undoubtedly raised interest. The ultimate objective is the inclusion of virtual reality training as a mandatory component of gynaecologic endoscopy training. Part of this process involves the design of a simulator encompassing the technical difficulties and complications associated with the procedure. The proposed research examines fundamental hysteroscopy factors and current training and accreditation, and proposes a hysteroscopic simulator design that is suitable for educating and training.
Quest for ultimate reality and meaning: a scientist's view
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilbert, T.L.
The theme of unity and diversity is developed in four parts. The first part is an examination of the relationship between unity and diversity in terms of the concepts of scale and resolution, using an imaginary journey in a shrinking spaceship from the cosmos to quarks in order to present the concepts in concrete terms. The second part is an examination of the roles of different fields of scholarship - aesthetics and the humanities, ethics, religion, and science - in coping with diversity and in gleaning the unity hidden in diversity. The third part is an examination of a philosophical problem, closely related to unity and diversity, that can be expected to play a central role in later stages of the URAM program: the question of what we mean by the word true. The fourth part is a discussion of the concept of reality from the epistemological viewpoint of the sciences, and how unity and diversity enter into this concept.
Multiprocessor shared-memory information exchange
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santoline, L.L.; Bowers, M.D.; Crew, A.W.
1989-02-01
In distributed microprocessor-based instrumentation and control systems, the inter- and intra-subsystem communication requirements ultimately form the basis for the overall system architecture. This paper describes a software protocol which addresses the intra-subsystem communications problem. Specifically, the protocol allows multiple processors to exchange information via a shared-memory interface. The authors' primary goal is to provide a reliable means for information to be exchanged between central application processor boards (masters) and dedicated function processor boards (slaves) in a single computer chassis. The resultant Multiprocessor Shared-Memory Information Exchange (MSMIE) protocol, a standard master-slave shared-memory interface suitable for use in nuclear safety systems, is designed to pass unidirectional buffers of information between the processors while providing a minimum, deterministic cycle time for this data exchange.
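The flavor of such a protocol, unidirectional buffers with a bounded handoff, can be sketched as a double-buffered exchange. This is a toy model only; the actual MSMIE buffer states and timing rules are specified in the paper, and a production version would add a third buffer to protect a slow reader from torn data.

import threading

class SharedBufferExchange:
    """Toy master-to-slave exchange: the writer always has a free buffer,
    so its publish step is O(1) and never waits on the reader."""

    def __init__(self):
        self._buffers = [None, None]
        self._newest = -1                 # index of last completed buffer
        self._lock = threading.Lock()     # guards only the index swap

    def write(self, data):
        idx = 1 - self._newest if self._newest >= 0 else 0
        self._buffers[idx] = data         # fill the unpublished buffer
        with self._lock:
            self._newest = idx            # atomic publish, deterministic cost

    def read(self):
        with self._lock:
            idx = self._newest
        return None if idx < 0 else self._buffers[idx]

ex = SharedBufferExchange()
ex.write({"channel": 7, "value": 42})
print(ex.read())  # -> {'channel': 7, 'value': 42}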
Attosecond-resolution Hong-Ou-Mandel interferometry.
Lyons, Ashley; Knee, George C; Bolduc, Eliot; Roger, Thomas; Leach, Jonathan; Gauger, Erik M; Faccio, Daniele
2018-05-01
When two indistinguishable photons are each incident on separate input ports of a beamsplitter, they "bunch" deterministically, exiting via the same port as a direct consequence of their bosonic nature. This two-photon interference effect has long held the potential for application in precision measurement of time delays, such as those induced by transparent specimens with unknown thickness profiles. However, the technique has never achieved resolutions significantly better than the few-femtosecond (micrometer) scale other than in a common-path geometry that severely limits applications. We develop the precision of Hong-Ou-Mandel interferometry toward the ultimate limits dictated by statistical estimation theory, achieving few-attosecond (or nanometer path length) scale resolutions in a dual-arm geometry, thus providing access to length scales pertinent to cell biology and monoatomic-layer two-dimensional materials.
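For orientation, the underlying dip: for two photons with Gaussian spectral amplitudes of bandwidth σ arriving at a balanced beamsplitter with relative delay τ, the coincidence probability follows the textbook form (up to convention-dependent factors; not taken from the abstract)

\[
P_c(\tau)=\frac{1}{2}\left(1-e^{-\sigma^2\tau^2}\right),
\]

and the attainable delay precision is governed by how steeply P_c varies with τ, which is what a statistical-estimation analysis of the measurement optimizes.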
Kafatos, Menas C; Kato, Goro C
2017-12-01
Questions about the nature of reality, whether Consciousness is the fundamental reality in the universe, and what Consciousness itself is, have no answer in systems that assume an external reality independent of Consciousness. Ultimately, the ontological foundation of such systems is the absolute division of subject and object. We advocate instead what we consider to be an approach in agreement with the foundation of quantum reality, based on Rāmānuja's version of Vedanta philosophy and non-dual Kashmir Śaivism. Quantum mechanics opened the door to consciousness, but it cannot account for consciousness. However, the quantum measurement problem implies that we cannot remove subjective experience from the practice of science. It is then appropriate to seek mathematical formalisms for the workings of consciousness that do not rely on specific interpretations of quantum mechanics. Temporal topos provides such a framework. In the theory of temporal topos, which we outline here, the difference between a subject and an object involves the direction of a morphism in a category; in the dual category, that direction is reversed. The resulting formalism provides powerful ways to address consciousness and qualia, beyond attempts to account for consciousness through physical theories. We also discuss the implications of the mathematics presented here for the convergence of science and non-dualist philosophies, as an emerging science of Consciousness that may bring out the underlying unity of physics, life, and mind. Copyright © 2017. Published by Elsevier Ltd.
Computer Based Training: Field Deployable Trainer and Shared Virtual Reality
NASA Technical Reports Server (NTRS)
Mullen, Terence J.
1997-01-01
Astronaut training has traditionally been conducted at specific sites with specialized facilities. Because of its size and nature the training equipment is generally not portable. Efforts are now under way to develop training tools that can be taken to remote locations, including into orbit. Two of these efforts are the Field Deployable Trainer and Shared Virtual Reality projects. Field Deployable Trainer: NASA used the recent shuttle mission by astronaut Shannon Lucid to the Russian space station, Mir, as an opportunity to develop and test a prototype of an on-orbit computer training system. A laptop computer with a customized user interface, a set of specially prepared CDs, and video tapes were taken to the Mir by Ms. Lucid. Based upon the feedback following the launch of the Lucid flight, our team prepared materials for the next Mir visitor. Astronaut John Blaha will fly on NASA/Mir Long Duration Mission 3, set to launch in mid-September. He will take with him a customized hard disk drive and a package of compact disks containing training videos, references and maps. The FDT team continues to explore and develop new and innovative ways to conduct offsite astronaut training using personal computers. Shared Virtual Reality Training: NASA's Space Flight Training Division has been investigating the use of virtual reality environments for astronaut training. Recent efforts have focused on activities requiring interaction by two or more people, called shared VR. Dr. Bowen Loftin, from the University of Houston, directs a virtual reality laboratory that conducts much of the NASA-sponsored research. I worked on a project involving the development of a virtual environment that can be used to train astronauts and others to operate a science unit called a Biological Technology Facility (BTF). Facilities like this will be used to house and control microgravity experiments on the space station. It is hoped that astronauts and instructors will ultimately be able to share common virtual environments and, using telephone links, conduct interactive training from separate locations.
One hundred years of quantum physics.
Kleppner, D; Jackiw, R
2000-08-11
This year marks the 100th anniversary of Max Planck's creation of the quantum concept, an idea so revolutionary that it took nearly 30 years for scientists to develop it into the theory that has transformed the way scientists view reality. In this month's essay, Daniel Kleppner and Roman Jackiw recount how quantum theory, which they rate as "the most precisely tested and most successful theory in the history of science," came to be, how it changed the world, and how it might continue to evolve to make the dream of ultimate understanding of the universe come true.
An investigation to improve selenodetic control through surface and orbital lunar photography
NASA Technical Reports Server (NTRS)
Sweet, H. J., III
1970-01-01
The use of lunar surface photography to achieve the photogrammetric transfer of available selenographic coordinates from future lunar landing sites to neighboring, photoidentifiable features was investigated. The procedures developed imply that overhead photography, were it available, could be utilized and would materially strengthen the total solution. By the methodic selection of features and confirmation that they can in reality be identified from orbital photography, a modest selenodetic control system can be expanded into a net that could ultimately control all future, manned or unmanned, orbital photographic missions.
Assessment of Wind Parameter Sensitivity on Ultimate and Fatigue Wind Turbine Loads: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N; Sethuraman, Latha; Jonkman, Jason
Wind turbines are designed using a set of simulations to ascertain the structural loads that the turbine could encounter. While mean hub-height wind speed is considered to vary, other wind parameters such as turbulence spectra, shear, veer, spatial coherence, and component correlation are fixed or conditional values that, in reality, could have different characteristics at different sites and have a significant effect on the resulting loads. This paper therefore seeks to assess the sensitivity of the resulting ultimate and fatigue loads on the turbine to different wind parameters during normal operational conditions. Eighteen different wind parameters are screened using an Elementary Effects approach with radial points. As expected, the results show a high sensitivity of the loads to the turbulence standard deviation in the primary wind direction, but the sensitivity to wind shear is often much greater. To a lesser extent, other wind parameters that drive loads include the coherence in the primary wind direction and veer.
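The Elementary Effects screen itself is cheap to sketch. A generic radial-points version is below; the paper's exact sampling plan, input ranges and load model are not given in the abstract, so everything here is illustrative:

```python
import numpy as np

def elementary_effects_radial(model, base_points, deltas):
    """Radial-points Elementary Effects (Morris) screening.

    For each base point x (a row of the NumPy array base_points), one input
    at a time is perturbed and the normalized response change
    EE_i = (f(x + delta_i * e_i) - f(x)) / delta_i is recorded.  `model`
    would wrap the load simulation; here it is any scalar function of the
    wind-parameter vector.
    """
    n_inputs = base_points.shape[1]
    effects = []
    for x in base_points:
        fx = model(x)
        ee = np.empty(n_inputs)
        for i in range(n_inputs):
            xp = x.copy()
            xp[i] += deltas[i]
            ee[i] = (model(xp) - fx) / deltas[i]
        effects.append(ee)
    effects = np.asarray(effects)
    mu_star = np.abs(effects).mean(axis=0)   # importance ranking
    sigma = effects.std(axis=0)              # nonlinearity / interaction flag
    return mu_star, sigma
```

Ranking inputs by mu_star is what lets a study like this one conclude that turbulence standard deviation and shear dominate while most other parameters can safely be fixed.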
Religion, nature, science education and the epistemology of dialectics
NASA Astrophysics Data System (ADS)
Alexakos, Konstantinos
2010-03-01
In his article Scientists at Play in a Field of the Lord, David Long (2010) rightly challenges our presumptions of what science is and brings forth some of the disjunctures between science and deeply held American religious beliefs. Reading his narrative of the conflicts that he experienced on the opening day of the Creation Museum, I cannot help but reconsider what the epistemology of science is and what science learning ought to be. Rather than science being taught as a prescribed, deterministic system of beliefs and procedures, as it is often done, I suggest instead that it would be more appropriate to teach science as a way of thinking and making sense of dialectical processes in nature: not as a set of ultimate "truths", but as understandings of processes themselves in the process of simultaneously becoming and being transformed.
Detecting and Teaching Desire: Phallometry, Freund, and Behaviorist Sexology.
Ha, Nathan
2015-01-01
During the 1960s and 1970s, Kurt Freund and other researchers developed phallometry to demonstrate the effectiveness of behaviorism in the diagnosis and treatment of male homosexuality and pedophilia. Researchers used phallometers to segment different aspects of male arousal, to discern cryptic hierarchies of eroticism, and to monitor the effectiveness of treatments to change an individual's sexuality. Phallometry ended up challenging the expectations of behaviorist researchers by demonstrating that most men could not change their sexual preferences--no matter how hard they tried or how hard others tried to change them. This knowledge, combined with challenges mounted by gay political activists, eventually motivated Freund and other researchers to revise their ideas of what counted as therapy. Phallometric studies ultimately revealed the limitations of efforts to shape "abnormal" and "normal" masculinity and heralded the rise of biologically determinist theories of sexuality.
Initial validation of a virtual-reality robotic simulator.
Lendvay, Thomas S; Casale, Pasquale; Sweet, Robert; Peters, Craig
2008-09-01
Robotic surgery is an accepted adjunct to minimally invasive surgery, but training is restricted to console time. Virtual-reality (VR) simulation has been shown to be effective for laparoscopic training, and so we sought to validate a novel VR robotic simulator. The American Urological Association (AUA) Office of Education approved this study. Subjects enrolled in a robotics training course at the 2007 AUA annual meeting underwent skills training in a da Vinci dry-lab module and a virtual-reality robotics module which included a three-dimensional (3D) VR robotic simulator. Demographic and acceptability data were obtained, and performance metrics from the simulator were compared between experienced and nonexperienced roboticists for a ring transfer task. Fifteen subjects participated, four with previous robotic surgery experience and 11 without. Nine subjects were still in urology training, and nearly half of the group reported playing video games. Overall performance of the da Vinci system and the simulator was deemed acceptable by Likert scale (0-6) ratings of 5.23 and 4.69, respectively. Experienced subjects outperformed nonexperienced subjects on the simulator on three metrics: total task time (96 s versus 159 s, P < 0.02), economy of motion (1,301 mm versus 2,095 mm, P < 0.04), and time the telemanipulators spent outside of the center of the platform's workspace (4 s versus 35 s, P < 0.02). This is the first demonstration of face and construct validity of a virtual-reality robotic simulator. Further studies assessing predictive validity are ultimately required to support incorporation of VR robotic simulation into training curricula.
Li, Liang; Yang, Jian; Chu, Yakui; Wu, Wenbo; Xue, Jin; Liang, Ping; Chen, Lei
2016-01-01
Objective: To verify the reliability and clinical feasibility of a self-developed navigation system based on an augmented reality technique for endoscopic sinus and skull base surgery. Materials and Methods: In this study we performed head phantom and cadaver experiments to determine the display effect and accuracy of our navigation system. In cadaver head-based simulated operations, we compared the target registration error, operation time, and National Aeronautics and Space Administration Task Load Index scores of our navigation system with those of conventional navigation systems. Results: The navigation system developed in this study has a novel display mode capable of fusing endoscopic images to three-dimensional (3-D) virtual images. In the cadaver head experiment, the target registration error was 1.28 ± 0.45 mm, which met the accepted standards of a navigation system used for nasal endoscopic surgery. Compared with conventional navigation systems, the new system was more effective in terms of operation time and the mental workload of surgeons, which is especially important for less experienced surgeons. Conclusion: The self-developed augmented reality navigation system for endoscopic sinus and skull base surgery appears to have advantages that outweigh those of conventional navigation systems. We conclude that this navigation system will provide rhinologists with more intuitive and more detailed imaging information, thus reducing the judgment time and mental workload of surgeons when performing complex sinus and skull base surgeries. Ultimately, this new navigation system has the potential to increase the quality of surgeries. In addition, the augmented reality navigation system could be of interest to junior doctors being trained in endoscopic techniques because it could speed up their learning. However, it should be noted that the navigation system serves as an adjunct to a surgeon's skills and knowledge, not as a substitute. PMID:26757365
A radiation-free mixed-reality training environment and assessment concept for C-arm-based surgery.
Stefan, Philipp; Habert, Séverine; Winkler, Alexander; Lazarovici, Marc; Fürmetz, Julian; Eck, Ulrich; Navab, Nassir
2018-06-25
The discrepancy between continuously decreasing opportunities for clinical training and assessment and the increasing complexity of surgical interventions has led to the development of different training and assessment options such as anatomical models, computer-based simulators and cadaver trainings. However, trainees still face a steep learning curve when they move from training and assessment to actual patient treatment. To address this problem for C-arm-based surgery, we introduce a realistic radiation-free simulation system that combines patient-based 3D printed anatomy and simulated X-ray imaging using a physical C-arm. To explore the fidelity and usefulness of the proposed mixed-reality system for training and assessment, we conducted a user study with six surgical experts performing a facet joint injection on the simulator. In a technical evaluation, we show that our system simulates X-ray images accurately with an RMSE of 1.85 mm compared to real X-ray imaging. The participants expressed agreement with the overall realism of the simulation and the usefulness of the system for assessment, and strong agreement with the usefulness of such a mixed-reality system for training of novices and experts. In a quantitative analysis, we furthermore evaluated the suitability of the system for the assessment of surgical skills and gathered preliminary evidence for validity. The proposed mixed-reality simulation system facilitates the transition to C-arm-based surgery and has the potential to complement or even replace large parts of cadaver training, to provide a safe assessment environment and to reduce the risk of errors when proceeding to patient treatment. We propose an assessment concept and outline the steps necessary to expand the system into a test instrument that provides reliable and justified assessment scores indicative of surgical proficiency, with sufficient evidence for validity.
Deterministic chaotic dynamics of Raba River flow (Polish Carpathian Mountains)
NASA Astrophysics Data System (ADS)
Kędra, Mariola
2014-02-01
Is the underlying dynamics of river flow random or deterministic? If it is deterministic, is it deterministically chaotic? This issue is still controversial. The application of several independent methods, techniques and tools to daily river flow data gives consistent, reliable and clear-cut answers to these questions. The outcomes indicate that the investigated discharge dynamics is not random but deterministic. Moreover, the results fully confirm the nonlinear deterministic chaotic nature of the studied process. The research was conducted on daily discharge from two selected gauging stations of a mountain river in southern Poland, the Raba River.
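The abstract does not name the individual methods, but one standard test for low-dimensional determinism in a discharge series is the Grassberger-Procaccia correlation sum over a time-delay embedding: if log C(r) scales linearly with log r, the slope estimates a (low) correlation dimension. A minimal sketch for short series (the pairwise-distance computation is O(n^2), so this is illustrative rather than production code):

```python
import numpy as np

def correlation_sum(series, dim, delay, radii):
    """Correlation sum C(r) of a delay embedding of a scalar time series."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * delay
    # Embedding: row k is (x_k, x_{k+delay}, ..., x_{k+(dim-1)*delay})
    emb = np.column_stack([series[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    pairs = dists[np.triu_indices(n, k=1)]
    return np.array([(pairs < r).mean() for r in radii])
```

In practice such an estimate would be cross-checked against surrogate (randomized) series, since a clear scaling region, not the statistic alone, is what supports a claim of deterministic chaos.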
Parker, Melissa; Allen, Tim
2014-01-01
Large amounts of funding are being allocated to the control of neglected tropical diseases. Strategies primarily rely on the mass distribution of drugs to adults and children living in endemic areas. The approach is presented as morally appropriate, technically effective, and context-free. Drawing on research undertaken in East Africa, we discuss ways in which normative ideas about global health programs are used to set aside social and biological evidence. In particular, there is a tendency to ignore local details, including information about actual drug take up. Ferguson’s ‘anti-politics’ thesis is a useful starting point for analyzing why this happens, but is overly deterministic. Anti-politics discourse about healing the suffering poor may shape thinking and help explain cognitive dissonance. However, use of such discourse is also a means of strategically promoting vested interests and securing funding. Whatever the underlying motivations, rhetoric and realities are conflated, with potentially counterproductive consequences. PMID:24761976
Proteomics of the Human Placenta: Promises and Realities
Robinson, J.M.; Ackerman, W.E.; Kniss, D.A.; Takizawa, T.; Vandré, D.D.
2015-01-01
Proteomics is an area of study that sets as its ultimate goal the global analysis of all of the proteins expressed in a biological system of interest. However, technical limitations currently hamper proteome-wide analyses of complex systems. In a more practical sense, a desired outcome of proteomics research is the translation of large protein data sets into formats that provide meaningful information regarding clinical conditions (e.g., biomarkers to serve as diagnostic and/or prognostic indicators of disease). Herein, we discuss placental proteomics by describing existing studies, pointing out their strengths and weaknesses. In so doing, we strive to inform investigators interested in this area of research about the current gap between hyperbolic promises and realities. Additionally, we discuss the utility of proteomics in discovery-based research, particularly as regards the capacity to unearth novel insights into placental biology. Importantly, when considering understudied systems such as the human placenta and diseases associated with abnormalities in placental function, proteomics can serve as a robust ‘shortcut’ to obtaining information unlikely to be garnered using traditional approaches. PMID:18222537
The stress and workload of virtual reality training: the effects of presence, immersion and flow.
Lackey, S J; Salcedo, J N; Szalma, J L; Hancock, P A
2016-08-01
The present investigation evaluated the effects of virtual reality (VR) training on the performance, perceived workload and stress response to a live training exercise in a sample of Soldiers. We also examined the relationship between perceptions of that same VR, as measured by engagement, immersion, presence, flow, perceived utility and ease of use, and the performance, workload and stress reported on the live training task. To a degree, these latter relationships were moderated by task performance, as measured by binary (Go/No-Go) ratings. Participants who reported positive VR experiences also tended to experience lower stress and lower workload when performing the live version of the task. Thus, VR training regimens may be efficacious for mitigating the stress and workload associated with criterion tasks, thereby reducing the ultimate likelihood of real-world performance failure. Practitioner Summary: VR provides opportunities for training in artificial worlds comprising highly realistic features. Our virtual room-clearing scenario facilitated the integration of Training and Readiness objectives and satisfied training doctrine obligations in a compelling, engaging experience for both novice and experienced trainees.
Oele, Marjolein
2018-01-17
This paper contends, following Plato and Broekman, that (1) seeing images as images is crucial to theorizing medicine and that (2) considering clinical pictures as images of images is a much-needed epistemic complement to the domineering view that sees clinical pictures as mirrors of disease. This offers not only epistemic but also ethical benefits to individual patients, especially in those cases where patients suffer from chronic, debilitating, and terminal illnesses and where medicine provides no, or limited, answers in terms of treatment, intervention, and meaning. By creating room for a theory of clinical pictures that rightfully emphasizes their pictorial nature, patients and doctors alike may be encouraged to consider under what authorship, and with which epistemic tools, alternative, supplemental images may be produced to get at the existential reality of disease and suffering. Ultimately, this paper argues that the epistemic tools provided by aesthetics may offer such glimpses into the reality of disease and suffering, and I conclude by discussing a few artistic renditions of breast cancer to illustrate my point.
On Mathematical Modeling Of Quantum Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Achuthan, P.; Dept. of Mathematics, Indian Institute of Technology, Madras, 600 036; Narayanankutty, Karuppath
2009-07-02
The world of physical systems at the most fundamental levels is replete with efficient, interesting models possessing sufficient ability to represent reality to a considerable extent. So far, quantum mechanics (QM), forming the basis of almost all natural phenomena, has found beyond doubt its intrinsic ingenuity, capacity and robustness to stand the rigorous tests of validity from and through appropriate calculations and experiments. No serious failures of quantum mechanical predictions have been reported yet. However, Albert Einstein, the greatest theoretical physicist of the twentieth century, and some other eminent men of science have stated firmly and categorically that QM, though successful by and large, is incomplete. There are classical and quantum reality models, including those based on consciousness. Relativistic quantum theoretical approaches to clearly understand the ultimate nature of matter as well as radiation still have much to accomplish in order to qualify for a final theory of everything (TOE). Mathematical models of better, suitable character and strength are needed to achieve a satisfactory explanation of natural processes and phenomena. We, in this paper, discuss some of these matters with certain apt illustrations as well.
Schlesinger, Matthew
2015-12-01
The interface theory offers a rich blend of logic and mathematical modeling with a dash of evolutionary story-telling, leading to the conclusion that perceptual experience and physical reality are only loosely related. Is the theory convincing? I would have to say "almost": although it certainly has many elements working in its favor, I ultimately found that some important questions were ignored or left unanswered (e.g., a more fully articulated account of how evolutionary mechanisms operate on perception). I am quite optimistic that the next iteration of the theory will be able to address these issues.
Virtual Reality Simulation of the Effects of Microgravity in Gastrointestinal Physiology
NASA Technical Reports Server (NTRS)
Compadre, Cesar M.
1998-01-01
The ultimate goal of this research is to create an anatomically accurate three-dimensional (3D) simulation model of the effects of microgravity on gastrointestinal physiology and to explore the role that such changes may have in the pharmacokinetics of drugs given to space crews for prevention or therapy. To accomplish this goal, the specific aims of this research are: 1) to generate complete 3-D reconstructions of the human gastrointestinal (GI) tract of the male and female Visible Humans; and 2) to develop and implement time-dependent computer algorithms to simulate GI motility using the above 3-D reconstructions.
Neurology education: current and emerging concepts in residency and fellowship training.
Stern, Barney J; Józefowicz, Ralph F; Kissela, Brett; Lewis, Steven L
2010-05-01
This article discusses the current and future state of neurology training. A priority is to attract sufficient numbers of qualified candidates for the existing residency programs. A majority of neurology residents elects additional training in a neurologic subspecialty, and programs will have to be accredited accordingly. Attempts are being made to standardize and strengthen the existing general residency and subspecialty programs through cooperative efforts. Ultimately, residency programs must comply with the increasing requirements and try to adapt these requirements to the unique demands and realities of neurology training. An effort is underway to establish consistent competency-testing methods. Copyright 2010 Elsevier Inc. All rights reserved.
Galán, S F; Aguado, F; Díez, F J; Mira, J
2002-07-01
The spread of cancer is a non-deterministic dynamic process. As a consequence, the design of an assistant system for the diagnosis and prognosis of the extent of a cancer should be based on a representation method that deals with both uncertainty and time. The ultimate goal is to know the stage of development of a cancer in a patient before selecting the appropriate treatment. A network of probabilistic events in discrete time (NPEDT) is a type of Bayesian network for temporal reasoning that models the causal mechanisms associated with the time evolution of a process. This paper describes NasoNet, a system that applies NPEDTs to the diagnosis and prognosis of nasopharyngeal cancer. We have made use of temporal noisy gates to model the dynamic causal interactions that take place in the domain. The methodology we describe is general enough to be applied to any other type of cancer.
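As a hedged illustration of what a "temporal noisy gate" computes, the sketch below extends the familiar noisy-OR assumption over discrete time: once a cause occurs, each later delay carries an independent chance of its producing the effect. NasoNet's actual parameterization is not given in the abstract, so the form and names here are assumptions:

```python
import numpy as np

def temporal_noisy_or(cause_times, link_delay_probs, horizon):
    """P(effect has occurred by step t) under a noisy-OR over discrete time.

    cause_times: dict cause -> time step at which that cause occurs.
    link_delay_probs: dict cause -> sequence p[d], the probability that the
        cause produces the effect with delay d (illustrative form only).
    """
    occurred_by = np.zeros(horizon)
    for t in range(horizon):
        survive = 1.0  # probability no (cause, delay) link has fired yet
        for cause, t0 in cause_times.items():
            for d, pd in enumerate(link_delay_probs[cause]):
                if t0 + d <= t:
                    survive *= (1.0 - pd)
        occurred_by[t] = 1.0 - survive
    return occurred_by

# Hypothetical example: a primary focus at step 0 and a nodal focus at step 2.
print(temporal_noisy_or({"tumor": 0, "node": 2},
                        {"tumor": [0.1, 0.2, 0.2], "node": [0.3, 0.3]},
                        horizon=6))
```

Chaining such gates through a discrete-time network is what lets an NPEDT answer staging and prognosis queries as probabilities over when, not just whether, spread events occur.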
Effects of Moist Convection on Hurricane Predictability
NASA Technical Reports Server (NTRS)
Zhang, Fuqing; Sippel, Jason A.
2008-01-01
This study exemplifies inherent uncertainties in deterministic prediction of hurricane formation and intensity. Such uncertainties could ultimately limit the predictability of hurricanes at all time scales. In particular, this study highlights the predictability limit due to the effects on moist convection of initial-condition errors with amplitudes far smaller than those of any observation or analysis system. Not only can small and arguably unobservable differences in the initial conditions result in different routes to tropical cyclogenesis, but they can also determine whether or not a tropical disturbance will significantly develop. The details of how the initial vortex is built can depend on chaotic interactions of mesoscale features, such as cold pools from moist convection, whose timing and placement may significantly vary with minute initial differences. Inherent uncertainties in hurricane forecasts illustrate the need for developing advanced ensemble prediction systems to provide event-dependent probabilistic forecasts and risk assessment.
NASA Technical Reports Server (NTRS)
Pilkey, W. D.; Wang, B. P.; Yoo, Y.; Clark, B.
1973-01-01
A description and applications of a computer capability for determining the ultimate optimal behavior of a dynamically loaded structural-mechanical system are presented. This capability provides the characteristics of the theoretically best, or limiting, design concept according to response criteria dictated by design requirements. Equations of motion of the system, in first- or second-order form, include incompletely specified elements whose characteristics are determined in the optimization of one or more performance indices subject to the response criteria in the form of constraints. The system is subject to deterministic transient inputs, and the computer capability is designed to operate with a large, off-the-shelf linear programming software package which performs the desired optimization. The report contains user-oriented program documentation in an engineering, problem-oriented form. Applications cover a wide variety of dynamics problems, including those associated with such diverse configurations as a missile-silo system, impacting freight cars, and an aircraft ride control system.
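The "limiting performance" idea reduces to a linear program once time is discretized, because the response is linear in the unspecified element's force history. A toy sketch under assumed data (a unit mass with a crude discrete double-integrator response, an assumed transient load, and illustrative bounds; not the report's actual formulation):

```python
import numpy as np
from scipy.optimize import linprog

m, dt, n, u_max = 1.0, 0.02, 100, 0.5
f = np.sin(np.linspace(0.0, 2.0 * np.pi, n))         # assumed transient input
kk, jj = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
H = np.where(kk > jj, (kk - jj) * dt * dt / m, 0.0)  # x = H @ (f + u)

# Variables: control history u (n values) and peak-displacement bound z.
c = np.r_[np.zeros(n), 1.0]                          # minimize z
A_ub = np.block([[H, -np.ones((n, 1))],              #  x_k <= z
                 [-H, -np.ones((n, 1))]])            # -x_k <= z
b_ub = np.r_[-H @ f, H @ f]
bounds = [(-u_max, u_max)] * n + [(0.0, None)]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print("limiting (best achievable) peak displacement:", res.fun)
```

The optimum is a bound no realizable element can beat, which is exactly the "theoretically best, or limiting, design concept" the report uses as a benchmark.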
Capturing planar shapes by approximating their outlines
NASA Astrophysics Data System (ADS)
Sarfraz, M.; Riyazuddin, M.; Baig, M. H.
2006-05-01
A non-deterministic evolutionary approach for approximating the outlines of planar shapes has been developed. Non-uniform rational B-splines (NURBS) are used as the underlying curve approximation scheme, and the simulated annealing heuristic serves as the evolutionary methodology. In addition to independent studies of the optimization of the weight and knot parameters of the NURBS, a separate scheme has also been developed for the simultaneous optimization of weights and knots. The optimized NURBS models are fitted over the contour data of the planar shapes to produce the final output automatically. The output results are visually pleasing with respect to the threshold provided by the user. A web-based system has also been developed for effective, worldwide use. The objective of this system is to let users anywhere visualize the output over the internet, with the freedom to set the various desired input parameters of the algorithm.
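The annealing layer is generic; only the error function ties it to NURBS fitting. A minimal loop, assuming some fit_error(x) measuring the distance between the digitized outline and the NURBS curve defined by the weight/knot vector x (the paper's move scheme and cooling schedule are not specified, so these are placeholders):

```python
import numpy as np

def simulated_annealing(fit_error, x0, step=0.05, t0=1.0, cooling=0.995,
                        iters=5000, rng=None):
    """Generic simulated annealing over a NURBS parameter vector."""
    rng = rng or np.random.default_rng(0)
    x, e, temp = np.array(x0, dtype=float), fit_error(x0), t0
    best_x, best_e = x.copy(), e
    for _ in range(iters):
        cand = x + rng.normal(scale=step, size=x.shape)  # perturb weights/knots
        ec = fit_error(cand)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if ec < e or rng.random() < np.exp((e - ec) / max(temp, 1e-12)):
            x, e = cand, ec
            if e < best_e:
                best_x, best_e = x.copy(), e
        temp *= cooling
    return best_x, best_e
```

Optimizing weights and knots simultaneously, as the paper's third scheme does, simply means letting x concatenate both parameter sets while fit_error rebuilds the curve from the split vector.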
NASA Technical Reports Server (NTRS)
Russell, Rick; Grundy, David; Jablonski, David; Martin, Christopher; Washabaugh, Andrew; Goldfine, Neil
2011-01-01
Three mechanisms affect the life of a COPV: a) the age life of the overwrap; b) cyclic fatigue of the metallic liner; and c) stress rupture life. The first two mechanisms are understood through test and analysis. A COPV stress rupture is a sudden and catastrophic failure of the overwrap while holding at a stress level below the ultimate strength for an extended time. Currently there is no simple, deterministic method of determining the stress rupture life of a COPV, nor a screening technique to determine if a particular COPV is close to the time of a stress rupture failure. Conclusions: a correlation between MWM response and pressure or strain was demonstrated, as was the ability to monitor stress in a COPV at different orientations and depths; FA41 provides the best correlation with bottle pressure or stress.
A microbased shared virtual world prototype
NASA Technical Reports Server (NTRS)
Pitts, Gerald; Robinson, Mark; Strange, Steve
1993-01-01
Virtual reality (VR) allows sensory immersion and interaction with a computer-generated environment. The user adopts a physical interface with the computer, through input/output devices such as a head-mounted display, data glove, mouse, keyboard, or monitor, to experience an alternate universe. What this means is that the computer generates an environment which, in its ultimate extension, becomes indistinguishable from the real world. 'Imagine a wraparound television with three-dimensional programs, including three-dimensional sound, and solid objects that you can pick up and manipulate, even feel with your fingers and hands.... 'Imagine that you are the creator as well as the consumer of your artificial experience, with the power to use a gesture or word to remold the world you see and hear and feel. That part is not fiction... three-dimensional computer graphics, input/output devices, computer models that constitute a VR system make it possible, today, to immerse yourself in an artificial world and to reach in and reshape it.' The goal of our research was to propose a feasibility experiment in the construction of a networked virtual reality system using current personal computer (PC) technology. The prototype was built with the Borland C compiler, running on an IBM 486 33 MHz and a 386 33 MHz. Each game is currently represented as an IPX client on a non-dedicated Novell server. We initially posed two questions: (1) Is there a need for networked virtual reality? (2) In what ways can the technology be made available to the most people possible?
Fast mental states decoding in mixed reality.
De Massari, Daniele; Pacheco, Daniel; Malekshahi, Rahim; Betella, Alberto; Verschure, Paul F M J; Birbaumer, Niels; Caria, Andrea
2014-01-01
The combination of Brain-Computer Interface (BCI) technology, allowing online monitoring and decoding of brain activity, with virtual and mixed reality (MR) systems may help to shape and guide implicit and explicit learning using ecological scenarios. Real-time information of ongoing brain states acquired through BCI might be exploited for controlling data presentation in virtual environments. Brain states discrimination during mixed reality experience is thus critical for adapting specific data features to contingent brain activity. In this study we recorded electroencephalographic (EEG) data while participants experienced MR scenarios implemented through the eXperience Induction Machine (XIM). The XIM is a novel framework modeling the integration of a sensing system that evaluates and measures physiological and psychological states with a number of actuators and effectors that coherently react to the user's actions. We then assessed continuous EEG-based discrimination of spatial navigation, reading and calculation performed in MR, using linear discriminant analysis (LDA) and support vector machine (SVM) classifiers. Dynamic single trial classification showed high accuracy of LDA and SVM classifiers in detecting multiple brain states as well as in differentiating between high and low mental workload, using a 5 s time-window shifting every 200 ms. Our results indicate overall better performance of LDA with respect to SVM and suggest applicability of our approach in a BCI-controlled MR scenario. Ultimately, successful prediction of brain states might be used to drive adaptation of data representation in order to boost information processing in MR.
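A hedged sketch of such a sliding-window decoder, using scikit-learn's LDA and a linear SVM on placeholder log-variance features (sampling rate, feature choice and the synthetic data are assumptions; the study's actual EEG pipeline is not reproduced here):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

fs, n_ch = 250, 16                       # assumed sampling rate and channel count
win, step = 5 * fs, int(0.2 * fs)        # 5 s window shifted every 200 ms

def window_features(eeg):
    """eeg: (n_channels, n_samples) -> one log-variance row per window."""
    feats = [np.log(eeg[:, s:s + win].var(axis=1))
             for s in range(0, eeg.shape[1] - win + 1, step)]
    return np.asarray(feats)

rng = np.random.default_rng(0)
X = window_features(rng.normal(size=(n_ch, 60 * fs)))  # placeholder recording
y = rng.integers(0, 3, size=len(X))      # navigation / reading / calculation

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="linear"))]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```

On real labeled data the same scaffold yields the per-window accuracies the study reports; on this random placeholder both classifiers hover at chance, as they should.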
Dynamically consistent parameterization of mesoscale eddies. Part III: Deterministic approach
NASA Astrophysics Data System (ADS)
Berloff, Pavel
2018-07-01
This work continues the development of dynamically consistent parameterizations for representing mesoscale eddy effects in non-eddy-resolving and eddy-permitting ocean circulation models, and focuses on the classical double-gyre problem, in which the main dynamic eddy effects maintain the eastward jet extension of the western boundary currents and its adjacent recirculation zones via the eddy backscatter mechanism. Despite its fundamental importance, this mechanism remains poorly understood, and in this paper we first study it and then propose and test a novel parameterization of it. We start by decomposing the reference eddy-resolving flow solution into large-scale and eddy components defined by spatial filtering, rather than by the Reynolds decomposition. Next, we find that the eastward jet and its recirculations are robustly present not only in the large-scale flow itself, but also in the rectified time-mean eddies, and in the transient rectified eddy component, which consists of highly anisotropic ribbons of opposite-sign potential vorticity anomalies straddling the instantaneous eastward jet core and being responsible for its continuous amplification. The transient rectified component is separated from the flow by a novel remapping method. We hypothesize that the above three components of the eastward jet are ultimately driven by the small-scale transient eddy forcing via the eddy backscatter mechanism, rather than by the mean eddy forcing and large-scale nonlinearities. We verify this hypothesis by progressively turning down the backscatter and observing the induced flow anomalies. The backscatter analysis leads us to the key eddy parameterization hypothesis: in an eddy-permitting model, at least partially resolved eddy backscatter can be significantly amplified to improve the flow solution. Such amplification is a simple and novel eddy parameterization framework, implemented here in terms of local, deterministic flow roughening controlled by a single parameter. We test the parameterization skill in a hierarchy of non-eddy-resolving and eddy-permitting modifications of the original model and demonstrate that it can indeed be highly efficient for restoring the eastward jet extension and its adjacent recirculation zones. The new deterministic parameterization framework not only combines remarkable simplicity with good performance but is also dynamically transparent; therefore, it provides a powerful alternative to the common eddy diffusion and emerging stochastic parameterizations.
NASA Astrophysics Data System (ADS)
Pankratov, Oleg; Kuvshinov, Alexey
2016-01-01
Despite impressive progress in the development and application of electromagnetic (EM) deterministic inverse schemes to map the 3-D distribution of electrical conductivity within the Earth, there is one question which remains poorly addressed: uncertainty quantification of the recovered conductivity models. Apparently, only an inversion based on a statistical approach provides a systematic framework to quantify such uncertainties. The Metropolis-Hastings (M-H) algorithm is the most popular technique for sampling the posterior probability distribution that describes the solution of the statistical inverse problem. However, all statistical inverse schemes require an enormous amount of forward simulations and thus appear to be extremely demanding computationally, if not prohibitive, if a 3-D setup is invoked. This urges development of fast and scalable 3-D modelling codes which can run large-scale 3-D models of practical interest in fractions of a second on high-performance multi-core platforms. But, even with these codes, the challenge for M-H methods is to construct proposal functions that simultaneously provide a good approximation of the target density function while being inexpensive to sample. In this paper we address both of these issues. First we introduce a variant of the M-H method which uses information about the local gradient and Hessian of the penalty function. This, in particular, allows us to exploit adjoint-based machinery that has been instrumental for the fast solution of deterministic inverse problems. We explain why this modification of M-H significantly accelerates sampling of the posterior probability distribution. In addition we show how Hessian handling (inverse, square root) can be made practicable by a low-rank approximation using the Lanczos algorithm. Ultimately we discuss uncertainty analysis based on stochastic inversion results. In addition, we demonstrate how this analysis can be performed within a deterministic approach. In the second part, we summarize modern trends in the development of efficient 3-D EM forward modelling schemes with special emphasis on recent advances in the integral equation approach.
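The gradient-informed proposal described here is in the spirit of Langevin-type Metropolis-Hastings. A gradient-only sketch of one such step follows; the authors' scheme additionally folds in Hessian information via a low-rank Lanczos approximation, which this sketch omits:

```python
import numpy as np

def mala_step(x, log_post, grad, eps, rng):
    """One Metropolis-adjusted Langevin step.

    log_post(x): log posterior density (negative penalty function);
    grad(x): its gradient, e.g. from an adjoint solve; eps: step size.
    """
    mean_x = x + 0.5 * eps**2 * grad(x)
    prop = mean_x + eps * rng.normal(size=x.shape)
    mean_p = prop + 0.5 * eps**2 * grad(prop)

    def log_q(a, mean):                 # Gaussian proposal density (up to const)
        return -np.sum((a - mean) ** 2) / (2.0 * eps**2)

    # Asymmetric proposals enter the M-H acceptance ratio explicitly.
    log_alpha = (log_post(prop) + log_q(x, mean_p)
                 - log_post(x) - log_q(prop, mean_x))
    return prop if np.log(rng.random()) < log_alpha else x
```

Drifting proposals toward high-probability regions is what cuts the number of forward (and adjoint) simulations needed per effective posterior sample, the bottleneck the paper highlights.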
Erikson, Li; Barnard, Patrick; O'Neill, Andrea; Wood, Nathan J.; Jones, Jeanne M.; Finzi Hart, Juliette; Vitousek, Sean; Limber, Patrick; Hayden, Maya; Fitzgibbon, Michael; Lovering, Jessica; Foxgrover, Amy C.
2018-01-01
This paper is the second of two that describes the Coastal Storm Modeling System (CoSMoS) approach for quantifying physical hazards and socio-economic hazard exposure in coastal zones affected by sea-level rise and changing coastal storms. The modelling approach, presented in Part 1, downscales atmospheric global-scale projections to local scale coastal flood impacts by deterministically computing the combined hazards of sea-level rise, waves, storm surges, astronomic tides, fluvial discharges, and changes in shoreline positions. The method is demonstrated through an application to Southern California, United States, where the shoreline is a mix of bluffs, beaches, highly managed coastal communities, and infrastructure of high economic value. Results show that inclusion of 100-year projected coastal storms will increase flooding by 9–350% (an additional average 53.0 ± 16.0 km2) in addition to a 25–500 cm sea-level rise. The greater flooding extents translate to a 55–110% increase in residential impact and a 40–90% increase in building replacement costs. To communicate hazards and ranges in socio-economic exposures to these hazards, a set of tools were collaboratively designed and tested with stakeholders and policy makers; these tools consist of two web-based mapping and analytic applications as well as virtual reality visualizations. To reach a larger audience and enhance usability of the data, outreach and engagement included workshop-style trainings for targeted end-users and innovative applications of the virtual reality visualizations.
Varley, Adam; Tyler, Andrew; Smith, Leslie; Dale, Paul; Davies, Mike
2015-07-15
The extensive use of radium during the 20th century for industrial, military and pharmaceutical purposes has led to a large number of contaminated legacy sites across Europe and North America. Sites that pose a high risk to the general public can present expensive and long-term remediation projects. Often the most pragmatic remediation approach is routine monitoring with gamma-ray detectors to identify, in real time, the signal from the most hazardous heterogeneous contamination (hot particles), thus facilitating their removal and safe disposal. However, current detection systems do not fully utilise all spectral information, resulting in low detection rates and ultimately an increased risk to human health. The aim of this study was to establish an optimised detector-algorithm combination. To achieve this, field data were collected using two handheld detectors (sodium iodide and lanthanum bromide) and a number of Monte Carlo simulated hot particles were randomly injected into the field data. This allowed the detection rates of conventional deterministic (gross counts) and machine learning (neural networks and support vector machines) algorithms to be assessed. The results demonstrated that a neural network operated on a sodium iodide detector provided the best detection capability. Compared to deterministic approaches, this optimised detection system could detect a hot particle on average 10 cm deeper into the soil column, or with half of the activity at the same depth. It was also found that noise from the detector's internal contamination limited the suitability of lanthanum bromide for this application. Copyright © 2015. Published by Elsevier B.V.
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied to each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts, and with regard to processing time. Two sets of regions of interest (ROIs) guided deterministic tractography of the DRTT and the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking, but lasted substantially longer. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent, but consistently more posterior, to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although the sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Deterministic quantum dense coding networks
NASA Astrophysics Data System (ADS)
Roy, Saptarshi; Chanda, Titas; Das, Tamoghna; Sen(De), Aditi; Sen, Ujjwal
2018-07-01
We consider the scenario of deterministic classical information transmission between multiple senders and a single receiver, when they a priori share a multipartite quantum state: an attempt towards building a deterministic dense coding network. Specifically, we prove that in the case of two or three senders and a single receiver, generalized Greenberger-Horne-Zeilinger (gGHZ) states are not beneficial for sending classical information deterministically beyond the classical limit, except when the shared state is the GHZ state itself. On the other hand, three- and four-qubit generalized W (gW) states with specific parameters, as well as the four-qubit Dicke states, can provide a quantum advantage in deterministic dense coding. Interestingly, however, numerical simulations in the three-qubit scenario reveal that the percentage of states from the GHZ class that are deterministically dense-codeable is higher than that of states from the W class.
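For reference, the classical limit under discussion comes from the textbook two-party protocol: local Pauli operations on one half of a shared Bell state reach all four orthogonal Bell states, so one transmitted qubit carries two classical bits. With |Φ±⟩ = (|00⟩ ± |11⟩)/√2 and |Ψ±⟩ = (|01⟩ ± |10⟩)/√2, the sender's encoding map is

```latex
(\mathbb{1}\otimes\mathbb{1})\,|\Phi^{+}\rangle = |\Phi^{+}\rangle, \qquad
(\sigma_{x}\otimes\mathbb{1})\,|\Phi^{+}\rangle = |\Psi^{+}\rangle, \qquad
(\sigma_{z}\otimes\mathbb{1})\,|\Phi^{+}\rangle = |\Phi^{-}\rangle, \qquad
(i\sigma_{y}\otimes\mathbb{1})\,|\Phi^{+}\rangle = |\Psi^{-}\rangle .
```

The paper's question is when multi-sender generalizations of this encoding remain deterministic, which is where the contrast between gGHZ and gW resources arises.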
Principles to enable leaders to navigate the harsh realities of crisis and risk communication.
Reynolds, Barbara J
2010-07-01
Leadership during a crisis that involves the physical safety and emotional or financial wellbeing of those being led offers an intense environment that may not allow for on-the-job training. One of the challenges faced by crisis leaders is to communicate effectively the courses of action needed to allow for a reduction of harm to individuals and the ultimate restoration of the group, organisation or community. The six principles of crisis and emergency risk communication (CERC) give leaders tools to navigate the harsh realities of speaking to employees, media, partners and stakeholders during an intense crisis. CERC also helps leaders to avoid the five most common communication mistakes during crises. Much of the harmful individual and group behaviour predicted in a profound crisis can be mitigated with effective crisis and emergency risk communication. A leader must anticipate what mental stresses followers will be experiencing and apply appropriate communication strategies to attempt to manage these stresses among staff or the public and preserve or repair the organisation's reputation. In an emergency, the right message at the right time is a 'resource multiplier' - it helps leaders to get their job done.
The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy.
O'Brolcháin, Fiachra; Jacquemard, Tim; Monaghan, David; O'Connor, Noel; Novitzky, Peter; Gordijn, Bert
2016-02-01
The rapid evolution of information, communication and entertainment technologies will transform the lives of citizens and ultimately transform society. This paper focuses on ethical issues associated with the likely convergence of virtual realities (VR) and social networks (SNs), hereafter VRSNs. We examine a scenario in which a significant segment of the world's population has a presence in a VRSN. Given the pace of technological development and the popularity of these new forms of social interaction, this scenario is plausible. However, it brings with it ethical problems. Two central ethical issues are addressed: those of privacy and those of autonomy. VRSNs pose threats to both privacy and autonomy. The threats to privacy can be broadly categorized as threats to informational privacy, threats to physical privacy, and threats to associational privacy. Each of these threats is further subdivided. The threats to autonomy can be broadly categorized as threats to freedom, to knowledge and to authenticity. Again, these three threats are divided into subcategories. Having categorized the main threats posed by VRSNs, a number of recommendations are provided so that policy-makers, developers, and users can make the best possible use of VRSNs.
Toward Scientific Numerical Modeling
NASA Technical Reports Server (NTRS)
Kleb, Bil
2007-01-01
Ultimately, scientific numerical models need quantified output uncertainties so that modeling can evolve to better match reality. Documenting model input uncertainties and verifying that numerical models are translated into code correctly, however, are necessary first steps toward that goal. Without known input parameter uncertainties, model sensitivities are all one can determine, and without code verification, output uncertainties are simply not reliable. To address these two shortcomings, two proposals are offered: (1) an unobtrusive mechanism to document input parameter uncertainties in situ and (2) an adaptation of the Scientific Method to numerical model development and deployment. Because these two steps require changes in the computational simulation community to bear fruit, they are presented in terms of the Beckhard-Harris-Gleicher change model.
On the Existence and Uniqueness of the Scientific Method.
Wagensberg, Jorge
2014-01-01
The ultimate utility of science is widely agreed upon: the comprehension of reality. But there is much controversy about what scientific understanding actually means, and how we should proceed in order to gain new scientific understanding. Is there a method for acquiring new scientific knowledge? Is this method unique and universal? There has been no shortage of proposals, but neither has there been a shortage of skeptics about these proposals. This article proffers for discussion a potential scientific method that aspires to be unique and universal and is rooted in the recent and ancient history of scientific thinking. Curiously, conclusions can be inferred from this scientific method that also concern education and the transmission of science to others.
Brazhnik, Olga; Jones, John F.
2007-01-01
Producing reliable information is the ultimate goal of data processing. The ocean of data created with the advances of science and technologies calls for integration of data coming from heterogeneous sources that are diverse in their purposes, business rules, underlying models and enabling technologies. Reference models, Semantic Web, standards, ontology, and other technologies enable fast and efficient merging of heterogeneous data, while the reliability of produced information is largely defined by how well the data represent the reality. In this paper we initiate a framework for assessing the informational value of data that includes data dimensions; aligning data quality with business practices; identifying authoritative sources and integration keys; merging models; uniting updates of varying frequency and overlapping or gapped data sets. PMID:17071142
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weerakkody, Sean; Liu, Xiaofei; Sinopoli, Bruno
We consider the design and analysis of robust distributed control systems (DCSs) to ensure the detection of integrity attacks. DCSs are often managed by independent agents and are implemented using a diverse set of sensors and controllers. However, the heterogeneous nature of DCSs, along with their scale, leaves such systems vulnerable to adversarial behavior. To mitigate this reality, we provide tools that allow operators to prevent zero dynamics attacks when as many as p agents and sensors are corrupted. Such a design ensures attack detectability in deterministic systems while removing the threat of a class of stealthy attacks in stochastic systems. To achieve this goal, we use graph theory to obtain necessary and sufficient conditions for the presence of zero dynamics attacks in terms of the structural interactions between agents and sensors. We then formulate and solve optimization problems which minimize communication networks while also ensuring a resource-limited adversary cannot perform a zero dynamics attack. Polynomial-time algorithms for design and analysis are provided.
Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, Leiph
Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.
Gomes, Mafalda; Matias, Alexandra; Macedo, Filipe
2015-12-01
Every day, medical practitioners face the dilemma of exposing pregnant or possibly pregnant patients to radiation from diagnostic examinations. Both doctors and patients often have questions about the risks of radiation. The most vulnerable period is between the 8th and 15th weeks of gestation. Deterministic effects like pregnancy loss, congenital malformations, growth retardation and neurobehavioral abnormalities have threshold doses above 100-200 mGy. The risk is considered negligible at 50 mGy and in reality no diagnostic examination exceeds this limit. The risk of carcinogenesis is slightly higher than in the general population. Intravenous iodinated contrast is discouraged, except in highly selected patients. Considering all the possible noxious effects of radiation exposure, measures to diminish radiation are essential and affect the fetal outcome. Nonionizing procedures should be considered whenever possible and every radiology center should have its own data analysis on fetal radiation exposure. In this review, we analyze existing literature on fetal risks due to radiation exposure, producing a clinical protocol to guide safe radiation use in a clinical setting.
The relationship between stochastic and deterministic quasi-steady state approximations.
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R
2015-11-23
The quasi-steady-state approximation (QSSA) is frequently used to reduce deterministic models of biochemical networks. The resulting equations provide a simplified description of the network in terms of non-elementary reaction functions (e.g. Hill functions). Such deterministic reductions are frequently a basis for heuristic stochastic models in which non-elementary reaction functions are used to define reaction propensities. Despite their popularity, it remains unclear when such stochastic reductions are valid. It is frequently assumed that the stochastic reduction can be trusted whenever its deterministic counterpart is accurate. However, a number of recent examples show that this is not necessarily the case. Here we explain the origin of these discrepancies, and demonstrate a clear relationship between the accuracy of the deterministic and the stochastic QSSA for examples widely used in biological systems. With an analysis of a two-state promoter model, and numerical simulations for a variety of other models, we find that the stochastic QSSA is accurate whenever its deterministic counterpart provides an accurate approximation over a range of initial conditions which cover the likely fluctuations from the quasi-steady state (QSS). We conjecture that this relationship provides a simple and computationally inexpensive way to test the accuracy of reduced stochastic models using deterministic simulations. The stochastic QSSA is one of the most popular multi-scale stochastic simulation methods. While the use of the QSSA and the resulting non-elementary functions has been justified in the deterministic case, it is not clear when their stochastic counterparts are accurate. In this study, we show how the accuracy of the stochastic QSSA can be tested using its deterministic counterpart, providing a concrete method to test when non-elementary rate functions can be used in stochastic simulations.
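A deterministic sketch of the proposed test, using a Michaelis-Menten reduction instead of the paper's two-state promoter model: check the reduced (QSSA) model against the full mass-action model over a range of initial conditions. All rate constants are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative Michaelis-Menten example: full mass-action model vs. its
# QSSA reduction. Parameter values are invented, not from the paper.
k1, km1, k2 = 10.0, 1.0, 1.0    # binding, unbinding, catalysis rates
E0 = 1.0                         # total enzyme

def full(t, y):
    S, C = y                     # substrate, enzyme-substrate complex
    return [-k1 * S * (E0 - C) + km1 * C,
            k1 * S * (E0 - C) - (km1 + k2) * C]

def reduced(t, y):
    S = y[0]
    Km = (km1 + k2) / k1         # Michaelis constant
    return [-k2 * E0 * S / (Km + S)]   # non-elementary (Michaelis-Menten) rate

# Heuristic check in the spirit of the paper: the reduction should be
# accurate over a *range* of initial conditions, not just one.
for S0 in [0.5, 2.0, 10.0]:
    tf = 2.0
    yf = solve_ivp(full, (0, tf), [S0, 0.0], rtol=1e-8).y[0, -1]
    yr = solve_ivp(reduced, (0, tf), [S0], rtol=1e-8).y[0, -1]
    print(f"S0={S0:5.1f}  full={yf:.4f}  QSSA={yr:.4f}  |diff|={abs(yf - yr):.1e}")
```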
Deterministic Walks with Choice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beeler, Katy E.; Berenhaut, Kenneth S.; Cooper, Joshua N.
2014-01-10
This paper studies deterministic movement over toroidal grids, integrating local information, bounded memory and choice at individual nodes. The research is motivated by recent work on deterministic random walks, and applications in multi-agent systems. Several results regarding passing tokens through toroidal grids are discussed, as well as some open questions.
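A minimal sketch of one canonical deterministic random walk, the rotor-router walk, here on an n x n torus; the grid size, step count and rotor order are arbitrary choices, not taken from the paper.

```python
import numpy as np

# Rotor-router walk on an n x n torus: each node cycles deterministically
# through its four neighbours (N, E, S, W) instead of choosing one at
# random, the classic "deterministic random walk" the abstract alludes to.
n, steps = 5, 200
DIRS = [(-1, 0), (0, 1), (1, 0), (0, -1)]    # N, E, S, W
rotor = np.zeros((n, n), dtype=int)          # each node's current rotor

x, y = 0, 0
visits = np.zeros((n, n), dtype=int)
for _ in range(steps):
    visits[x, y] += 1
    d = rotor[x, y]
    rotor[x, y] = (d + 1) % 4                # advance the rotor
    dx, dy = DIRS[d]
    x, y = (x + dx) % n, (y + dy) % n        # wrap around the torus

print(visits)   # visit counts remain close to uniform, a rotor-router hallmark
```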
Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates
Melechko, Anatoli V. [Oak Ridge, TN]; McKnight, Timothy E.; Guillorn, Michael A.; Ilic, Bojan [Ithaca, NY]; Merkulov, Vladimir I. [Knoxville, TN]; Doktycz, Mitchel J. [Knoxville, TN]; Lowndes, Douglas H. [Knoxville, TN]; Simpson, Michael L. [Knoxville, TN]
2011-05-17
Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. A method includes depositing a catalyst particle on a surface of a substrate to define a deterministically located position; growing an aligned elongated nanostructure on the substrate, an end of the aligned elongated nanostructure coupled to the substrate at the deterministically located position; coating the aligned elongated nanostructure with a conduit material; removing a portion of the conduit material to expose the catalyst particle; removing the catalyst particle; and removing the elongated nanostructure to define a nanoconduit.
NASA Astrophysics Data System (ADS)
Itoh, Kosuke; Nakada, Tsutomu
2013-04-01
Deterministic nonlinear dynamical processes are ubiquitous in nature. Chaotic sounds generated by such processes may appear irregular and random in waveform, but these sounds are mathematically distinguished from random stochastic sounds in that they contain deterministic short-time predictability in their temporal fine structures. We show that the human brain distinguishes deterministic chaotic sounds from spectrally matched stochastic sounds in neural processing and perception. Deterministic chaotic sounds, even without being attended to, elicited greater cerebral cortical responses than the surrogate control sounds after about 150 ms in latency after sound onset. Listeners also clearly discriminated these sounds in perception. The results support the hypothesis that the human auditory system is sensitive to the subtle short-time predictability embedded in the temporal fine structure of sounds.
A deterministic particle method for one-dimensional reaction-diffusion equations
NASA Technical Reports Server (NTRS)
Mascagni, Michael
1995-01-01
We derive a deterministic particle method for the solution of nonlinear reaction-diffusion equations in one spatial dimension. This deterministic method is an analog of a Monte Carlo method for the solution of these problems that has been previously investigated by the author. The deterministic method leads to the consideration of a system of ordinary differential equations for the positions of suitably defined particles. We then consider the time explicit and implicit methods for this system of ordinary differential equations and we study a Picard and Newton iteration for the solution of the implicit system. Next we solve numerically this system and study the discretization error both analytically and numerically. Numerical computation shows that this deterministic method is automatically adaptive to large gradients in the solution.
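A generic sketch of the implicit time step and Newton iteration mentioned above, for an arbitrary stand-in right-hand side rather than the particle-position ODEs actually derived in the paper.

```python
import numpy as np

# Backward Euler step solved by Newton iteration, as in the implicit
# treatment the abstract describes. f is an invented nonlinear RHS; the
# real method derives f from the reaction-diffusion equation.
def f(u):
    return -u + u**2

def jac(u):
    return np.diag(-1.0 + 2.0 * u)

def backward_euler_step(u, dt, tol=1e-12, max_iter=50):
    v = u.copy()                  # initial guess: previous state
    for _ in range(max_iter):
        g = v - u - dt * f(v)     # residual of the implicit equation
        J = np.eye(len(u)) - dt * jac(v)
        dv = np.linalg.solve(J, -g)
        v += dv
        if np.linalg.norm(dv) < tol:
            break
    return v

u = np.array([0.1, 0.2, 0.3])
for _ in range(10):
    u = backward_euler_step(u, dt=0.1)
print(u)
```

A Picard iteration would simply replace the Newton update with the fixed-point map v = u + dt * f(v), trading the Jacobian solve for slower convergence.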
Hybrid Monte Carlo/Deterministic Methods for Accelerating Active Interrogation Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peplow, Douglas E.; Miller, Thomas Martin; Patton, Bruce W
2013-01-01
The potential for smuggling special nuclear material (SNM) into the United States is a major concern to homeland security, so federal agencies are investigating a variety of preventive measures, including detection and interdiction of SNM during transport. One approach for SNM detection, called active interrogation, uses a radiation source, such as a beam of neutrons or photons, to scan cargo containers and detect the products of induced fissions. In realistic cargo transport scenarios, the process of inducing and detecting fissions in SNM is difficult due to the presence of various and potentially thick materials between the radiation source and the SNM, and the practical limitations on radiation source strength and detection capabilities. Therefore, computer simulations are being used, along with experimental measurements, in efforts to design effective active interrogation detection systems. The computer simulations mostly consist of simulating radiation transport from the source to the detector region(s). Although the Monte Carlo method is predominantly used for these simulations, difficulties persist related to calculating statistically meaningful detector responses in practical computing times, thereby limiting their usefulness for design and evaluation of practical active interrogation systems. In previous work, the benefits of hybrid methods that use the results of approximate deterministic transport calculations to accelerate high-fidelity Monte Carlo simulations have been demonstrated for source-detector type problems. In this work, the hybrid methods are applied and evaluated for three example active interrogation problems. Additionally, a new approach is presented that uses multiple goal-based importance functions depending on a particle's relevance to the ultimate goal of the simulation. Results from the examples demonstrate that the application of hybrid methods to active interrogation problems dramatically increases their calculational efficiency.
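A toy sketch of the hybrid idea above, not the paper's actual methods: a deterministic (here, analytic) importance estimate suggests a biased sampling density for a deep-penetration problem, and likelihood-ratio weights keep the Monte Carlo estimate unbiased. The slab-transmission problem and all numbers are invented.

```python
import numpy as np

# Estimate the transmission P(X > L) for X ~ Exp(mu), a stand-in for a
# deep-penetration transport problem. mu, L and the biased rate mu_b
# (which a CADIS-style adjoint estimate would supply) are illustrative.
rng = np.random.default_rng(1)
mu, L, N = 1.0, 10.0, 100_000     # true answer: exp(-10) ~ 4.5e-5

# Analog Monte Carlo: almost no histories reach the "detector" at x > L.
analog = (rng.exponential(1 / mu, N) > L).mean()

# Importance-sampled MC: stretch the path-length distribution toward the
# detector and carry likelihood-ratio weights to undo the bias.
mu_b = 0.1
x = rng.exponential(1 / mu_b, N)
w = (mu / mu_b) * np.exp(-(mu - mu_b) * x)
biased = np.mean(w * (x > L))

print(f"analytic {np.exp(-mu * L):.2e}  analog {analog:.2e}  biased {biased:.2e}")
```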
Coupled Effects of non-Newtonian Rheology and Aperture Variability on Flow in a Single Fracture
NASA Astrophysics Data System (ADS)
Di Federico, V.; Felisa, G.; Lauriola, I.; Longo, S.
2017-12-01
Modeling of non-Newtonian flow in fractured media is essential in hydraulic fracturing and drilling operations, EOR, environmental remediation, and to understand magma intrusions. An important step in the modeling effort is a detailed understanding of flow in a single fracture, as the fracture aperture is spatially variable. A large bibliography exists on Newtonian and non-Newtonian flow in variable aperture fractures. Ultimately, stochastic or deterministic modeling leads to the flowrate under a given pressure gradient as a function of the parameters describing the aperture variability and the fluid rheology. Typically, analytical or numerical studies are performed adopting a power-law (Ostwald-de Waele) model. Yet the power-law model, routinely used e.g. for hydro-fracturing modeling, does not characterize real fluids at low and high shear rates. A more appropriate rheological model is provided by e.g. the four-parameter Carreau constitutive equation, which is in turn approximated by the more tractable truncated power-law model. Moreover, fluids of interest may exhibit yield stress, which requires the Bingham or Herschel-Bulkley model. This study employs different rheological models in the context of flow in variable aperture fractures, with the aim of understanding the coupled effect of rheology and aperture spatial variability with a simplified model. The aperture variation, modeled within a stochastic or deterministic framework, is taken to be one-dimensional and i) perpendicular; ii) parallel to the flow direction; for stochastic modeling, the influence of different distribution functions is examined. Results for the different rheological models are compared with those obtained for the pure power-law. The adoption of the latter model leads to overestimation of the flowrate, more so for large aperture variability. The presence of yield stress also induces significant changes in the resulting flowrate for an assigned external pressure gradient.
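A sketch of the scaling argument underlying such models, under strong simplifying assumptions: for an Ostwald-de Waele fluid with index n, the local flux scales as a^(2+1/n) G^(1/n), so aperture variability perpendicular to flow averages fluxes (strips in parallel) while variability along the flow averages resistances (strips in series). Prefactors are set to one and the lognormal aperture statistics are invented.

```python
import numpy as np

# Power-law flux scaling q ~ a**(2 + 1/n) * G**(1/n) with unit prefactors.
# Aperture statistics and the flow-behaviour index n are assumptions.
rng = np.random.default_rng(1)
n_fluid = 0.6                        # shear-thinning index (n < 1)
G = 1.0                              # imposed mean pressure gradient
a = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)   # apertures

# Variation PERPENDICULAR to flow: parallel strips, local fluxes add.
q_perp = np.mean(a ** (2 + 1 / n_fluid)) * G ** (1 / n_fluid)

# Variation PARALLEL to flow: series strips, same q everywhere, and the
# local gradient G_i ~ q**n / a**(2n+1) must average to G.
q_par = (G / np.mean(a ** (-(2 * n_fluid + 1)))) ** (1 / n_fluid)

print(f"perpendicular (parallel strips): q = {q_perp:.3f}")
print(f"parallel (series strips):        q = {q_par:.3f}")
```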
NASA Astrophysics Data System (ADS)
Ellis, G. F. R.
In the early part of this century, physicists, led notably by Albert Einstein and the pioneers of quantum theory, in particular Niels Bohr, Werner Heisenberg, and Paul Dirac, discovered that the underlying nature of physical reality is stranger than anyone had ever imagined. A series of brilliant insights led to the realisation, on the one hand, of the relative nature of space and time measurements, and hence of our basic concepts of space and time (ultimately leading to the discovery of nuclear energy), and on the other hand, of the quantum nature of matter, with its associated quantum statistics and uncertainty of prediction (leading to transistors and lasers). Combining these views ultimately led to a realisation of the necessity of the existence of anti-matter, and of the dynamic nature of the vacuum. Further developments led to an understanding of the existence of symmetries characterising the various families of elementary particles, and of the unified nature of the fundamental interactions when described as gauge theories with forces mediated by exchange of gauge bosons. These properties have all been confirmed by carefully controlled experiments.
Compressive behavior of laminated neoprene bridge bearing pads under thermal aging condition
NASA Astrophysics Data System (ADS)
Jun, Xie; Zhang, Yannian; Shan, Chunhong
2017-10-01
The present study was conducted to obtain a better understanding of how the mechanical properties of laminated neoprene bridge bearing pads vary under thermal aging, using compression tests. A total of 5 specimens were processed in a high-temperature chamber and then tested under axial load. The main parameter considered was the duration of thermal aging. The compression tests show that thermally aged specimens are more prone to brittle failure than the standard specimen; exposure of the steel plates, cracking and other failure phenomena are also more severe. The compressive capacity, ultimate compressive strength and compressive elastic modulus of the laminated neoprene bridge bearing pads decreased dramatically with increasing aging time. The attenuation trends of ultimate compressive strength and compressive elastic modulus under thermal aging follow power functions. The attenuation models were obtained by regressing the experimental data with the least-squares method. The models fit the observations well, which shows that they are applicable and promising for assessing the performance of laminated neoprene bridge bearing pads under thermal aging conditions.
The fractal-multifractal method and temporal resolution: Application to precipitation and streamflow
NASA Astrophysics Data System (ADS)
Maskey, M.; Puente, C. E.; Sivakumar, B.
2017-12-01
In the past, we have established that the deterministic fractal-multifractal (FM) method is a promising geometric tool to analyze hydro-climatic variables, such as precipitation, river flow, and temperature. In this study, we address the issue of temporal resolution to advance the suitability and usefulness of the FM approach in hydro-climate. Specifically, we elucidate the evolution of FM geometric parameters as computed at different time scales ranging from a day to a month (30-day) in increments of a day. For this purpose, both rainfall and river discharge records at Sacramento, California, gathered over a year are encoded at different time scales. The analysis reveals that: (a) the FM approach yields faithful encodings of both kinds of data sets at the resolutions considered, with reasonably small errors; and (b) the "best" FM parameters ultimately converge when the resolution is increased, thus allowing visualization of both hydrologic attributes. By addressing the scalability of the geometric patterns, these results further advance the suitability of the FM approach.
Study of dynamics of X-14B VTOL aircraft
NASA Technical Reports Server (NTRS)
Loscutoff, W. V.; Mitchiner, J. L.; Roesener, R. A.; Seevers, J. A.
1973-01-01
Research was initiated to investigate certain facets of modern control theory and their integration with a digital computer to provide a tractable flight control system for a VTOL aircraft. Since the hover mode is the most demanding phase in the operation of a VTOL aircraft, the research efforts were concentrated in this mode of aircraft operation. Research work on three different aspects of the operation of the X-14B VTOL aircraft is discussed. A general theory for optimal, prespecified, closed-loop control is developed. The ultimate goal was optimal decoupling of the modes of the VTOL aircraft to simplify the pilot's task of handling the aircraft. Modern control theory is used to design deterministic state estimators which provide state variables not measured directly, but which are needed for state variable feedback control. The effect of atmospheric turbulence on the X-14B is investigated. A maximum magnitude gust envelope within which the aircraft could operate stably with the available control power is determined.
NASA Astrophysics Data System (ADS)
Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter
2016-10-01
Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of a prediction depends on the suitability of the forecasting methods used, experience, the quality of input data, the time period and other factors. The input data are usually not deterministic; they are often of a random nature, affected by uncertainties of the market environment and many other factors. By taking input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed and the impact of uncertain input parameters on the target forecast value is simulated in a case study model. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variables impact analysis.
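A minimal sketch of this kind of Monte Carlo forecasting with uncertain inputs, including a crude variables-impact measure; the distributions below are placeholders, not the case-study data.

```python
import numpy as np

# Monte Carlo forecast with uncertain inputs. Demand, price and cost
# distributions are invented for illustration.
rng = np.random.default_rng(0)
N = 100_000
demand = rng.triangular(800, 1000, 1300, N)   # units sold
price = rng.normal(25.0, 2.0, N)              # unit price
cost = rng.normal(18.0, 1.5, N)               # unit cost

profit = demand * (price - cost)              # forecasted target value

print(f"mean forecast: {profit.mean():,.0f}")
print(f"90% interval:  {np.percentile(profit, 5):,.0f} .. "
      f"{np.percentile(profit, 95):,.0f}")

# Crude variables-impact analysis: rank correlation of each input with
# the output (a simple stand-in for a full sensitivity analysis).
for name, x in [("demand", demand), ("price", price), ("cost", cost)]:
    r = np.corrcoef(x.argsort().argsort(), profit.argsort().argsort())[0, 1]
    print(f"rank corr with profit: {name:6s} {r:+.2f}")
```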
Deterministic and Stochastic Analysis of a Prey-Dependent Predator-Prey System
ERIC Educational Resources Information Center
Maiti, Alakes; Samanta, G. P.
2005-01-01
This paper reports on studies of the deterministic and stochastic behaviours of a predator-prey system with prey-dependent response function. The first part of the paper deals with the deterministic analysis of uniform boundedness, permanence, stability and bifurcation. In the second part the reproductive and mortality factors of the prey and…
ShinyGPAS: interactive genomic prediction accuracy simulator based on deterministic formulas.
Morota, Gota
2017-12-20
Deterministic formulas for the accuracy of genomic predictions highlight the relationships among prediction accuracy and potential factors influencing prediction accuracy prior to performing computationally intensive cross-validation. Visualizing such deterministic formulas in an interactive manner may lead to a better understanding of how genetic factors control prediction accuracy. The software to simulate deterministic formulas for genomic prediction accuracy was implemented in R and encapsulated as a web-based Shiny application. Shiny genomic prediction accuracy simulator (ShinyGPAS) simulates various deterministic formulas and delivers dynamic scatter plots of prediction accuracy versus genetic factors impacting prediction accuracy, while requiring only mouse navigation in a web browser. ShinyGPAS is available at: https://chikudaisei.shinyapps.io/shinygpas/ . ShinyGPAS is a shiny-based interactive genomic prediction accuracy simulator using deterministic formulas. It can be used for interactively exploring potential factors that influence prediction accuracy in genome-enabled prediction, simulating achievable prediction accuracy prior to genotyping individuals, or supporting in-class teaching. ShinyGPAS is open source software and it is hosted online as a freely available web-based resource with an intuitive graphical user interface.
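One widely used deterministic formula of the kind ShinyGPAS visualizes is the expression commonly attributed to Daetwyler et al., r = sqrt(N h^2 / (N h^2 + Me)); a minimal sketch evaluating it over training-set sizes, with illustrative parameter values.

```python
import numpy as np

# Expected genomic prediction accuracy as a function of training size N,
# heritability h2 and the number of independent chromosome segments Me
# (Daetwyler-style formula). Values below are illustrative.
def accuracy(N, h2, Me):
    return np.sqrt(N * h2 / (N * h2 + Me))

for N in [1_000, 5_000, 20_000]:
    print(f"N={N:6d}  r={accuracy(N, h2=0.5, Me=5_000):.3f}")
```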
NASA Astrophysics Data System (ADS)
Staver, John R.
2010-03-01
Science and religion exhibit multiple relationships as ways of knowing. These connections have been characterized as cousinly, mutually respectful, non-overlapping, competitive, proximate-ultimate, dominant-subordinate, and opposing-conflicting. Some of these ties create stress, and tension between science and religion represents a significant chapter in humans' cultural heritage before and since the Enlightenment. Truth, knowledge, and their relation are central to science and religion as ways of knowing, as social institutions, and to their interaction. In religion, truth is revealed through God's word. In science, truth is sought after via empirical methods. Discord can be viewed as a competition for social legitimization between two social institutions whose goals are explaining the world and how it works. Under this view, the root of the discord is truth as correspondence. In this concept of truth, knowledge corresponds to the facts of reality, and conflict is inevitable for many because humans want to ask which one—science or religion—gets the facts correct. But, the root paradox, also known as the problem of the criterion, suggests that seeking to know nature as it is represents a fruitless endeavor. The discord can be set on new ground and resolved by taking a moderately skeptical line of thought, one which employs truth as coherence and a moderate form of constructivist epistemology. Quantum mechanics and evolution as scientific theories and scientific research on human consciousness and vision provide support for this line of argument. Within a constructivist perspective, scientists would relinquish only the pursuit of knowing reality as it is. Scientists would retain everything else. Believers who hold that religion explains reality would come to understand that God never revealed His truth of nature; rather, He revealed His truth in how we are to conduct our lives.
Zou, Yi-Bo; Chen, Yi-Min; Gao, Ming-Ke; Liu, Quan; Jiang, Si-Yu; Lu, Jia-Hui; Huang, Chen; Li, Ze-Yu; Zhang, Dian-Hua
2017-08-01
Coronary heart disease preoperative diagnosis plays an important role in the treatment of vascular interventional surgery. In practice, most doctors diagnose the position of a vascular stenosis and then empirically estimate its severity from selective coronary angiography images, rather than using a mouse, keyboard and computer during preoperative diagnosis. This invasive diagnostic modality lacks intuitive and natural interaction, and the results are not accurate enough. To address these problems, a coronary heart disease preoperative gesture-interactive diagnostic system based on Augmented Reality is proposed. The system uses a Leap Motion Controller to capture hand gesture video sequences and extracts features, namely the position and orientation vectors of the gesture motion trajectory and changes in hand shape. The training plan is determined by the K-means algorithm, and the effect of gesture training is improved by using multiple features and multiple observation sequences. Gesture reusability is improved by establishing a state transition model, and algorithm efficiency is improved by gesture prejudgment, which applies threshold discrimination before recognition. The integrity of the trajectory is preserved, and the gesture motion space is extended, by employing a spatial rotation transformation of the gesture manipulation plane. Ultimately, gesture recognition based on SRT-HMM is realized. The diagnosis and measurement of vascular stenosis are intuitively and naturally realized by operating and measuring the coronary artery model with augmented reality and gesture interaction techniques. The gesture recognition experiments show the discriminative and generalization abilities of the algorithm, and the gesture interaction experiments demonstrate the usability and reliability of the system.
NASA Astrophysics Data System (ADS)
Kersten, T. P.; Büyüksalih, G.; Tschirschwitz, F.; Kan, T.; Deggim, S.; Kaya, Y.; Baskaraca, A. P.
2017-05-01
Recent advances in contemporary Virtual Reality (VR) technologies are going to have a significant impact on everyday life. Through VR it is possible to virtually explore a computer-generated environment as a different reality, and to immerse oneself into the past or in a virtual museum without leaving the current real-life situation. For the ultimate VR experience, the user should only see the virtual world. Currently, the user must wear a VR headset which fits around the head and over the eyes to visually separate themselves from the physical world. Via the headset, images are fed to the eyes through two small lenses. Cultural heritage monuments are ideally suited both for thorough multi-dimensional geometric documentation and for realistic interactive visualisation in immersive VR applications. Additionally, the game industry offers tools for interactive visualisation of objects to motivate users to virtually visit objects and places. In this paper the generation of a virtual 3D model of the Selimiye mosque in the city of Edirne, Turkey and its processing for data integration into the game engine Unity is presented. The project has been carried out as a co-operation between BİMTAŞ, a company of the Greater Municipality of Istanbul, Turkey and the Photogrammetry & Laser Scanning Lab of the HafenCity University Hamburg, Germany to demonstrate an immersive and interactive visualisation using the new VR system HTC Vive. The workflow from data acquisition to VR visualisation, including the necessary programming for navigation, is described. Furthermore, the possible use (including simultaneous multi-user environments) of such a VR visualisation for a CH monument is discussed in this contribution.
Virtual Reality As a Training Tool to Treat Physical Inactivity in Children.
Kiefer, Adam W; Pincus, David; Richardson, Michael J; Myer, Gregory D
2017-01-01
Lack of adequate physical activity in children is an epidemic that can result in obesity and other poor health outcomes across the lifespan. Physical activity interventions focused on motor skill competence continue to be developed, but some interventions, such as neuromuscular training (NMT), may be limited in how early they can be implemented due to dependence on the child's level of cognitive and perceptual-motor development. Early implementation of motor-rich activities that support motor skill development in children is critical for the development of healthy levels of physical activity that carry through into adulthood. Virtual reality (VR) training may be beneficial in this regard. VR training, when grounded in an information-based theory of perceptual-motor behavior that modifies the visual information in the virtual world, can promote early development of motor skills in youth akin to more natural, real-world development as opposed to strictly formalized training. This approach can be tailored to the individual child and training scenarios can increase in complexity as the child develops. Ultimately, training in VR may help serve as a precursor to "real-world" NMT, and once the child reaches the appropriate training age can also augment more complex NMT regimens performed outside of the virtual environment.
Overcoming O: Dewey and the Problem of Bion's Metaphysics.
Soffer-Dudek, Nir
2015-10-01
Bion guides us to eschew memory, desire, and understanding in order to become one with O-the ultimate reality of the analytic moment. However, his directions are valid only to the extent that such a meta-reality actually exists. Otherwise there is nothing to unite with and no reason to shun memory or desire. The present work inquires whether we may provide Bion's technique with a less speculative philosophy, specifically Dewey's pragmatist theory of aesthetics. It begins with reviewing the similarities between the two writers' methods, highlighting their shared emphasis on openness to the unknown. Yet listening to their intonations reveals that they actually convey opposite ideas as to what this "unknown" may be. Whereas Dewey sanguinely portrays the possibilities of the "yet-unknown," Bion emphasizes the dread of our inescapable encounter with the unknowable. This dread is embodied in his concept of O. Thus, rather than being merely a metaphysical speculation, O communicates Bion's conviction that fear forms the core of our existence. Banishing O from the counseling room may indeed aid his method in becoming accessible to a wider audience; at the same time, however, doing so might also deprive it of the very context that gives it meaning. © 2015 by the American Psychoanalytic Association.
Keller, Benjamin A; Salcedo, Edgardo S; Williams, Timothy K; Neff, Lucas P; Carden, Anthony J; Li, Yiran; Gotlib, Oren; Tran, Nam K; Galante, Joseph M
2016-09-01
Resuscitative endovascular balloon occlusion of the aorta (REBOA) is an adjunct technique for salvaging patients with noncompressible torso hemorrhage. Current REBOA training paradigms require large animals, virtual reality simulators, or human cadavers for acquisition of skills. These training strategies are expensive and resource intensive, which may prevent widespread dissemination of REBOA. We have developed a low-cost, near-physiologic, pulsatile REBOA simulator by connecting an anatomic vascular circuit constructed out of latex and polyvinyl chloride tubing to a commercially available pump. This pulsatile simulator is capable of generating cardiac outputs ranging from 1.7 to 6.8 L/min with corresponding arterial blood pressures of 54 to 226/14 to 121 mmHg. The simulator accommodates a 12 French introducer sheath and a CODA balloon catheter. Upon balloon inflation, the arterial waveform distal to the occlusion flattens, distal pulsation within the simulator is lost, and systolic blood pressures proximal to the balloon catheter increase by up to 62 mmHg. Further development and validation of this simulator will allow for refinement, reduction, and replacement of large animal models, costly virtual reality simulators, and perfused cadavers for training purposes. This will ultimately facilitate the low-cost, high-fidelity REBOA simulation needed for the widespread dissemination of this life-saving technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Bush, K; Han, B
Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based “localized Monte Carlo” (LMC) method that isolates MC dose calculations only to volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence, and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%–15% could be observed in heterogeneous phantoms. The saving in computational time (a factor of ∼4–7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region. Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
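A 1-D toy of the localized-MC routing logic described above (not the authors' Geant4 implementation): voxels flagged as heterogeneous are handled by Monte Carlo sampling, homogeneous voxels by an exact deterministic attenuation factor. Cross-sections, geometry and the flagging rule are all invented for illustration.

```python
import numpy as np

# Transport a pencil beam through a row of voxels: deterministic
# transmission in homogeneous voxels, per-history survival sampling in
# voxels flagged as heterogeneous. All values are illustrative.
rng = np.random.default_rng(0)
mu = np.array([0.02, 0.02, 0.08, 0.02, 0.02])     # attenuation (1/mm)
hetero = np.array([False, False, True, False, False])
dx = 10.0                                          # voxel width (mm)

def transmitted(n_hist=100_000):
    t = np.ones(n_hist)
    for m, h in zip(mu, hetero):
        if h:   # Monte Carlo segment: sample survival per history
            t *= rng.random(n_hist) < np.exp(-m * dx)
        else:   # deterministic segment: exact transmission factor
            t *= np.exp(-m * dx)
    return t.mean()

exact = np.exp(-np.sum(mu * dx))
print(f"hybrid estimate: {transmitted():.4f}   analytic: {exact:.4f}")
```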
The past, present and future of cyber-physical systems: a focus on models.
Lee, Edward A
2015-02-26
This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.
Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.
2016-04-01
Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Earthquake hazard maps must demonstrate their capability to anticipate ground shaking from future strong earthquakes before they can be appropriately used for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e., Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e., complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition, the flexibility of NDSHA allows for the generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which warrants the robustness of the analysis. By analysing in some detail the seismicity of the Vrancea region, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g., the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.
Stability analysis of multi-group deterministic and stochastic epidemic models with vaccination rate
NASA Astrophysics Data System (ADS)
Wang, Zhi-Gang; Gao, Rui-Mei; Fan, Xiao-Ming; Han, Qi-Xing
2014-09-01
We discuss in this paper a deterministic multi-group MSIR epidemic model with a vaccination rate. The basic reproduction number ℛ0, a key parameter in epidemiology, is a threshold that determines the persistence or extinction of the disease. Using Lyapunov function techniques, we show that if ℛ0 is greater than 1 and the deterministic model obeys some conditions, then the disease prevails: the infective persists and the endemic state is asymptotically stable in a feasible region. If ℛ0 is less than or equal to 1, then the infective disappears and the disease dies out. In addition, stochastic noise around the endemic equilibrium is added to the deterministic MSIR model, extending it to a system of stochastic ordinary differential equations. In the stochastic version, we carry out a detailed analysis of the asymptotic behavior of the stochastic model. Regarding the value of ℛ0, when the stochastic system obeys some conditions and ℛ0 is greater than 1, we deduce that the stochastic system is stochastically asymptotically stable. Finally, the deterministic and stochastic model dynamics are illustrated through computer simulations.
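A single-group sketch of the deterministic/stochastic contrast described above (the paper's model is multi-group MSIR; this SIR-with-vaccination reduction and all rates are illustrative), with the stochastic version integrated by Euler-Maruyama.

```python
import numpy as np

# SIR with vaccination at birth: deterministic ODE vs. a noisy version.
beta, gamma, mu, p = 0.4, 0.1, 0.01, 0.3   # contact, recovery, turnover, vaccination
sigma = 0.02                                # noise intensity (assumption)
R0 = beta * (1 - p) / (gamma + mu)
print(f"R0 = {R0:.2f}")

def drift(S, I):
    dS = mu * (1 - p) - beta * S * I - mu * S
    dI = beta * S * I - (gamma + mu) * I
    return dS, dI

rng = np.random.default_rng(3)
dt, T = 0.01, 200.0
S, I = 0.99, 0.01       # stochastic path
Sd, Id = S, I           # deterministic path
for _ in range(int(T / dt)):
    dS, dI = drift(Sd, Id)
    Sd, Id = Sd + dS * dt, Id + dI * dt
    dS, dI = drift(S, I)
    w = rng.normal(0.0, np.sqrt(dt), 2)     # Euler-Maruyama increments
    S = S + dS * dt + sigma * S * w[0]
    I = max(I + dI * dt + sigma * I * w[1], 0.0)

print(f"deterministic endemic I: {Id:.4f}   stochastic sample I: {I:.4f}")
```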
Deterministic and stochastic CTMC models from Zika disease transmission
NASA Astrophysics Data System (ADS)
Zevika, Mona; Soewono, Edy
2018-03-01
Zika infection is one of the most important mosquito-borne diseases in the world. Zika virus (ZIKV) is transmitted by many Aedes-type mosquitoes including Aedes aegypti. Pregnant women with the Zika virus are at risk of having a fetus or infant with a congenital defect and suffering from microcephaly. Here, we formulate a Zika disease transmission model using two approaches, a deterministic model and a continuous-time Markov chain stochastic model. The basic reproduction ratio is constructed from a deterministic model. Meanwhile, the CTMC stochastic model yields an estimate of the probability of extinction and outbreaks of Zika disease. Dynamical simulations and analysis of the disease transmission are shown for the deterministic and stochastic models.
Distinguishing between stochasticity and determinism: Examples from cell cycle duration variability.
Pearl Mizrahi, Sivan; Sandler, Oded; Lande-Diner, Laura; Balaban, Nathalie Q; Simon, Itamar
2016-01-01
We describe a recent approach for distinguishing between stochastic and deterministic sources of variability, focusing on the mammalian cell cycle. Variability between cells is often attributed to stochastic noise, although it may be generated by deterministic components. Interestingly, lineage information can be used to distinguish between variability and determinism. Analysis of correlations within a lineage of the mammalian cell cycle duration revealed its deterministic nature. Here, we discuss the sources of such variability and the possibility that the underlying deterministic process is due to the circadian clock. Finally, we discuss the "kicked cell cycle" model and its implication on the study of the cell cycle in healthy and cancerous tissues. © 2015 WILEY Periodicals, Inc.
From what might have been to what must have been: counterfactual thinking creates meaning.
Kray, Laura J; George, Linda G; Liljenquist, Katie A; Galinsky, Adam D; Tetlock, Philip E; Roese, Neal J
2010-01-01
Four experiments explored whether 2 uniquely human characteristics-counterfactual thinking (imagining alternatives to the past) and the fundamental drive to create meaning in life-are causally related. Rather than implying a random quality to life, the authors hypothesized and found that counterfactual thinking heightens the meaningfulness of key life experiences. Reflecting on alternative pathways to pivotal turning points even produced greater meaning than directly reflecting on the meaning of the event itself. Fate perceptions ("it was meant to be") and benefit-finding (recognition of positive consequences) were identified as independent causal links between counterfactual thinking and the construction of meaning. Through counterfactual reflection, the upsides to reality are identified, a belief in fate emerges, and ultimately more meaning is derived from important life events.
Gene Editing: A View Through the Prism of Inherited Metabolic Disorders.
Davison, James
2018-04-01
Novel technological developments mean that gene editing - making deliberately targeted alterations in specific genes - is now a clinical reality. The inherited metabolic disorders, a group of clinically significant, monogenic disorders, provide a useful paradigm to explore some of the many ethical issues that arise from this technological capability. Fundamental questions about the significance of the genome, and of manipulating it by selection or editing, are reviewed, and a particular focus on the legislative process that has permitted the development of mitochondrial donation techniques is considered. Ultimately, decisions about what we should do with gene editing must be determined by reference to other non-genomic texts that determine what it is to be human - rather than simply to undertake gene editing because it can be done.
Interstellar Flight, Imagination and Myth Creation as an Effective Means for Enduring Inspiration
NASA Astrophysics Data System (ADS)
Padowitz, G. H.
Interstellar travel to faraway star systems is humanity's most crucial mission, but we habitually focus on technological and funding challenges instead of deeply exploring the rare essence of creativity that ultimately enables us to solve all such problems. Certainly, if interstellar space flight is to succeed, inspiring and maintaining global and multigenerational support is primary to long-term development. To attract and sustain such extraordinary support, the creative power of the imagination must be harnessed through independent artists. By first attracting and encouraging visionaries, it is possible that we can awaken in the public a new, invigorating sense of adventure with lasting power. Going beyond our solar system to a nearby star is in reality a mythic quest and should be treated as such.
NASA Astrophysics Data System (ADS)
Trell, Erik; Edeagu, Samuel; Animalu, Alexander
2017-01-01
From a brief recapitulation of the foundational works of Marius Sophus Lie and Hermann Günther Grassmann, including missing African links, a rhapsodic survey is made of the straight line of extension and existence that runs as the very fibre of generation and creation throughout all of Nature's utterances, which must therefore ultimately be the web of Reality itself, of which the Arts and Sciences are interpreters on equal explorer terms. Assuming their direct approach, the straight line and its archaic, algebraic and artistic bearings and convolutions have been followed towards their inner reaches, which earlier resulted in a retrieval of the baryon and meson elementary particles, and now, equally straightforwardly, the electron geodesics and the organic build of the periodic system of the elements.
Gender bias torture in place of work.
Pathak, P R
1999-11-01
Gender-biased torture, especially sexual harassment of women in the workplace, is now a hard reality: the ultimate form of control that repressed men, especially those in positions of authority, can exercise today. They are generally allowed to get away with it, mainly because women are fearful and largely unorganized, managements are complacent, and the law takes far too long to work, if it ever does. The global picture is horrifying: children, women, and even some men are sexually abused. The fear of job loss, hostility at work and social stigma still prevent women from complaining about sexual harassment. The Supreme Court has recognized sexual harassment as a human rights violation and has drawn up legally binding guidelines directing employers to implement preventive and remedial measures in the workplace.
Discrete stochastic simulation methods for chemically reacting systems.
Cao, Yang; Samuels, David C
2009-01-01
Discrete stochastic chemical kinetics describe the time evolution of a chemically reacting system by taking into account the fact that, in reality, chemical species are present with integer populations and exhibit some degree of randomness in their dynamical behavior. In recent years, with the development of new techniques for studying biochemical dynamics in single cells, a growing number of studies have applied this approach to chemical kinetics in cellular systems, where the small copy number of some reactant species in the cell may lead to deviations from the predictions of the deterministic differential equations of classical chemical kinetics. This chapter reviews the fundamental theory related to stochastic chemical kinetics and several simulation methods based on that theory. We focus on nonstiff biochemical systems and the two most important discrete stochastic simulation methods: Gillespie's stochastic simulation algorithm (SSA) and the tau-leaping method. Different implementation strategies of these two methods are discussed. Then we recommend a relatively simple and efficient strategy that combines the strengths of the two methods: the hybrid SSA/tau-leaping method. The implementation details of the hybrid strategy are given here and a related software package is introduced. Finally, the hybrid method is applied to simple biochemical systems as a demonstration of its application.
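A minimal sketch of the two methods named above, on the simplest birth-death system; the rates, horizon and leap size tau are illustrative assumptions, and the real hybrid method switches between the two adaptively.

```python
import numpy as np

# Gillespie SSA for a birth-death process: 0 -> X at rate k, X -> 0 at
# rate g*X. Deterministic steady state is k/g.
rng = np.random.default_rng(42)
k, g = 10.0, 0.1
x, t, T = 0, 0.0, 100.0
while t < T:
    a = np.array([k, g * x])           # reaction propensities
    a0 = a.sum()
    t += rng.exponential(1.0 / a0)     # time to next reaction
    if rng.random() < a[0] / a0:       # which reaction fired?
        x += 1
    else:
        x -= 1
print(f"SSA sample at t={T}: x={x} (deterministic steady state = {k / g:.0f})")

# Tau-leaping replaces the one-reaction-at-a-time loop with Poisson
# counts of each reaction over a fixed leap tau.
x_tau, tau = 0, 0.5
for _ in range(int(T / tau)):
    births = rng.poisson(k * tau)
    deaths = rng.poisson(g * x_tau * tau)
    x_tau = max(x_tau + births - deaths, 0)
print(f"tau-leaping sample at t={T}: x={x_tau}")
```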
Cancer: shift of the paradigm.
Lichtenstein, Anatoly V
2008-12-01
Cancer is usually considered to be a by-product of design limitations of a multicellular organism and its intrinsic fallibility. However, recent data prompt a revision of some established notions about carcinogenesis and form a new paradigm of carcinogenesis as a highly conserved biological phenomenon - a programmed death of an organism. This altruistic program, which is unleashed when mutagenesis surpasses a certain critical threshold, gives a population the important benefit acting as a guardian of the gene pool against the spread of certain mutant genes. A growing body of evidence supports this point of view: (i) epigenetic changes leading to cancer arise early, simultaneously in many cells and look like deterministic regulation; (ii) concept of cancer stem cell suggests a view of carcinogenesis not as vague transformation but as well known differentiation; (iii) tumor/host relations usually perceived as antagonistic are, in reality, synergistic; (iv) death of an individual from cancer is predetermined and results apparently from a specific activity (killer function) of cancer cell and (v) evolutionary conservation indicates that cancer comes with a general advantage that explains its evolutionary success. A holistic approach to carcinogenesis suggests new avenues of research and new therapeutic strategy.
Inferring extinction risks from sighting records.
Thompson, C J; Lee, T E; Stone, L; McCarthy, M A; Burgman, M A
2013-12-07
Estimating the probability that a species is extinct based on historical sighting records is important when deciding how much effort and money to invest in conservation policies. The framework we offer is more general than others in the literature to date. Our formulation allows for definite and uncertain observations, and thus better accommodates the realities of sighting record quality. Typically, the probability of observing a species given it is extant/extinct is challenging to define, especially when the possibility of a false observation is included. As such, we assume that observation probabilities derive from a representative probability density function. We incorporate this randomness in two different ways ("quenched" versus "annealed") using a framework that is equivalent to a Bayes formulation. The two methods can lead to significantly different estimates for extinction. In the case of definite sightings only, we provide an explicit deterministic calculation (in which observation probabilities are point estimates). Furthermore, our formulation replicates previous work in certain limiting cases. In the case of uncertain sightings, we allow for the possibility of several independent observational types (specimen, photographs, etc.). The method is applied to the Caribbean monk seal, Monachus tropicalis (which has only definite sightings), and synthetic data, with uncertain sightings. © 2013 Elsevier Ltd. All rights reserved.
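For the definite-sightings case the abstract mentions, one classic deterministic calculation in this literature (Solow's uniform-rate formula) gives the probability, if the species were extant throughout the window, that all n sightings would fall before the last observed one. A sketch with hypothetical sighting years loosely patterned on the monk seal record:

```python
# Under a stationary sighting process over (t0, T], the probability that
# all n sightings fall before the last one at t_n is (t_n / T') ** n,
# where T' is the window length. The sighting years are invented.
sightings = [1915, 1922, 1932, 1947, 1952]    # hypothetical sighting years
t0, T = 1900, 2013                             # observation window

n = len(sightings)
t_n = sightings[-1] - t0
p = (t_n / (T - t0)) ** n
print(f"P(record | extant) = {p:.4f}")         # small p: evidence of extinction
```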
Flexible engineering designs for urban water management in Lusaka, Zambia.
Tembo, Lucy; Pathirana, Assela; van der Steen, Peter; Zevenbergen, Chris
2015-01-01
Urban water systems are often designed using deterministic single values as design parameters. Subsequently the different design alternatives are compared using a discounted cash flow analysis that assumes that all parameters remain as-predicted for the entire project period. In reality the future is unknown and at best a possible range of values for design parameters can be estimated. A Monte Carlo simulation could then be used to calculate the expected Net Present Value of project alternatives, as well as so-called target curves (cumulative frequency distribution of possible Net Present Values). The same analysis could be done after flexibilities were incorporated in the design, either by using decision rules to decide about the moment of capacity increase, or by buying Real Options (in this case land) to cater for potential capacity increases in the future. This procedure was applied to a sanitation and wastewater treatment case in Lusaka, Zambia. It included various combinations of on-site anaerobic baffled reactors and off-site waste stabilisation ponds. For the case study, it was found that the expected net value of wastewater treatment systems can be increased by 35-60% by designing a small flexible system with Real Options, rather than a large inflexible system.
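A compact sketch of the evaluation logic described above: Monte Carlo NPV of a large inflexible design versus a small design with a decision rule for expansion, from which target curves can be read off. All cash flows, demand dynamics, capacities and the discount rate are placeholder assumptions, not the Lusaka case-study values.

```python
import numpy as np

rng = np.random.default_rng(7)
N, years, r = 10_000, 20, 0.08
disc = 1.0 / (1 + r) ** np.arange(years)       # discount factors

demand0 = rng.normal(100, 20, N).clip(min=10)  # uncertain initial demand
growth = rng.normal(0.03, 0.02, N)             # uncertain growth rate
demand = demand0[:, None] * (1 + growth[:, None]) ** np.arange(years)

def npv(capacity, capex, expand_rule=False):
    cap = np.full(N, float(capacity))
    cash = np.zeros((N, years))
    cash[:, 0] -= capex
    for t in range(years):
        if expand_rule and t > 0:              # decision rule: expand when
            grow = demand[:, t] > 0.9 * cap    # demand approaches capacity
            cash[grow, t] -= 30.0              # expansion capex
            cap[grow] += 50.0
        cash[:, t] += 2.0 * np.minimum(demand[:, t], cap)   # net revenue
    return cash @ disc

npv_big = npv(capacity=200, capex=150)
npv_flex = npv(capacity=100, capex=80, expand_rule=True)
print(f"E[NPV] fixed-large: {npv_big.mean():7.1f}   flexible: {npv_flex.mean():7.1f}")
# Target curve: np.sort(npv_flex) plotted against np.linspace(0, 1, N).
```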
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan D.; ...
2015-03-17
Despite growing recognition that deterministic and stochastic factors simultaneously influence bacterial communities, little is known about mechanisms shifting their relative importance. To better understand underlying mechanisms, we developed a conceptual model linking ecosystem development during primary succession to shifts in the stochastic/deterministic balance. To evaluate the conceptual model we coupled spatiotemporal data on soil bacterial communities with environmental conditions spanning 105 years of salt marsh development. At the local scale there was a progression from stochasticity to determinism due to Na accumulation with increasing ecosystem age, supporting a main element of the conceptual model. At the regional scale, soil organic matter (SOM) governed the relative influence of stochasticity and the type of deterministic ecological selection, suggesting scale-dependency in how deterministic ecological selection is imposed. Analysis of a new ecological simulation model supported these conceptual inferences. Looking forward, we propose an extended conceptual model that integrates primary and secondary succession in microbial systems.
Guymon, Gary L.; Yen, Chung-Cheng
1990-01-01
The applicability of a deterministic-probabilistic model for predicting water tables in southern Owens Valley, California, is evaluated. The model is based on a two-layer deterministic model that is cascaded with a two-point probability model. To reduce the potentially large number of uncertain variables in the deterministic model, lumping of uncertain variables was evaluated by sensitivity analysis to reduce the total number of uncertain variables to three variables: hydraulic conductivity, storage coefficient or specific yield, and source-sink function. Results demonstrate that lumping of uncertain parameters reduces computational effort while providing sufficient precision for the case studied. Simulated spatial coefficients of variation for water table temporal position in most of the basin is small, which suggests that deterministic models can predict water tables in these areas with good precision. However, in several important areas where pumping occurs or the geology is complex, the simulated spatial coefficients of variation are over estimated by the two-point probability method.
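The two-point probability model cascaded with the deterministic model here is in the spirit of Rosenblueth-style point-estimate methods: run the deterministic model at the mean plus and minus one standard deviation of each retained uncertain input and combine the equally weighted results. A minimal sketch under that interpretation, with a toy stand-in for the water-table model (the function g, its parameters and all statistics are hypothetical):

```python
import numpy as np
from itertools import product

# Two-point (point-estimate) cascade over the three retained uncertain
# inputs: conductivity K, storage coefficient S, source-sink term Q.
def g(K, S, Q):                 # toy stand-in for the two-layer model
    return 100.0 - 5.0 * Q / (K * S + 0.1)

means = {"K": 2.0, "S": 0.15, "Q": 1.0}
sds = {"K": 0.5, "S": 0.03, "Q": 0.3}

runs = []
for signs in product([-1, 1], repeat=3):    # 2**3 deterministic runs
    K, S, Q = (means[v] + s * sds[v] for v, s in zip(("K", "S", "Q"), signs))
    runs.append(g(K, S, Q))

runs = np.array(runs)
mean, sd = runs.mean(), runs.std()
print(f"water table: mean={mean:.2f}  cv={sd / abs(mean):.3f}")
```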
NASA Technical Reports Server (NTRS)
Bollman, W. E.; Chadwick, C.
1982-01-01
A number of interplanetary missions now being planned involve placing deterministic maneuvers along the flight path to alter the trajectory. Lee and Boain (1973) examined the statistics of trajectory correction maneuver (TCM) magnitude with no deterministic ('bias') component. The Delta v vector magnitude statistics were generated for several values of the random Delta v standard deviation using expansions in terms of infinite hypergeometric series. The present investigation uses a different technique (Monte Carlo simulation) to generate Delta v magnitude statistics for a wider selection of random Delta v standard deviations and also extends the analysis to the case of a nonzero deterministic Delta v. These Delta v magnitude statistics are plotted parametrically. The plots are useful in assisting the analyst in quickly answering questions about the statistics of Delta v magnitude for single TCMs consisting of both a deterministic and a random component. The plots provide quick insight into the nature of the Delta v magnitude distribution for the TCM.
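A sketch reproducing the Monte Carlo technique described: sample a 3-D Gaussian random Delta v about a deterministic bias vector and tabulate magnitude statistics. The bias and per-axis standard deviation are illustrative.

```python
import numpy as np

# Monte Carlo Delta-v magnitude statistics for a single TCM with both a
# deterministic (bias) and a random component.
rng = np.random.default_rng(11)
bias = np.array([1.0, 0.0, 0.0])      # deterministic Delta-v, m/s (assumed)
sigma = 0.4                            # per-axis random std dev, m/s (assumed)

dv = bias + rng.normal(0.0, sigma, size=(1_000_000, 3))
mag = np.linalg.norm(dv, axis=1)

print(f"mean |dv| = {mag.mean():.3f} m/s")
for q in (50, 90, 99):
    print(f"{q:2d}th percentile: {np.percentile(mag, q):.3f} m/s")
```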
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals of economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
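In symbols, one plausible minimal statement of the model (the paper's estimator details may differ): the observation is the sum of a fractal stochastic and a deterministic part, and the fractional Brownian motion part is identified through the scaling of its wavelet coefficient variances.

```latex
% Observed series: fractional Brownian motion plus a band-limited signal
y(t) = B_H(t) + s(t)
% Wavelet coefficients d_{j,k} of B_H at scale a_j = 2^j (coarser for larger j)
% are zero-mean, with variance growing with scale set by the Hurst exponent H:
\operatorname{Var}(d_{j,k}) \;\propto\; a_j^{\,2H+1} \;=\; 2^{\,j(2H+1)}
```

Coefficients that deviate from this power law across scales are candidates for the deterministic component, which is what a Bayesian shrinkage rule can exploit.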
Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model
Nené, Nuno R.; Dunham, Alistair S.; Illingworth, Christopher J. R.
2018-01-01
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. PMID:29500183
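A minimal formalization of the idea, under our reading of the abstract (the paper's exact delay construction may differ): mutant frequencies follow the standard deterministic selection recursion, but a variant arising at time t0 only enters the recursion after an establishment delay τ that stands in for the stochastic, drift-dominated phase at low copy number.

```latex
% Deterministic selection recursion for a variant with selective advantage s
q_{t+1} \;=\; \frac{(1+s)\,q_t}{1 + s\,q_t},
\qquad
q_t = 0 \quad \text{for } t < t_0 + \tau .
```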
Ibrahim, Ahmad M.; Wilson, Paul P.H.; Sawan, Mohamed E.; ...
2015-06-30
The CADIS and FW-CADIS hybrid Monte Carlo/deterministic techniques dramatically increase the efficiency of neutronics modeling, but their use in the accurate design analysis of very large and geometrically complex nuclear systems has been limited by the large number of processors and memory requirements for their preliminary deterministic calculations and final Monte Carlo calculation. Three mesh adaptivity algorithms were developed to reduce the memory requirements of CADIS and FW-CADIS without sacrificing their efficiency improvement. First, a macromaterial approach enhances the fidelity of the deterministic models without changing the mesh. Second, a deterministic mesh refinement algorithm generates meshes that capture as much geometric detail as possible without exceeding a specified maximum number of mesh elements. Finally, a weight window coarsening algorithm decouples the weight window mesh and energy bins from the mesh and energy group structure of the deterministic calculations in order to remove the memory constraint of the weight window map from the deterministic mesh resolution. The three algorithms were used to enhance an FW-CADIS calculation of the prompt dose rate throughout the ITER experimental facility. Using these algorithms resulted in a 23.3% increase in the number of mesh tally elements in which the dose rates were calculated in a 10-day Monte Carlo calculation and, additionally, increased the efficiency of the Monte Carlo simulation by a factor of at least 3.4. The three algorithms enabled this difficult calculation to be accurately solved using an FW-CADIS simulation on a regular computer cluster, eliminating the need for a world-class supercomputer.
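The flavor of the weight-window machinery can be sketched in a few lines. This is a schematic under stated assumptions, not the production FW-CADIS implementation: the CADIS weight-window center is taken as the response estimate over the adjoint flux, and `coarsen_ww` merges mesh cells by keeping the per-block minimum, one plausible conservative choice for decoupling the weight-window map from the deterministic mesh resolution.

```python
import numpy as np

def cadis_ww_centers(adjoint_flux, response):
    """CADIS-style weight-window centers: w(r,E) ~ R / adjoint_flux(r,E).
    adjoint_flux: array over (mesh cells, energy groups)."""
    return response / np.maximum(adjoint_flux, 1e-30)

def coarsen_ww(ww, factor):
    """Weight-window coarsening: merge blocks of `factor` mesh cells,
    keeping the minimum (most conservative) value in each block, so the
    WW map memory no longer tracks the deterministic mesh."""
    n = (ww.shape[0] // factor) * factor
    return ww[:n].reshape(-1, factor, ww.shape[1]).min(axis=1)

phi_adj = np.random.default_rng(1).random((1000, 27)) + 0.01  # placeholder adjoint flux
ww = cadis_ww_centers(phi_adj, response=1.0)
print(coarsen_ww(ww, 10).shape)  # (100, 27): 10x fewer cells in the WW map
```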
Improving ground-penetrating radar data in sedimentary rocks using deterministic deconvolution
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.; Byrnes, A.P.
2003-01-01
Resolution is key to confidently identifying unique geologic features using ground-penetrating radar (GPR) data. Source wavelet "ringing" (related to bandwidth) in a GPR section limits resolution because of wavelet interference, and can smear reflections in time and/or space. The resultant potential for misinterpretation limits the usefulness of GPR. Deconvolution offers the ability to compress the source wavelet and improve temporal resolution. Unlike statistical deconvolution, deterministic deconvolution is mathematically simple and stable while providing the highest possible resolution because it uses the source wavelet unique to the specific radar equipment. Source wavelets generated in, transmitted through and acquired from air allow successful application of deterministic approaches to wavelet suppression. We demonstrate the validity of using a source wavelet acquired in air as the operator for deterministic deconvolution in a field application using "400-MHz" antennas at a quarry site characterized by interbedded carbonates with shale partings. We collected GPR data on a bench adjacent to cleanly exposed quarry faces in which we placed conductive rods to provide conclusive groundtruth for this approach to deconvolution. The best deconvolution results, which are confirmed by the conductive rods for the 400-MHz antenna tests, were observed for wavelets acquired when the transmitter and receiver were separated by 0.3 m. Applying deterministic deconvolution to GPR data collected in sedimentary strata at our study site resulted in an improvement in resolution (50%) and improved spatial location (0.10-0.15 m) of geologic features compared to the same data processed without deterministic deconvolution. The effectiveness of deterministic deconvolution for increased resolution and spatial accuracy of specific geologic features is further demonstrated by comparing results of deconvolved data with nondeconvolved data acquired along a 30-m transect immediately adjacent to a fresh quarry face. The results at this site support using deterministic deconvolution, which incorporates the GPR instrument's unique source wavelet, as a standard part of routine GPR data processing. © 2003 Elsevier B.V. All rights reserved.
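The core operation is standard frequency-domain deterministic deconvolution with the measured air-wave wavelet as the operator. Below is a minimal sketch with water-level stabilization; the paper's actual operator design and parameters may differ.

```python
import numpy as np

def deterministic_deconv(trace, wavelet, water_level=0.01):
    """Frequency-domain deterministic deconvolution: divide the trace
    spectrum by the measured source-wavelet spectrum, stabilized with a
    water level to avoid blow-up where the wavelet spectrum is weak."""
    n = len(trace) + len(wavelet) - 1
    T = np.fft.rfft(trace, n)
    W = np.fft.rfft(wavelet, n)
    eps = water_level * np.max(np.abs(W))
    D = T * np.conj(W) / (np.abs(W) ** 2 + eps ** 2)
    return np.fft.irfft(D, n)[: len(trace)]
```

The `water_level` parameter trades resolution against noise amplification at frequencies where the wavelet carries little energy.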
Expansion or extinction: deterministic and stochastic two-patch models with Allee effects.
Kang, Yun; Lanchier, Nicolas
2011-06-01
We investigate the impact of Allee effect and dispersal on the long-term evolution of a population in a patchy environment. Our main focus is on whether a population already established in one patch either successfully invades an adjacent empty patch or undergoes a global extinction. Our study is based on the combination of analytical and numerical results for both a deterministic two-patch model and a stochastic counterpart. The deterministic model has either two, three or four attractors. The existence of a regime with exactly three attractors only appears when patches have distinct Allee thresholds. In the presence of weak dispersal, the analysis of the deterministic model shows that a high-density and a low-density population can coexist at equilibrium in nearby patches, whereas the analysis of the stochastic model indicates that this equilibrium is metastable, thus leading after a large random time to either a global expansion or a global extinction. Up to some critical dispersal, increasing the intensity of the interactions leads to an increase of both the basin of attraction of the global extinction and the basin of attraction of the global expansion. Above this threshold, for both the deterministic and the stochastic models, the patches tend to synchronize as the intensity of the dispersal increases. This results in either a global expansion or a global extinction. For the deterministic model, there are only two attractors, while the stochastic model no longer exhibits a metastable behavior. In the presence of strong dispersal, the limiting behavior is entirely determined by the value of the Allee thresholds as the global population size in the deterministic and the stochastic models evolves as dictated by their single-patch counterparts. For all values of the dispersal parameter, Allee effects promote global extinction in terms of an expansion of the basin of attraction of the extinction equilibrium for the deterministic model and an increase of the probability of extinction for the stochastic model.
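A minimal deterministic two-patch sketch consistent with the abstract: cubic Allee growth in each patch, with distinct thresholds, coupled by symmetric dispersal. All parameter values are illustrative only.

```python
import numpy as np
from scipy.integrate import solve_ivp

def two_patch_allee(t, u, r, a1, a2, D):
    """Cubic Allee growth in each patch (thresholds a1, a2), coupled by
    symmetric dispersal of intensity D."""
    x, y = u
    dx = r * x * (x - a1) * (1 - x) + D * (y - x)
    dy = r * y * (y - a2) * (1 - y) + D * (x - y)
    return [dx, dy]

# One patch established (x = 1), the other empty (y = 0); weak dispersal.
sol = solve_ivp(two_patch_allee, (0, 200), [1.0, 0.0],
                args=(1.0, 0.2, 0.3, 0.01), dense_output=True)
print(sol.y[:, -1])  # ends near (0.99, 0.04): high- and low-density coexistence
```

With weak dispersal (D = 0.01) the run settles at the mixed high-density/low-density equilibrium the abstract describes; raising D synchronizes the patches toward global expansion or extinction.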
Estimating the epidemic threshold on networks by deterministic connections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Kezan, E-mail: lkzzr@sohu.com; Zhu, Guanghu; Fu, Xinchu
2014-12-15
For many epidemic networks some connections between nodes are treated as deterministic, while the remainder are random and have different connection probabilities. By applying spectral analysis to several constructed models, we find that one can estimate the epidemic thresholds of these networks by investigating information from only the deterministic connections. Nonetheless, in these models, generic nonuniform stochastic connections and heterogeneous community structure are also considered. The estimation of epidemic thresholds is achieved via inequalities with upper and lower bounds, which are found to be in very good agreement with numerical simulations. Since these deterministic connections are easier to detect than those stochastic connections, this work provides a feasible and effective method to estimate the epidemic thresholds in real epidemic networks.
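For intuition, here is the standard mean-field estimate that such spectral bounds refine: the epidemic threshold scales as the inverse of the largest adjacency eigenvalue, computed here from the deterministic connections alone. The toy network is hypothetical.

```python
import numpy as np

def epidemic_threshold(adj):
    """Mean-field epidemic threshold of a contact network: tau_c ~ 1/lambda_max
    of the adjacency matrix (applied here to the deterministic connections only)."""
    lam_max = np.max(np.linalg.eigvalsh(adj))
    return 1.0 / lam_max

# Deterministic backbone of a toy 4-node network (symmetric 0/1 adjacency).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
print(f"estimated threshold: {epidemic_threshold(A):.3f}")
```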
Experimental demonstration on the deterministic quantum key distribution based on entangled photons.
Chen, Hua; Zhou, Zhi-Yuan; Zangana, Alaa Jabbar Jumaah; Yin, Zhen-Qiang; Wu, Juan; Han, Yun-Guang; Wang, Shuang; Li, Hong-Wei; He, De-Yong; Tawfeeq, Shelan Khasro; Shi, Bao-Sen; Guo, Guang-Can; Chen, Wei; Han, Zheng-Fu
2016-02-10
As an important resource, entanglement light source has been used in developing quantum information technologies, such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment which follows a modified "Ping-Pong" (PP) protocol. The experiment results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entanglement light source in real-life fiber-based quantum communications.
Experimental demonstration on the deterministic quantum key distribution based on entangled photons
Chen, Hua; Zhou, Zhi-Yuan; Zangana, Alaa Jabbar Jumaah; Yin, Zhen-Qiang; Wu, Juan; Han, Yun-Guang; Wang, Shuang; Li, Hong-Wei; He, De-Yong; Tawfeeq, Shelan Khasro; Shi, Bao-Sen; Guo, Guang-Can; Chen, Wei; Han, Zheng-Fu
2016-01-01
As an important resource, entanglement light source has been used in developing quantum information technologies, such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment which follows a modified “Ping-Pong” (PP) protocol. The experiment results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entanglement light source in real-life fiber-based quantum communications. PMID:26860582
Characterization of normality of chaotic systems including prediction and detection of anomalies
NASA Astrophysics Data System (ADS)
Engler, Joseph John
Accurate prediction and control pervades domains such as engineering, physics, chemistry, and biology. Often, it is discovered that the systems under consideration cannot be well represented by linear, periodic, or random data. It has been shown that these systems exhibit deterministic chaos behavior. Deterministic chaos describes systems which are governed by deterministic rules but whose data appear to be random or quasi-periodic distributions. Deterministically chaotic systems characteristically exhibit sensitive dependence upon initial conditions manifested through rapid divergence of states initially close to one another. Due to this characterization, it has been deemed impossible to accurately predict future states of these systems for longer time scales. Fortunately, the deterministic nature of these systems allows for accurate short term predictions, given the dynamics of the system are well understood. This fact has been exploited in the research community and has resulted in various algorithms for short term predictions. Detection of normality in deterministically chaotic systems is critical in understanding the system sufficiently to be able to predict future states. Due to the sensitivity to initial conditions, the detection of normal operational states for a deterministically chaotic system can be challenging. The addition of small perturbations to the system, which may result in bifurcation of the normal states, further complicates the problem. The detection of anomalies and prediction of future states of the chaotic system allows for greater understanding of these systems. The goal of this research is to produce methodologies for determining states of normality for deterministically chaotic systems, detection of anomalous behavior, and the more accurate prediction of future states of the system. Additionally, the ability to detect subtle system state changes is discussed. The dissertation addresses these goals by proposing new representational techniques and novel prediction methodologies. The value and efficiency of these methods are explored in various case studies. Presented is an overview of chaotic systems with examples taken from the real world. A representation schema for rapid understanding of the various states of deterministically chaotic systems is presented. This schema is then used to detect anomalies and system state changes. Additionally, a novel prediction methodology which utilizes Lyapunov exponents to facilitate longer term prediction accuracy is presented and compared with other nonlinear prediction methodologies. These novel methodologies are then demonstrated on applications such as wind energy, cyber security and classification of social networks.
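The sensitivity to initial conditions that limits long-term prediction is quantified by the largest Lyapunov exponent; a positive value gives the exponential divergence rate and hence the usable prediction horizon. Here is a self-contained illustration on the logistic map (not one of the dissertation's case studies):

```python
import numpy as np

def logistic_lyapunov(r=4.0, x0=0.2, n=100_000, burn=1000):
    """Largest Lyapunov exponent of the logistic map x -> r x (1 - x),
    estimated as the orbit average of log|f'(x)|; a positive value
    indicates deterministic chaos."""
    x, acc = x0, 0.0
    for i in range(n):
        x = r * x * (1.0 - x)
        if i >= burn:
            acc += np.log(abs(r * (1.0 - 2.0 * x)))
    return acc / (n - burn)

print(f"lambda ~ {logistic_lyapunov():.4f}   (theory for r=4: ln 2 ~ 0.6931)")
```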
Inferring Fitness Effects from Time-Resolved Sequence Data with a Delay-Deterministic Model.
Nené, Nuno R; Dunham, Alistair S; Illingworth, Christopher J R
2018-05-01
A common challenge arising from the observation of an evolutionary system over time is to infer the magnitude of selection acting upon a specific genetic variant, or variants, within the population. The inference of selection may be confounded by the effects of genetic drift in a system, leading to the development of inference procedures to account for these effects. However, recent work has suggested that deterministic models of evolution may be effective in capturing the effects of selection even under complex models of demography, suggesting the more general application of deterministic approaches to inference. Responding to this literature, we here note a case in which a deterministic model of evolution may give highly misleading inferences, resulting from the nondeterministic properties of mutation in a finite population. We propose an alternative approach that acts to correct for this error, and which we denote the delay-deterministic model. Applying our model to a simple evolutionary system, we demonstrate its performance in quantifying the extent of selection acting within that system. We further consider the application of our model to sequence data from an evolutionary experiment. We outline scenarios in which our model may produce improved results for the inference of selection, noting that such situations can be easily identified via the use of a regular deterministic model. Copyright © 2018 Nené et al.
Liu, Charles Y; Spicer, Mark; Apuzzo, Michael L J
2003-01-01
The future development of the neurosurgical operative environment is driven principally by concurrent development in science and technology. In the new millennium, these developments are taking on a Jules Verne quality, with the ability to construct and manipulate the human organism and its surroundings at the level of atoms and molecules seemingly at hand. Thus, an examination of currents in technology advancement from the neurosurgical perspective can provide insight into the evolution of the neurosurgical operative environment. In the future, the optimal design solution for the operative environment requirements of specialized neurosurgery may take the form of composites of venues that are currently mutually distinct. Advances in microfabrication technology and laser optical manipulators are expanding the scope and role of robotics, with novel opportunities for bionic integration. Assimilation of biosensor technology into the operative environment promises to provide neurosurgeons of the future with a vastly expanded set of physiological data, which will require concurrent simplification and optimization of analysis and presentation schemes to facilitate practical usefulness. Nanotechnology derivatives are shattering the maximum limits of resolution and magnification allowed by conventional microscopes. Furthermore, quantum computing and molecular electronics promise to greatly enhance computational power, allowing the emerging reality of simulation and virtual neurosurgery for rehearsal and training purposes. Progressive minimalism is evident throughout, leading ultimately to a paradigm shift as the nanoscale is approached. At the interface between the old and new technological paradigms, issues related to integration may dictate the ultimate emergence of the products of the new paradigm. Once initiated, however, history suggests that the process of change will proceed rapidly and dramatically, with the ultimate neurosurgical operative environment of the future being far more complex in functional capacity but strikingly simple in apparent form.
Virtual reality hardware for use in interactive 3D data fusion and visualization
NASA Astrophysics Data System (ADS)
Gourley, Christopher S.; Abidi, Mongi A.
1997-09-01
Virtual reality has become a tool for use in many areas of research. We have designed and built a VR system for use in range data fusion and visualization. One major VR tool is the CAVE. This is the ultimate visualization tool, but comes with a large price tag. Our design uses a unique CAVE whose graphics are powered by a desktop computer instead of a larger rack machine, making it much less costly. The system consists of a screen eight feet tall by twenty-seven feet wide giving a variable field-of-view currently set at 160 degrees. A Silicon Graphics Indigo2 MaxImpact with the impact channel option is used for display. This gives the capability to drive three projectors at a resolution of 640 by 480 for use in displaying the virtual environment and one 640 by 480 display for a user control interface. This machine is also the first desktop package which has built-in hardware texture mapping. This feature allows us to quickly fuse the range and intensity data and other multi-sensory data. The final goal is a complete 3D texture mapped model of the environment. A dataglove, magnetic tracker, and spaceball are to be used for manipulation of the data and navigation through the virtual environment. This system gives several users the ability to interactively create 3D models from multiple range images.
NASA Astrophysics Data System (ADS)
Barrow, John D.; Davies, Paul C. W.; Harper, Charles L., Jr.
2004-06-01
This preview of the future of physics comprises contributions from recognized authorities inspired by the pioneering work of John Wheeler. Quantum theory represents a unifying theme within the book, as it relates to the topics of the nature of physical reality, cosmic inflation, the arrow of time, models of the universe, superstrings, quantum gravity and cosmology. Attempts to formulate a final unification theory of physics are also considered, along with the existence of hidden dimensions of space, hidden cosmic matter, and the strange world of quantum technology. John Archibald Wheeler is one of the most influential scientists of the twentieth century. His extraordinary career has spanned momentous advances in physics, from the birth of the nuclear age to the conception of the quantum computer. Famous for coining the term "black hole," Professor Wheeler helped lay the foundations for the rebirth of gravitation as a mainstream branch of science, triggering the explosive growth in astrophysics and cosmology that followed. His early contributions to physics include the S matrix, the theory of nuclear rotation (with Edward Teller), the theory of nuclear fission (with Niels Bohr), action-at-a-distance electrodynamics (with Richard Feynman), positrons as backward-in-time electrons, the universal Fermi interaction (with Jayme Tiomno), muonic atoms, and the collective model of the nucleus. His inimitable style of thinking, quirky wit, and love of the bizarre have inspired generations of physicists.
Controllability of Deterministic Networks with the Identical Degree Sequence
Ma, Xiujuan; Zhao, Haixing; Wang, Binghong
2015-01-01
Controlling complex networks is an essential problem in network science and engineering. Recent advances indicate that the controllability of a complex network depends on the network's topology. Liu, Barabási and colleagues speculated that the degree distribution was one of the most important factors affecting controllability for arbitrary complex directed networks with random link weights. In this paper, we analysed the effect of degree distribution on the controllability of unweighted, undirected deterministic networks. We introduce a class of deterministic networks with identical degree sequence, called (x,y)-flowers. We analysed the controllability of two deterministic networks ((1, 3)-flower and (2, 2)-flower) in detail by exact controllability theory and give accurate results for the minimum number of driver nodes for the two networks. In simulations, we compare the controllability of (x,y)-flower networks. Our results show that the family of (x,y)-flower networks have the same degree sequence, but their controllability is totally different. So the degree distribution itself is not sufficient to characterize the controllability of unweighted, undirected deterministic networks. PMID:26020920
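Under exact controllability theory, for an undirected, unweighted network the minimum number of driver nodes equals the largest multiplicity among the adjacency eigenvalues. A small sketch follows; the numerical tolerance is an implementation choice, not from the paper.

```python
import numpy as np
from collections import Counter

def min_driver_nodes(adj, tol=1e-8):
    """Exact controllability for an undirected, unweighted network:
    the minimum number of driver nodes is the largest eigenvalue
    multiplicity of the (symmetric) adjacency matrix."""
    eig = np.linalg.eigvalsh(adj)
    counts = Counter(np.round(eig / tol).astype(np.int64))  # group within tol
    return max(counts.values())

# Star graph on 4 nodes: eigenvalues are +/- sqrt(3) and 0 (twice).
star = np.array([[0, 1, 1, 1],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0],
                 [1, 0, 0, 0]], float)
print(min_driver_nodes(star))  # 2: eigenvalue 0 has multiplicity 2
```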
Inverse kinematic problem for a random gradient medium in geometric optics approximation
NASA Astrophysics Data System (ADS)
Petersen, N. V.
1990-03-01
Scattering at random inhomogeneities in a gradient medium results in systematic deviations of the rays and travel times of refracted body waves from those corresponding to the deterministic velocity component. The character of the difference depends on the parameters of the deterministic and random velocity component. However, at great distances to the source, independently of the velocity parameters (weakly or strongly inhomogeneous medium), the most probable depth of the ray turning point is smaller than that corresponding to the deterministic velocity component, the most probable travel times also being lower. The relative uncertainty in the deterministic velocity component, derived from the mean travel times using methods developed for laterally homogeneous media (for instance, the Herglotz-Wiechert method), is systematic in character, but does not exceed the contrast of velocity inhomogeneities by magnitude. The gradient of the deterministic velocity component has a significant effect on the travel-time fluctuations. The variance at great distances to the source is mainly controlled by shallow inhomogeneities. The travel-time fluctuations are studied only for weakly inhomogeneous media.
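For reference, here is the classical flat-earth Herglotz-Wiechert inversion the abstract invokes, which recovers the turning depth of a ray from the observed travel-time curve; it is exact only for a laterally homogeneous medium, which is why the random component induces the systematic bias described above.

```latex
% Turning depth of the ray with ray parameter p_1, from the distance-dependent
% ray parameter p(X) = dT/dX of the observed travel-time curve:
z(p_1) \;=\; \frac{1}{\pi}\int_{0}^{X_1} \cosh^{-1}\!\left(\frac{p(X)}{p_1}\right) dX,
\qquad
v\bigl(z(p_1)\bigr) = \frac{1}{p_1},
% where X_1 is the epicentral distance at which p(X) = p_1.
```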
The role of religious values in decisions about genetics and the public's health.
Modell, Stephen M; Citrin, Toby; King, Susan B; Kardia, Sharon L R
2014-06-01
The latest health care legislation, which promotes prevention and health screening, ultimately depends for its success on recognition of people's values concerning the technologies being employed, not just the interventions' technical virtues. Values concerning the deterministic nature of a condition and what groups should be targeted rest on a sense of what is morally, often religiously right in a given health circumstance. This paper looks at a number of leading-edge case examples--breast cancer genetic screening and family decision-making, and newborn screening and biobanks--in examining how the choices made at the individual, family, and societal levels rest on faith in a higher source of efficacy and moral perspectives on the measures that can be taken. Qualitative responses expressing people's attitudes toward these technologies underscore the importance of considering faith-based values in individual decisions and collective policies on their use. These examples are considered in the context of the historic interplay between science and religion and recent definitions and models of health which incorporate physical, emotional, and social elements, and most importantly, are expanding to incorporate the religious and spiritual values domains.
Design Considerations of Polishing Lap for Computer-Controlled Cylindrical Polishing Process
NASA Technical Reports Server (NTRS)
Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
The future X-ray observatory missions, such as the International X-ray Observatory, require grazing incidence replicated optics of extremely large collecting area (3 m²) in combination with angular resolution of less than 5 arcsec half-power diameter. The resolution of a mirror shell depends ultimately on the quality of the cylindrical mandrel from which it is replicated. Mid-spatial-frequency axial figure error is a dominant contributor in the error budget of the mandrel. This paper presents our efforts to develop a deterministic cylindrical polishing process in order to keep the mid-spatial-frequency axial figure errors to a minimum. Simulation studies have been performed to optimize the operational parameters as well as the polishing lap configuration. Furthermore, depending upon the surface error profile, a model for localized polishing based on a dwell time approach is developed. Using the inputs from the mathematical model, a mandrel having a conically approximated Wolter-1 geometry has been polished on a newly developed computer-controlled cylindrical polishing machine. We report our first experimental results and discuss plans for further improvements in the polishing process.
Approximate Dynamic Programming: Combining Regional and Local State Following Approximations.
Deptula, Patryk; Rosenfeld, Joel A; Kamalapurkar, Rushikesh; Dixon, Warren E
2018-06-01
An infinite-horizon optimal regulation problem for a control-affine deterministic system is solved online using a local state following (StaF) kernel and a regional model-based reinforcement learning (R-MBRL) method to approximate the value function. Unlike traditional methods such as R-MBRL that aim to approximate the value function over a large compact set, the StaF kernel approach aims to approximate the value function in a local neighborhood of the state that travels within a compact set. In this paper, the value function is approximated using a state-dependent convex combination of the StaF-based and the R-MBRL-based approximations. As the state enters a neighborhood containing the origin, the value function transitions from being approximated by the StaF approach to the R-MBRL approach. Semiglobal uniformly ultimately bounded (SGUUB) convergence of the system states to the origin is established using a Lyapunov-based analysis. Simulation results are provided for two-, three-, six-, and ten-state dynamical systems to demonstrate the scalability and performance of the developed method.
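The state-dependent convex combination can be sketched directly. The radii and the linear transition below are illustrative assumptions, not the paper's scheme:

```python
import numpy as np

def blended_value(x, v_staf, v_rmbrl, r_inner=0.5, r_outer=2.0):
    """State-dependent convex blend of a local (StaF) and a regional
    (R-MBRL) value-function approximation: pure R-MBRL near the origin,
    pure StaF far away, linear transition in between. The radii are
    hypothetical tuning parameters."""
    r = np.linalg.norm(x)
    lam = np.clip((r - r_inner) / (r_outer - r_inner), 0.0, 1.0)
    return lam * v_staf(x) + (1.0 - lam) * v_rmbrl(x)

# Toy usage with placeholder quadratic approximations.
print(blended_value(np.array([1.0, 1.0]),
                    v_staf=lambda x: 1.1 * x @ x,
                    v_rmbrl=lambda x: x @ x))
```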
Quasi-Static Probabilistic Structural Analyses Process and Criteria
NASA Technical Reports Server (NTRS)
Goldberg, B.; Verderaime, V.
1999-01-01
Current deterministic structural methods are easily applied to substructures and components, and analysts have built great design insights and confidence in them over the years. However, deterministic methods cannot support systems risk analyses, and it was recently reported that deterministic treatment of statistical data is inconsistent with error propagation laws that can result in unevenly conservative structural predictions. Assuming normal distributions and using statistical data formats throughout prevailing stress deterministic processes lead to a safety factor in statistical format, which, integrated into the safety index, provides a safety factor and first-order reliability relationship. The embedded safety factor in the safety index expression allows a historically based risk to be determined and verified over a variety of quasi-static metallic substructures consistent with the traditional safety factor methods and NASA Std. 5001 criteria.
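The safety-factor/safety-index relationship referred to here has a standard first-order form when resistive stress R and applied stress S are normally distributed; this is a textbook statement consistent with, though not necessarily identical to, the paper's derivation.

```latex
% Safety (reliability) index for normal R and S, and its expression in
% terms of the central safety factor SF = mu_R / mu_S:
\beta \;=\; \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^{2} + \sigma_S^{2}}}
\;=\; \frac{SF - 1}{\sqrt{SF^{2}\,V_R^{2} + V_S^{2}}},
\qquad
P_f = \Phi(-\beta),
% where V_R, V_S are the coefficients of variation of resistance and load.
```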
Effect of Uncertainty on Deterministic Runway Scheduling
NASA Technical Reports Server (NTRS)
Gupta, Gautam; Malik, Waqar; Jung, Yoon C.
2012-01-01
Active runway scheduling involves scheduling departures for takeoffs and arrivals for runway crossing subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-serve scheme. In particular, the sequence from a deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison is done for both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. The modeling of uncertainty is done in two ways: as equal uncertainty in availability at the runway for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than first-come-first-serve in both system performance and predictability.
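The sequence-freezing step is simple to state in code. A minimal sketch with a single uniform separation value; real runway separations depend on aircraft weight class and operation type.

```python
def freeze_and_retime(sequence, ready_times, sep=60.0):
    """Keep the deterministic scheduler's sequence fixed and push each
    runway time to the later of the aircraft's ready time and the
    previous operation plus the required separation (seconds)."""
    times, t_prev = [], float("-inf")
    for ac in sequence:
        t = max(ready_times[ac], t_prev + sep)
        times.append((ac, t))
        t_prev = t
    return times

# Toy example: the optimized sequence differs from the FCFS readiness order.
ready = {"AAL1": 0.0, "UAL2": 10.0, "DAL3": 15.0}
print(freeze_and_retime(["AAL1", "DAL3", "UAL2"], ready))
```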
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology.
Schaff, James C; Gao, Fei; Li, Ye; Novak, Igor L; Slepchenko, Boris M
2016-12-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium 'sparks' as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell.
Efficient room-temperature source of polarized single photons
Lukishova, Svetlana G.; Boyd, Robert W.; Stroud, Carlos R.
2007-08-07
An efficient technique for producing deterministically polarized single photons uses liquid-crystal hosts of either monomeric or oligomeric/polymeric form to preferentially align the single emitters for maximum excitation efficiency. Deterministic molecular alignment also provides deterministically polarized output photons. Planar-aligned cholesteric liquid crystal hosts serve as 1-D photonic-band-gap microcavities, tunable to the emitter fluorescence band, to increase source efficiency, while liquid crystal technology prevents emitter bleaching. Emitters comprise soluble dyes, inorganic nanocrystals, or trivalent rare-earth chelates.
NASA Astrophysics Data System (ADS)
Chuah, Kee Man; Chen, Chwen Jen; Teh, Chee Siong
Virtual reality (VR) has been prevalently used as a tool to help students learn and to simulate situations that are too hazardous to practice in real life. The present study aims to explore the capability of VR to achieve these two purposes and demonstrate a novel application of the result, using VR to help school students learn road safety skills, which are impractical to practice in real-life situations. This paper describes the system design of the VR-based learning environment known as Virtual Simulated Traffics for Road Safety Education (ViSTREET) and its various features. An overview of the technical procedures for its development is also included. Ultimately, this paper highlights the potential use of VR in addressing the learning problem concerning the road safety education programme in Malaysia.
Bub, Barry
2007-01-01
The reality and prevalence of suicide presents us with myriad questions and levels of concern. What does the dying person do in the face of seemingly endless adversity, loss, fear of abandonment, disfigurement, dependence, and/or unmitigated pain? Everyone working in end-of-life palliative care keeps these very concerns foremost in heart and mind. Yet how can providers process the unfinished spiritual and emotional business that remains when one of our former patients indeed nosedives into Jobian loss and makes the ultimate decision? This dilemma has been a constant companion in the end-of-life terrain, and our guest author Dr Barry Bub brings a wealth of insight into the healing power of authentic emotion in his narrative about the death of the gourmet chef named Ben.
Drug-induced ego states. I. Cocaine: phenomenology and implications.
Spotts, J V; Shontz, F C
1984-04-01
The ego state experienced by chronic users of cocaine is described in terms of sensorimotor functioning, cognitive functioning, emotionality, spatiality, temporality, causality, and materiality. At low use levels the state is pleasurable, but at high levels fear, anxiety, and paranoia increase, and ultimately reality contact breaks down. Q-sort, Semantic Differential, and other data suggest that low-level users take cocaine to overcome personal insecurities and relieve boredom. Heavy users take it to support overvaulting ambitions and intense strivings for self-sufficiency. Psychotherapy with such persons must deal with their counterdependency, anger, and despair, and with their underlying sense of betrayal. These persons have unacknowledged needs for spiritual experience that must be dealt with openly. A description of persons most vulnerable to heavy use of cocaine is provided, and recommendations for research and social policy are presented.
Monte Carlo simulations of medical imaging modalities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estes, G.P.
Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.
Reconstructing each cell's genome within complex microbial communities-dream or reality?
Clingenpeel, Scott; Clum, Alicia; Schwientek, Patrick; Rinke, Christian; Woyke, Tanja
2014-01-01
As the vast majority of microorganisms have yet to be cultivated in a laboratory setting, access to their genetic makeup has largely been limited to cultivation-independent methods. These methods, namely metagenomics and more recently single-cell genomics, have become cornerstones for microbial ecology and environmental microbiology. One ultimate goal is the recovery of genome sequences from each cell within an environment to move toward a better understanding of community metabolic potential and to provide substrate for experimental work. As single-cell sequencing has the ability to decipher all sequence information contained in an individual cell, this method holds great promise in tackling such a challenge. Methodological limitations and inherent biases do exist, however, and will be discussed here based on environmental and benchmark data to assess how far we are from reaching this goal.
Nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates
Melechko, Anatoli V [Oak Ridge, TN; McKnight, Timothy E [Greenback, TN; Guillorn, Michael A [Ithaca, NY; Ilic, Bojan [Ithaca, NY; Merkulov, Vladimir I [Knoxville, TN; Doktycz, Mitchel J [Knoxville, TN; Lowndes, Douglas H [Knoxville, TN; Simpson, Michael L [Knoxville, TN
2011-08-23
Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. An apparatus, includes a substrate and a nanoreplicant structure coupled to a surface of the substrate.
Numerical Approach to Spatial Deterministic-Stochastic Models Arising in Cell Biology
Gao, Fei; Li, Ye; Novak, Igor L.; Slepchenko, Boris M.
2016-01-01
Hybrid deterministic-stochastic methods provide an efficient alternative to a fully stochastic treatment of models which include components with disparate levels of stochasticity. However, general-purpose hybrid solvers for spatially resolved simulations of reaction-diffusion systems are not widely available. Here we describe fundamentals of a general-purpose spatial hybrid method. The method generates realizations of a spatially inhomogeneous hybrid system by appropriately integrating capabilities of a deterministic partial differential equation solver with a popular particle-based stochastic simulator, Smoldyn. Rigorous validation of the algorithm is detailed, using a simple model of calcium ‘sparks’ as a testbed. The solver is then applied to a deterministic-stochastic model of spontaneous emergence of cell polarity. The approach is general enough to be implemented within biologist-friendly software frameworks such as Virtual Cell. PMID:27959915
Stochasticity and determinism in models of hematopoiesis.
Kimmel, Marek
2014-01-01
This chapter represents a novel view of modeling in hematopoiesis, synthesizing both deterministic and stochastic approaches. Whereas the stochastic models work in situations where chance dominates, for example when the number of cells is small, or under random mutations, the deterministic models are more important for large-scale, normal hematopoiesis. New types of models are on the horizon. These models attempt to account for distributed environments such as hematopoietic niches and their impact on dynamics. Mixed effects of such structures and chance events are largely unknown and constitute both a challenge and promise for modeling. Our discussion is presented under the separate headings of deterministic and stochastic modeling; however, the connections between both are frequently mentioned. Four case studies are included to elucidate important examples. We also include a primer of deterministic and stochastic dynamics for the reader's use.
Hybrid deterministic/stochastic simulation of complex biochemical systems.
Lecca, Paola; Bagagiolo, Fabio; Scarpa, Marina
2017-11-21
In a biological cell, cellular functions and the genetic regulatory apparatus are implemented and controlled by complex networks of chemical reactions involving genes, proteins, and enzymes. Accurate computational models are indispensable means for understanding the mechanisms behind the evolution of a complex system, not always explored with wet lab experiments. To serve their purpose, computational models, however, should be able to describe and simulate the complexity of a biological system in many of its aspects. Moreover, they should be implemented by efficient algorithms requiring the shortest possible execution time, to avoid excessively enlarging the time elapsing between data analysis and any subsequent experiment. Besides the features of their topological structure, the complexity of biological networks also refers to their dynamics, which are often non-linear and stiff. The stiffness is due to the presence of molecular species whose abundance fluctuates by many orders of magnitude. A fully stochastic simulation of a stiff system is computationally time-expensive. On the other hand, continuous models are less costly, but they fail to capture the stochastic behaviour of small populations of molecular species. We introduce a new efficient hybrid stochastic-deterministic computational model and the software tool MoBioS (MOlecular Biology Simulator) implementing it. The mathematical model of MoBioS uses continuous differential equations to describe the deterministic reactions and a Gillespie-like algorithm to describe the stochastic ones. Unlike the majority of current hybrid methods, the MoBioS algorithm divides the reactions' set into fast reactions, moderate reactions, and slow reactions and implements a hysteresis switching between the stochastic model and the deterministic model. Fast reactions are approximated as continuous-deterministic processes and modelled by deterministic rate equations. Moderate reactions are those whose reaction waiting time is greater than the fast reaction waiting time but smaller than the slow reaction waiting time. A moderate reaction is approximated as a stochastic (deterministic) process if it was classified as a stochastic (deterministic) process at the time at which it crosses the threshold of low (high) waiting time. A Gillespie First Reaction Method is implemented to select and execute the slow reactions. The performance of MoBioS was tested on a typical example of hybrid dynamics: the regulation of DNA transcription. The simulated dynamic profile of the reagents' abundance and the estimate of the error introduced by the fully deterministic approach were used to evaluate the consistency of the computational model and that of the software tool.
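The hysteresis switching can be sketched as follows; the thresholds and labels are illustrative, not the MoBioS source.

```python
def classify_reactions(waiting_times, fast_thr, slow_thr, previous):
    """Hysteresis partitioning in the spirit of MoBioS: reactions with
    expected waiting time below fast_thr are treated deterministically,
    those above slow_thr stochastically; 'moderate' reactions keep their
    previous classification until they cross the opposite threshold."""
    labels = {}
    for rxn, tau in waiting_times.items():
        if tau < fast_thr:
            labels[rxn] = "deterministic"
        elif tau > slow_thr:
            labels[rxn] = "stochastic"
        else:
            labels[rxn] = previous.get(rxn, "stochastic")
    return labels

# Toy usage: reaction R2 sits in the moderate band and keeps its old label.
print(classify_reactions({"R1": 0.01, "R2": 0.5, "R3": 10.0},
                         fast_thr=0.1, slow_thr=1.0, previous={"R2": "deterministic"}))
```

The two thresholds prevent a moderate reaction from oscillating between the deterministic and stochastic regimes at every step.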
Failed rib region prediction in a human body model during crash events with precrash braking.
Guleyupoglu, B; Koya, B; Barnard, R; Gayzik, F S
2018-02-28
The objective of this study is twofold. We used a validated human body finite element model to study the predicted chest injury (focusing on rib fracture as a function of element strain) based on varying levels of simulated precrash braking. Furthermore, we compare deterministic and probabilistic methods of rib injury prediction in the computational model. The Global Human Body Models Consortium (GHBMC) M50-O model was gravity settled in the driver position of a generic interior equipped with an advanced 3-point belt and airbag. Twelve cases were investigated with permutations for failure, precrash braking system, and crash severity. The severities used were median (17 kph), severe (34 kph), and New Car Assessment Program (NCAP; 56.4 kph). Cases with failure enabled removed rib cortical bone elements once 1.8% effective plastic strain was exceeded. Alternatively, a probabilistic framework found in the literature was used to predict rib failure. Both the probabilistic and deterministic methods take into consideration location (anterior, lateral, and posterior). The deterministic method is based on a rubric that defines failed rib regions dependent on a threshold for contiguous failed elements. The probabilistic method depends on age-based strain and failure functions. Kinematics between both methods were similar (peak max deviation: ΔX head = 17 mm; ΔZ head = 4 mm; ΔX thorax = 5 mm; ΔZ thorax = 1 mm). Seat belt forces at the time of probabilistic failed region initiation were lower than those at deterministic failed region initiation. The probabilistic method for rib fracture predicted more failed regions in the rib (an analog for fracture) than the deterministic method in all but 1 case where they were equal. The failed region patterns between models are similar; however, there are differences that arise due to stress reduced from element elimination that cause probabilistic failed regions to continue to rise after no deterministic failed region would be predicted. Both the probabilistic and deterministic methods indicate similar trends with regard to the effect of precrash braking; however, there are tradeoffs. The deterministic failed region method is more spatially sensitive to failure and is more sensitive to belt loads. The probabilistic failed region method allows for increased capability in postprocessing with respect to age. The probabilistic failed region method predicted more failed regions than the deterministic failed region method due to force distribution differences.
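To make the contrast concrete, here is a minimal sketch of a probabilistic element-failure rule of the general kind described: a normal CDF of strain about an age-shifted threshold, combined over a region under an independence assumption. All constants are placeholders and the independence assumption is a simplification; these are not the published GHBMC failure functions.

```python
import numpy as np
from math import erf, sqrt

def element_failure_prob(strain, age, mu0=0.02, mu_slope=-1e-4, sigma=0.004):
    """Illustrative age-dependent element failure probability: a normal
    CDF of strain about an age-shifted mean threshold. mu0, mu_slope,
    and sigma are hypothetical placeholder values."""
    mu = mu0 + mu_slope * (age - 50)
    return 0.5 * (1.0 + erf((strain - mu) / (sigma * sqrt(2.0))))

def region_failure_prob(strains, age):
    """Probability that at least one element in a rib region fails,
    assuming independent elements (a simplification)."""
    probs = [element_failure_prob(e, age) for e in strains]
    return 1.0 - np.prod([1.0 - p for p in probs])

print(f"region failure probability: {region_failure_prob([0.015, 0.019, 0.022], age=65):.3f}")
```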
Protocol-based care: the standardisation of decision-making?
Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra
2009-05-01
To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled; a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites were varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel to paying attention to the influence of context.
Psychosomatic medicine and the philosophy of life.
Schwartz, Michael A; Wiggins, Osborne P
2010-01-21
Basing ourselves on the writings of Hans Jonas, we offer to psychosomatic medicine a philosophy of life that surmounts the mind-body dualism which has plagued Western thought since the origins of modern science in seventeenth century Europe. Any present-day account of reality must draw upon everything we know about the living and the non-living. Since we are living beings ourselves, we know what it means to be alive from our own first-hand experience. Therefore, our philosophy of life, in addition to starting with what empirical science tells us about inorganic and organic reality, must also begin from our own direct experience of life in ourselves and in others; it can then show how the two meet in the living being. Since life is ultimately one reality, our theory must reintegrate psyche with soma such that no component of the whole is short-changed, neither the objective nor the subjective. In this essay, we lay out the foundational components of such a theory by clarifying the defining features of living beings as polarities. We describe three such polarities: 1) Being vs. non-being: Always threatened by non-being, the organism must constantly re-assert its being through its own activity. 2) World-relatedness vs. self-enclosure: Living beings are both enclosed with themselves, defined by the boundaries that separate them from their environment, while they are also ceaselessly reaching out to their environment and engaging in transactions with it. 3) Dependence vs. independence: Living beings are both dependent on the material components that constitute them at any given moment and independent of any particular groupings of these components over time. We then discuss important features of the polarities of life: Metabolism; organic structure; enclosure by a semi-permeable membrane; distinction between "self" and "other"; autonomy; neediness; teleology; sensitivity; values. Moral needs and values already arise at the most basic levels of life, even if only human beings can recognize such values as moral requirements and develop responses to them.
Psychosomatic medicine and the philosophy of life
2010-01-01
Basing ourselves on the writings of Hans Jonas, we offer to psychosomatic medicine a philosophy of life that surmounts the mind-body dualism which has plagued Western thought since the origins of modern science in seventeenth century Europe. Any present-day account of reality must draw upon everything we know about the living and the non-living. Since we are living beings ourselves, we know what it means to be alive from our own first-hand experience. Therefore, our philosophy of life, in addition to starting with what empirical science tells us about inorganic and organic reality, must also begin from our own direct experience of life in ourselves and in others; it can then show how the two meet in the living being. Since life is ultimately one reality, our theory must reintegrate psyche with soma such that no component of the whole is short-changed, neither the objective nor the subjective. In this essay, we lay out the foundational components of such a theory by clarifying the defining features of living beings as polarities. We describe three such polarities: 1) Being vs. non-being: Always threatened by non-being, the organism must constantly re-assert its being through its own activity. 2) World-relatedness vs. self-enclosure: Living beings are both enclosed with themselves, defined by the boundaries that separate them from their environment, while they are also ceaselessly reaching out to their environment and engaging in transactions with it. 3) Dependence vs. independence: Living beings are both dependent on the material components that constitute them at any given moment and independent of any particular groupings of these components over time. We then discuss important features of the polarities of life: Metabolism; organic structure; enclosure by a semi-permeable membrane; distinction between "self" and "other"; autonomy; neediness; teleology; sensitivity; values. Moral needs and values already arise at the most basic levels of life, even if only human beings can recognize such values as moral requirements and develop responses to them. PMID:20089202
Schrag, Yann; Tremea, Alessandro; Lagger, Cyril; Ohana, Noé; Mohr, Christine
2016-01-01
Studies indicated that people behave less responsibly after exposure to information containing deterministic statements as compared to free will statements or neutral statements. Thus, deterministic primes should lead to enhanced risk-taking behavior. We tested this prediction in two studies with healthy participants. In experiment 1, we tested 144 students (24 men) in the laboratory using the Iowa Gambling Task. In experiment 2, we tested 274 participants (104 men) online using the Balloon Analogue Risk Task. In the Iowa Gambling Task, the free will priming condition resulted in more risky decisions than both the deterministic and neutral priming conditions. We observed no priming effects on risk-taking behavior in the Balloon Analogue Risk Task. To explain these unpredicted findings, we consider the somatic marker hypothesis, a gain frequency approach as well as attention to gains and/or inattention to losses. In addition, we highlight the necessity to consider both pro free will and deterministic priming conditions in future studies. Importantly, our and previous results indicate that the effects of pro free will and deterministic priming do not oppose each other on a frequently assumed continuum. PMID:27018854
Schrag, Yann; Tremea, Alessandro; Lagger, Cyril; Ohana, Noé; Mohr, Christine
2016-01-01
Studies indicated that people behave less responsibly after exposure to information containing deterministic statements as compared to free will statements or neutral statements. Thus, deterministic primes should lead to enhanced risk-taking behavior. We tested this prediction in two studies with healthy participants. In experiment 1, we tested 144 students (24 men) in the laboratory using the Iowa Gambling Task. In experiment 2, we tested 274 participants (104 men) online using the Balloon Analogue Risk Task. In the Iowa Gambling Task, the free will priming condition resulted in more risky decisions than both the deterministic and neutral priming conditions. We observed no priming effects on risk-taking behavior in the Balloon Analogue Risk Task. To explain these unpredicted findings, we consider the somatic marker hypothesis, a gain frequency approach as well as attention to gains and/or inattention to losses. In addition, we highlight the necessity to consider both pro free will and deterministic priming conditions in future studies. Importantly, our and previous results indicate that the effects of pro free will and deterministic priming do not oppose each other on a frequently assumed continuum.
Ion implantation for deterministic single atom devices
NASA Astrophysics Data System (ADS)
Pacheco, J. L.; Singh, M.; Perry, D. L.; Wendt, J. R.; Ten Eyck, G.; Manginell, R. P.; Pluym, T.; Luhman, D. R.; Lilly, M. P.; Carroll, M. S.; Bielejec, E.
2017-12-01
We demonstrate a capability of deterministic doping at the single atom level using a combination of direct write focused ion beam and solid-state ion detectors. The focused ion beam system can position a single ion to within 35 nm of a targeted location and the detection system is sensitive to single low energy heavy ions. This platform can be used to deterministically fabricate single atom devices in materials where the nanostructure and ion detectors can be integrated, including donor-based qubits in Si and color centers in diamond.
Counterfactual Quantum Deterministic Key Distribution
NASA Astrophysics Data System (ADS)
Zhang, Sheng; Wang, Jian; Tang, Chao-Jing
2013-01-01
We propose a new counterfactual quantum cryptography protocol for distributing a deterministic key. By adding a controlled blocking operation module to the original protocol [T.G. Noh, Phys. Rev. Lett. 103 (2009) 230501], the correlation between the polarizations of the two parties, Alice and Bob, is extended; therefore, one can distribute both deterministic keys and random ones using our protocol. We have also given a simple proof of the security of our protocol using the technique we previously applied to the original protocol. Most importantly, our analysis produces a bound tighter than the existing ones.
Ion implantation for deterministic single atom devices
Pacheco, J. L.; Singh, M.; Perry, D. L.; ...
2017-12-04
Here, we demonstrate a capability of deterministic doping at the single atom level using a combination of direct write focused ion beam and solid-state ion detectors. The focused ion beam system can position a single ion to within 35 nm of a targeted location and the detection system is sensitive to single low energy heavy ions. This platform can be used to deterministically fabricate single atom devices in materials where the nanostructure and ion detectors can be integrated, including donor-based qubits in Si and color centers in diamond.
Deterministic quantum splitter based on time-reversed Hong-Ou-Mandel interference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jun; Lee, Kim Fook; Kumar, Prem
2007-09-15
By utilizing a fiber-based indistinguishable photon-pair source in the 1.55 μm telecommunications band [J. Chen et al., Opt. Lett. 31, 2798 (2006)], we present the first, to the best of our knowledge, deterministic quantum splitter based on the principle of time-reversed Hong-Ou-Mandel quantum interference. The deterministically separated identical photons' indistinguishability is then verified by using a conventional Hong-Ou-Mandel quantum interference, which exhibits a near-unity dip visibility of 94±1%, making this quantum splitter useful for various quantum information processing applications.
Yet one more dwell time algorithm
NASA Astrophysics Data System (ADS)
Haberl, Alexander; Rascher, Rolf
2017-06-01
The current demand for ever more powerful and efficient microprocessors, e.g. for deep learning, has led to an ongoing trend of reducing the feature size of integrated circuits. These processors are patterned with EUV lithography, which enables 7 nm chips [1]. Producing mirrors that satisfy the required specifications is a challenging task. Not only increasing requirements on the imaging properties, but also new lens shapes, such as aspheres or lenses with free-form surfaces, require innovative production processes. These lenses need the deterministic sub-aperture polishing methods that have been established in the past few years. Such polishing methods are characterized by an empirically determined tool influence function (TIF) and local stock removal. One such deterministic polishing method is ion beam figuring (IBF). The beam profile of an ion beam is adjusted to a nearly ideal Gaussian shape by various parameters. With the known removal function, a dwell-time profile can be generated for each measured error profile. Such a profile is always generated pixel-accurately with respect to the predetermined error profile, always with the aim of minimizing the existing surface structures up to the cut-off frequency of the tool used [2]. The processing success of a correction-polishing run depends decisively on the accuracy of the previously computed dwell-time profile, so the algorithm used to calculate the dwell time has to reflect reality accurately. Furthermore, the machine operator should have no influence on the dwell-time calculation; consequently, there must be no parameters that affect the calculation result. Lastly, it should take a minimum of machining time to reach a minimum of remaining error structures. Unfortunately, current dwell-time algorithms are divergent, user-dependent, tend to produce long processing times, and need several parameters to be set. This paper describes a realistic, convergent and user-independent dwell-time algorithm. Typical processing times are reduced to about 80% down to 50% of those of conventional algorithms (Lucy-Richardson, Van-Cittert …) as used in established machines. To verify its effectiveness, a plane surface was machined on an IBF.
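To make the dwell-time computation concrete, here is a minimal sketch, not the paper's algorithm: stock removal is modeled as the convolution of a Gaussian TIF with the dwell-time map, so a dwell profile can be estimated by iterative deconvolution of the measured error profile. All function names and parameter values below are illustrative assumptions.

    import numpy as np

    def gaussian_tif(x, peak_rate=1.0, sigma=1.0):
        # Idealized Gaussian removal-rate profile of the ion beam (arbitrary units).
        return peak_rate * np.exp(-0.5 * (x / sigma) ** 2)

    def dwell_time(error, tif, n_iter=200, relax=0.5):
        # Van Cittert-style iteration, clipped so dwell times stay non-negative:
        # t <- max(0, t + relax * (error - conv(tif, t))).
        t = np.zeros_like(error)
        for _ in range(n_iter):
            removal = np.convolve(t, tif, mode="same")
            t = np.clip(t + relax * (error - removal), 0.0, None)
        return t

    x = np.linspace(-5, 5, 201)
    tif = gaussian_tif(x)
    tif /= tif.sum()                               # normalize the kernel
    error = 1.0 + 0.3 * np.sin(2 * np.pi * x / 5)  # synthetic surface error map
    t = dwell_time(error, tif)
    residual = error - np.convolve(t, tif, mode="same")
    print("rms residual:", np.sqrt(np.mean(residual ** 2)))

A convergent, parameter-free scheme in the paper's sense would remove the need to hand-tune n_iter and relax; they are kept here only to keep the sketch short.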
NASA Astrophysics Data System (ADS)
Preston, L. A.
2017-12-01
Marine hydrokinetic (MHK) devices offer a clean, renewable alternative energy source for the future. Responsible utilization of MHK devices, however, requires that the effects of acoustic noise produced by these devices on marine life and marine-related human activities be well understood. Paracousti is a 3-D full waveform acoustic modeling suite that can accurately propagate MHK noise signals in the complex bathymetry found in the near-shore to open ocean environment and considers real properties of the seabed, water column, and air-surface interface. However, this is a deterministic simulation that assumes the environment and source are exactly known. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected noise levels within the marine environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. One method is to use Monte Carlo (MC) techniques where simulation results from a large number of deterministic solutions are aggregated to provide statistical properties of the output signal. However, MC methods can be computationally prohibitive since they can require tens of thousands or more simulations to build up an accurate representation of those statistical properties. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a small fraction of the computational cost of MC. We are developing a SPDE solver for the 3-D acoustic wave propagation problem called Paracousti-UQ to help regulators and operators assess the statistical properties of environmental noise produced by MHK devices. In this presentation, we present the SPDE method and compare statistical distributions of simulated acoustic signals in simple models to MC simulations to show the accuracy and efficiency of the SPDE method. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc. for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
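As a toy illustration of the Monte Carlo baseline described above, the sketch below draws an uncertain environmental parameter many times, runs a cheap deterministic forward model for each draw, and aggregates the outputs into statistics. The forward model (spherical spreading plus linear absorption) and all numbers are illustrative stand-ins, not Paracousti's physics or interface.

    import numpy as np

    rng = np.random.default_rng(0)

    def transmission_loss(r, alpha):
        # Toy deterministic forward model: spherical spreading + absorption (dB).
        return 20 * np.log10(r) + alpha * r

    r = np.linspace(10, 1000, 100)          # ranges in meters
    alphas = rng.normal(1e-3, 2e-4, 5000)   # uncertain absorption, dB/m
    tl = np.array([transmission_loss(r, a) for a in alphas])
    print("TL at 1 km: mean %.1f dB, std %.2f dB" % (tl[:, -1].mean(), tl[:, -1].std()))

An SPDE formulation aims to deliver the same mean and spread at roughly the cost of a single deterministic run.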
Måren, Inger Elisabeth; Kapfer, Jutta; Aarrestad, Per Arild; Grytnes, John-Arvid; Vandvik, Vigdis
2018-01-01
Successional dynamics in plant community assembly may result from both deterministic and stochastic ecological processes. The relative importance of different ecological processes is expected to vary over the successional sequence, between different plant functional groups, and with the disturbance levels and land-use management regimes of the successional systems. We evaluate the relative importance of stochastic and deterministic processes in bryophyte and vascular plant community assembly after fire in grazed and ungrazed anthropogenic coastal heathlands in Northern Europe. A replicated series of post-fire successions (n = 12) were initiated under grazed and ungrazed conditions, and vegetation data were recorded in permanent plots over 13 years. We used redundancy analysis (RDA) to test for deterministic successional patterns in species composition repeated across the replicate successional series and analyses of co-occurrence to evaluate to what extent species respond synchronously along the successional gradient. Change in species co-occurrences over succession indicates stochastic successional dynamics at the species level (i.e., species equivalence), whereas constancy in co-occurrence indicates deterministic dynamics (successional niche differentiation). The RDA shows high and deterministic vascular plant community compositional change, especially early in succession. Co-occurrence analyses indicate stochastic species-level dynamics in the first two years, which then give way to more deterministic replacements. Grazed and ungrazed successions are similar, but the early-stage stochasticity is higher in ungrazed areas. Bryophyte communities in ungrazed successions resemble vascular plant communities. In contrast, bryophytes in grazed successions showed consistently high stochasticity and low determinism in both community composition and species co-occurrence. In conclusion, stochastic and individualistic species responses early in succession give way to more niche-driven dynamics in later successional stages. Grazing reduces predictability in both successional trends and species-level dynamics, especially in plant functional groups that are not well adapted to disturbance. © 2017 The Authors. Ecology, published by Wiley Periodicals, Inc., on behalf of the Ecological Society of America.
Deterministic multidimensional nonuniform gap sampling.
Worley, Bradley; Powers, Robert
2015-12-01
Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
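As an illustration of the idea (an assumed reconstruction from the description above, not the paper's exact gap equation), a deterministic one-dimensional gap schedule can replace Poisson-gap's random deviates with their sinusoidal mean, yielding the same sampling-density profile with fully repeatable output:

    import numpy as np

    def deterministic_gaps(grid_size, n_samples):
        # Gaps grow along a quarter-sine, so sampling is densest at the start
        # of the grid where the NMR signal is strongest; no random deviates.
        w = 1.0 + np.sin(0.5 * np.pi * np.arange(n_samples) / n_samples)
        gaps = w * (grid_size / w.sum())   # rescale so the samples span the grid
        idx = np.floor(np.cumsum(gaps) - gaps[0]).astype(int)
        return np.unique(idx[idx < grid_size])

    print(deterministic_gaps(grid_size=128, n_samples=32))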
A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration
NASA Technical Reports Server (NTRS)
Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce
2008-01-01
Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.
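A toy sketch of the probabilistic half of this methodology, with wholly illustrative reliability numbers and a single contingency rule (a failed launch is re-flown after a fixed delay):

    import random

    def simulate_campaign(n_missions=10, p_success=0.95, delay=1.0):
        t = 0.0
        for _ in range(n_missions):
            t += 1.0                      # nominal duration: one period per mission
            while random.random() > p_success:
                t += delay                # contingency rule: re-fly after a delay
        return t

    random.seed(1)
    outcomes = [simulate_campaign() for _ in range(10000)]
    print("deterministic: 10.0, simulated mean: %.2f" % (sum(outcomes) / len(outcomes)))

Comparing the simulated distribution of outcomes against the deterministic 10-period schedule gives a simple robustness measure of the kind described.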
First Order Reliability Application and Verification Methods for Semistatic Structures
NASA Technical Reports Server (NTRS)
Verderaime, Vincent
1994-01-01
Escalating risks of aerostructures stimulated by increasing size, complexity, and cost should no longer be ignored by conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments, its stress audits are shown to be arbitrary and incomplete, and it compromises high strength materials performance. A reliability method is proposed which combines first order reliability principles with deterministic design variables and conventional test technique to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety index expression. The application is reduced to solving for a factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and with the pace of semistatic structural designs.
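For context, the classical first-order safety index for independent, normally distributed resistive stress R and applied stress S takes the standard form below; the paper's extension, which folds the accumulated uncertainty errors into this expression and solves for a compensating factor, is not reproduced here.

    \beta = \frac{\mu_R - \mu_S}{\sqrt{\sigma_R^2 + \sigma_S^2}}, \qquad \text{reliability} \approx \Phi(\beta),

where \mu and \sigma denote the means and standard deviations of the two stress distributions and \Phi is the standard normal cumulative distribution function.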
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, David R; Bartholomew, David B; Moon, Justin
2009-09-08
An apparatus for fixing computational latency within a deterministic region on a network comprises a network interface modem, a high priority module and at least one deterministic peripheral device. The network interface modem is in communication with the network. The high priority module is in communication with the network interface modem. The at least one deterministic peripheral device is connected to the high priority module. The high priority module comprises a packet assembler/disassembler, and hardware for performing at least one operation. Also disclosed is an apparatus for executing at least one instruction on a downhole device within a deterministic region, the apparatus comprising a control device, a downhole network, and a downhole device. The control device is near the surface of a downhole tool string. The downhole network is integrated into the tool string. The downhole device is in communication with the downhole network.
Stochastic Petri Net extension of a yeast cell cycle model.
Mura, Ivan; Csikász-Nagy, Attila
2008-10-21
This paper presents the definition, solution and validation of a stochastic model of the budding yeast cell cycle, based on Stochastic Petri Nets (SPN). A specific family of SPNs is selected for building a stochastic version of a well-established deterministic model. We describe the procedure followed in defining the SPN model from the deterministic ODE model, a procedure that can be largely automated. The validation of the SPN model is conducted with respect to both the results provided by the deterministic one and the experimental results available from the literature. The SPN model captures the behavior of wild-type budding yeast cells and a variety of mutants. We show that the stochastic model matches some characteristics of budding yeast cells that cannot be found with the deterministic model. The SPN model fine-tunes the simulation results, enriching the breadth and the quality of its outcome.
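To illustrate the stochastic-versus-deterministic comparison at the heart of this approach, the sketch below runs a Gillespie-style simulation of a single birth-death species, far simpler than the yeast-cell-cycle SPN but with the same semantics of exponentially timed transition firings; the trajectory fluctuates about the deterministic ODE steady state k/g. All rate constants are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    k, g = 10.0, 0.1                     # illustrative production/degradation rates
    x, t, t_end = 0, 0.0, 200.0
    while t < t_end:
        rates = np.array([k, g * x])     # propensities of the two transitions
        total = rates.sum()
        t += rng.exponential(1.0 / total)            # exponentially timed firing
        x += 1 if rng.random() < rates[0] / total else -1
    print("stochastic end state:", x, "deterministic steady state:", k / g)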
Effect of sample volume on metastable zone width and induction time
NASA Astrophysics Data System (ADS)
Kubota, Noriaki
2012-04-01
The metastable zone width (MSZW) and the induction time, measured for a large sample (say, >0.1 L), are reproducible and deterministic, while for a small sample (say, <1 mL) these values are irreproducible and stochastic. Such behaviors of MSZW and induction time were discussed theoretically with both stochastic and deterministic models. Equations for the distribution of stochastic MSZW and induction time were derived. The average values of stochastic MSZW and induction time both decreased with an increase in sample volume, while the deterministic MSZW and induction time remained unchanged. Such different behaviors with variation in sample volume were explained in terms of the detection sensitivity of crystallization events. The average values of MSZW and induction time in the stochastic model were compared with the deterministic MSZW and induction time, respectively. Literature data reported for aqueous paracetamol solution were explained theoretically with the presented models.
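For orientation, a standard single-nucleation formulation consistent with this picture (an assumption for illustration, not necessarily the paper's exact model): with a stationary nucleation rate J per unit volume and sample volume V, the probability that no nucleus has formed by time t, the induction-time density, and its mean are

    P(t) = e^{-JVt}, \qquad f(t) = JV\,e^{-JVt}, \qquad \langle t_{\mathrm{ind}} \rangle = \frac{1}{JV},

so the average stochastic induction time shrinks as the sample volume grows, matching the volume dependence described above.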
Blocksome, Michael A.; Mamidala, Amith R.
2015-07-07
Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.
Blocksome, Michael A.; Mamidala, Amith R.
2015-07-14
Fencing direct memory access (`DMA`) data transfers in a parallel active messaging interface (`PAMI`) of a parallel computer, the PAMI including data communications endpoints, each endpoint including specifications of a client, a context, and a task, the endpoints coupled for data communications through the PAMI and through DMA controllers operatively coupled to a deterministic data communications network through which the DMA controllers deliver data communications deterministically, including initiating execution through the PAMI of an ordered sequence of active DMA instructions for DMA data transfers between two endpoints, effecting deterministic DMA data transfers through a DMA controller and the deterministic data communications network; and executing through the PAMI, with no FENCE accounting for DMA data transfers, an active FENCE instruction, the FENCE instruction completing execution only after completion of all DMA instructions initiated prior to execution of the FENCE instruction for DMA data transfers between the two endpoints.
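To make the ordering guarantee above concrete, here is a minimal sketch with hypothetical types and method names, not the PAMI API: on a deterministic network, transfers arrive in the order enqueued, so a fence marker needs no per-transfer accounting; it simply completes once everything queued ahead of it has drained.

    from collections import deque

    class Endpoint:
        # Toy stand-in for an endpoint; method names are hypothetical.
        def __init__(self):
            self.queue = deque()          # deterministic network: FIFO delivery
        def put_dma(self, payload):
            self.queue.append(("DMA", payload))
        def fence(self):
            self.queue.append(("FENCE", None))
        def drain(self):
            while self.queue:
                kind, payload = self.queue.popleft()
                if kind == "FENCE":       # everything enqueued earlier has drained
                    print("fence complete")

    ep = Endpoint()
    ep.put_dma(b"block-1")
    ep.put_dma(b"block-2")
    ep.fence()                            # completes only after both transfers
    ep.drain()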
Global Oil: Relax, the End Is Not Near
NASA Astrophysics Data System (ADS)
Fisher, W. L.
2004-12-01
Global oil production will peak within the next 25 to 30 years, but peaking will be a function of demand, not supply, as the methane economy comes into full play. Analysts who predict near-term peaking of global oil production generally use some variant of Hubbert's symmetrical life cycle method. The amount of ultimately recoverable oil is assumed to be known and peaking will occur when half that amount is exhausted. In reality, ultimate recovery volumes are not known, but estimated, and estimates vary by a factor of two; production profiles are not necessarily symmetrical. Further assumptions are that the resource base is inelastic, not significantly expandable through technology and new concepts. The historical experience is quite to the contrary. Projections of near-term peaking ignore or discount field reserve growth, the most dynamic element in reserve additions of the past 25 years and one with the future potential equal to that of new field discovery. Future global demand for oil will likely amount to between 1.5 to 2.0 trillion barrels, well within the more realistic estimates of global recovery volumes. The real challenge is not oil, but natural gas where future global demand will likely exceed 25,000 trillion cubic feet. But it too will be met in sufficient volumes to bring us into the hydrogen economy some 50 to 60 years from now.
Realistic Simulation for Body Area and Body-To-Body Networks
Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D’Errico, Raffaele
2016-01-01
In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices. PMID:27104537
Realistic Simulation for Body Area and Body-To-Body Networks.
Alam, Muhammad Mahtab; Ben Hamida, Elyes; Ben Arbia, Dhafer; Maman, Mickael; Mani, Francesco; Denis, Benoit; D'Errico, Raffaele
2016-04-20
In this paper, we present an accurate and realistic simulation for body area networks (BAN) and body-to-body networks (BBN) using deterministic and semi-deterministic approaches. First, in the semi-deterministic approach, a real-time measurement campaign is performed, which is further characterized through statistical analysis. It is able to generate link-correlated and time-varying realistic traces (i.e., with consistent mobility patterns) for on-body and body-to-body shadowing and fading, including body orientations and rotations, by means of stochastic channel models. The full deterministic approach is particularly targeted to enhance IEEE 802.15.6 proposed channel models by introducing space and time variations (i.e., dynamic distances) through biomechanical modeling. In addition, it helps to accurately model the radio link by identifying the link types and corresponding path loss factors for line of sight (LOS) and non-line of sight (NLOS). This approach is particularly important for links that vary over time due to mobility. It is also important to add that the communication and protocol stack, including the physical (PHY), medium access control (MAC) and networking models, is developed for BAN and BBN, and the IEEE 802.15.6 compliance standard is provided as a benchmark for future research works of the community. Finally, the two approaches are compared in terms of the successful packet delivery ratio, packet delay and energy efficiency. The results show that the semi-deterministic approach is the best option; however, for the diversity of the mobility patterns and scenarios applicable, biomechanical modeling and the deterministic approach are better choices.
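As a toy illustration of the link-type-dependent path-loss modeling mentioned above, a log-distance law whose exponent switches between LOS and NLOS, plus a log-normal shadowing term; the exponents and shadowing variance are illustrative assumptions, not the paper's IEEE 802.15.6 calibration:

    import numpy as np

    rng = np.random.default_rng(3)

    def path_loss_db(d, los, pl0=40.0, d0=0.1):
        n = 3.0 if los else 4.5          # assumed LOS / NLOS exponents
        shadow = rng.normal(0.0, 4.0)    # log-normal shadowing term (dB)
        return pl0 + 10 * n * np.log10(d / d0) + shadow

    print("LOS 1 m: %.1f dB" % path_loss_db(1.0, True))
    print("NLOS 1 m: %.1f dB" % path_loss_db(1.0, False))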
Berga, Mercè; Östman, Örjan; Lindström, Eva S; Langenheder, Silke
2015-07-01
Effects of dispersal and the presence of predators on diversity, assembly and functioning of bacterial communities are well studied in isolation. In reality, however, dispersal and trophic interactions act simultaneously and can therefore have combined effects, which are poorly investigated. We performed an experiment with aquatic metacommunities consisting of three environmentally different patches and manipulated dispersal rates among them as well as the presence or absence of the keystone species Daphnia magna. Daphnia magna reduced both local and regional diversity, whereas dispersal increased local diversity but decreased beta-diversity having no net effect on regional diversity. Dispersal modified the assembly mechanisms of bacterial communities by increasing the degree of determinism. Additionally, the combination of the D. magna and dispersal increased the importance of deterministic processes, presumably because predator-tolerant taxa were spread in the metacommunity via dispersal. Moreover, the presence of D. magna affected community composition, increased community respiration rates but did not affect bacterial production or abundance, whereas dispersal slightly increased bacterial production. In conclusion, our study suggests that predation by a keystone species such as D. magna and dispersal additively influence bacterial diversity, assembly processes and ecosystem functioning. © 2014 Society for Applied Microbiology and John Wiley & Sons Ltd.
Kaeuffer, Renaud; Peichel, Catherine L.; Bolnick, Daniel I.; Hendry, Andrew P.
2015-01-01
Convergent (or parallel) evolution provides strong evidence for a deterministic role of natural selection: similar phenotypes evolve when independent populations colonize similar environments. In reality, however, independent populations in similar environments always show some differences: some non-convergent evolution is present. It is therefore important to explicitly quantify the convergent and non-convergent aspects of trait variation, and to investigate the ecological and genetic explanations for each. We performed such an analysis for threespine stickleback (Gasterosteus aculeatus) populations inhabiting lake and stream habitats in independent watersheds. Morphological traits differed in the degree to which lake-stream divergence was convergent across watersheds. Some aspects of this variation were correlated with ecological variables related to diet, presumably reflecting the strength and specifics of divergent selection. Furthermore, a genetic scan revealed some markers that diverged between lakes and streams in many of the watersheds and some that diverged in only a few watersheds. Moreover, some of the lake-stream divergence in genetic markers was associated within some of the lake-stream divergence in morphological traits. Our results suggest that convergent evolution, and deviations from it, are primarily the result of natural selection, which corresponds in only some respect to the dichotomous habitat classifications frequently used in such studies. PMID:22276537
Adamson, M W; Morozov, A Y; Kuzenkov, O A
2016-09-01
Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty with regard to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project infinite-dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method to carry out uncertainty analysis of biological models.
Natural selection and self-organization in complex adaptive systems.
Di Bernardo, Mirko
2010-01-01
The central theme of this work is self-organization, "interpreted" both from the point of view of theoretical biology and from a philosophical point of view. By analysing, on the one hand, what are now considered (not only in the field of physics) some of the most important discoveries, namely complex systems and deterministic chaos, and, on the other hand, the new frontiers of systemic biology, this work highlights how large open thermodynamic systems can spontaneously stay in an orderly regime. Such systems can represent the natural source of the order required for stable self-organization, for homoeostasis and for hereditary variations. The order emerging in enormous, randomly interconnected nets of binary variables is almost certainly only the precursor of similar orders emerging in all the varieties of complex systems. Hence, this work, by finding new foundations for the order pervading the living world, advances the daring hypothesis according to which Darwinian natural selection is not the only source of order in the biosphere. Thus, by examining the passage from Prigogine's dissipative structures theory to the contemporary theory of biological complexity, the article highlights the development of a coherent and continuous line of research which seeks to identify the general principles marking the profound reality of that mysterious self-organization characterizing the complexity of life.
A European model and case studies for aggregate exposure assessment of pesticides.
Kennedy, Marc C; Glass, C Richard; Bokkers, Bas; Hart, Andy D M; Hamey, Paul Y; Kruisselbrink, Johannes W; de Boer, Waldo J; van der Voet, Hilko; Garthwaite, David G; van Klaveren, Jacob D
2015-05-01
Exposures to plant protection products (PPPs) are assessed using risk analysis methods to protect public health. Traditionally, single sources, such as food or individual occupational sources, have been addressed. In reality, individuals can be exposed simultaneously to multiple sources. Improved regulation therefore requires the development of new tools for estimating the population distribution of exposures aggregated within an individual. A new aggregate model is described, which allows individual users to include as much, or as little, information as is available or relevant for their particular scenario. Depending on the inputs provided by the user, the outputs can range from simple deterministic values through to probabilistic analyses including characterisations of variability and uncertainty. Exposures can be calculated for multiple compounds, routes and sources of exposure. The aggregate model links to the cumulative dietary exposure model developed in parallel and is implemented in the web-based software tool MCRA. Case studies are presented to illustrate the potential of this model, with inputs drawn from existing European data sources and models. These cover exposures to UK arable spray operators, Italian vineyard spray operators, Netherlands users of a consumer spray and UK bystanders/residents. The model could also be adapted to handle non-PPP compounds. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.
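A minimal sketch of what aggregation across routes means in practice, using assumed log-normal route exposures for one compound; the distributions, units and structure are illustrative, not MCRA's data model:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 10000                                  # simulated individuals
    dietary = rng.lognormal(-3.0, 0.5, n)      # assumed route exposures, mg/kg bw/day
    dermal = rng.lognormal(-4.0, 0.8, n)
    inhalation = rng.lognormal(-5.0, 0.6, n)
    aggregate = dietary + dermal + inhalation  # same compound, summed across routes
    print("median %.4f, p95 %.4f mg/kg bw/day"
          % (np.median(aggregate), np.percentile(aggregate, 95)))

Replacing each distribution with a point value recovers the simple deterministic output; keeping the distributions yields the probabilistic analysis with characterised variability.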
The Solar System large planets' influence on a new Maunder Minimum
NASA Astrophysics Data System (ADS)
Yndestad, Harald; Solheim, Jan-Erik
2016-04-01
In the 1890s, G. Spörer and E. W. Maunder (1890) reported that solar activity stopped in a period of 70 years from 1645 to 1715. Later, a reconstruction of solar activity confirmed the grand minima Maunder (1640-1720), Spörer (1390-1550), Wolf (1270-1340), and the minima Oort (1010-1070) and Dalton (1785-1810) since the year 1000 A.D. (Usoskin et al. 2007). These minimum periods have been associated with less irradiation from the Sun and cold climate periods on Earth. The identification of three grand Maunder-type periods and two Dalton-type periods over a thousand years indicates that sooner or later there will be a colder climate on Earth from a new Maunder- or Dalton-type period. The cause of these minimum periods is not well understood. If the solar variability has a deterministic element, we can better estimate a new Maunder grand minimum; purely random solar variability can only explain the past. This investigation is based on the simple idea that if the solar variability has a deterministic property, it must have a deterministic source as a first cause, and if this deterministic source is known, we can better estimate the next expected Maunder grand minimum period. The study is based on a TSI ACRIM data series from 1700, a TSI ACRIM data series from 1000 A.D., a sunspot data series from 1611, and a solar barycenter orbit data series from 1000. The analysis method is based on wavelet spectrum analysis to identify stationary periods, coincidence periods and their phase relations. The result shows that the TSI variability and the sunspot variability have deterministic oscillations, controlled by the large planets Jupiter, Uranus and Neptune as the first cause. A deterministic model of TSI variability and sunspot variability confirms the known minimum and grand minimum periods since 1000. From this deterministic model we may expect a new Maunder-type sunspot minimum period from about 2018 to 2055. The deterministic model of the TSI ACRIM data series from 1700 computes a new Maunder-type grand minimum period from 2015 to 2071, and the model of the longer TSI ACRIM data series from 1000 computes a new Dalton-to-Maunder-type minimum irradiation period from 2047 to 2068.
Deterministic Computer-Controlled Polishing Process for High-Energy X-Ray Optics
NASA Technical Reports Server (NTRS)
Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
A deterministic computer-controlled polishing process for large X-ray mirror mandrels is presented. Using the tool's influence function and the material removal rate extracted from polishing experiments, design considerations for polishing laps and optimized operating parameters are discussed.
Deterministic and efficient quantum cryptography based on Bell's theorem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Zengbing; Pan Jianwei; Physikalisches Institut, Universitaet Heidelberg, Philosophenweg 12, 69120 Heidelberg
2006-05-15
We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows a higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation under the current technology.
Heart rate variability as determinism with jump stochastic parameters.
Zheng, Jiongxuan; Skufca, Joseph D; Bollt, Erik M
2013-08-01
We use measured heart rate information (RR intervals) to develop a one-dimensional nonlinear map that describes short-term deterministic behavior in the data. Our study suggests that there is a stochastic parameter with persistence which causes the heart rate and rhythm system to wander about a bifurcation point. We propose a modified circle map with a jump-process noise term as a model which can qualitatively capture this behavior of low-dimensional transient determinism with occasional (stochastically defined) jumps from one deterministic system to another within a one-parameter family of deterministic systems.
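A minimal sketch in the spirit of the model described (parameter values are illustrative assumptions, not fitted RR-interval data): between rare stochastic jumps the dynamics follow a deterministic circle map, and each jump moves the system to another member of the one-parameter family.

    import numpy as np

    rng = np.random.default_rng(4)
    K, omega = 1.0, 0.30                 # coupling strength and bare frequency
    x, traj = 0.2, []
    for _ in range(2000):
        if rng.random() < 0.005:         # rare jump of the stochastic parameter
            omega += rng.normal(0.0, 0.02)
        x = (x + omega - (K / (2 * np.pi)) * np.sin(2 * np.pi * x)) % 1.0
        traj.append(x)
    print("last phases:", np.round(traj[-5:], 3))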
NASA Astrophysics Data System (ADS)
Kim, Hojin; Choi, In Ho; Lee, Sanghyun; Won, Dong-Joon; Oh, Yong Suk; Kwon, Donghoon; Sung, Hyung Jin; Jeon, Sangmin; Kim, Joonwon
2017-04-01
This paper presents a deterministic bead-in-droplet ejection (BIDE) technique that regulates the precise distribution of microbeads in an ejected droplet. The deterministic BIDE was realized through the effective integration of a microfluidic single-particle handling technique with a liquid dispensing system. The integrated bead dispenser facilitates the transfer of the desired number of beads into a dispensing volume and the on-demand ejection of bead-encapsulated droplets. Single bead-encapsulated droplets were ejected every 3 s without any failure. Multiple-bead dispensing with deterministic control of the number of beads was demonstrated to emphasize the originality and quality of the proposed dispensing technique. The dispenser was mounted using a plug-socket type connection, and the dispensing process was completely automated using a programmed sequence without any microscopic observation. To demonstrate a potential application of the technique, a bead-based streptavidin-biotin binding assay in an evaporating droplet was conducted using ultralow numbers of beads. The results evidenced that the number of beads in the droplet crucially influences the reliability of the assay. Therefore, the proposed deterministic bead-in-droplet technology can be utilized to deliver desired beads onto a reaction site, particularly to reliably and efficiently enrich and detect target biomolecules.
Mi, Xiangcheng; Swenson, Nathan G; Jia, Qi; Rao, Mide; Feng, Gang; Ren, Haibao; Bebber, Daniel P; Ma, Keping
2016-09-07
Deterministic and stochastic processes jointly determine the community dynamics of forest succession. However, it has been widely held in previous studies that deterministic processes dominate forest succession. Furthermore, inference of mechanisms for community assembly may be misleading if based on a single axis of diversity alone. In this study, we evaluated the relative roles of deterministic and stochastic processes along a disturbance gradient by integrating species, functional, and phylogenetic beta diversity in a subtropical forest chronosequence in Southeastern China. We found a general pattern of increasing species turnover, but little-to-no change in phylogenetic and functional turnover over succession at two spatial scales. Meanwhile, the phylogenetic and functional beta diversity were not significantly different from random expectation. This result suggested a dominance of stochastic assembly, contrary to the general expectation that deterministic processes dominate forest succession. On the other hand, we found significant interactions of environment and disturbance and limited evidence for significant deviations of phylogenetic or functional turnover from random expectations for different size classes. This result provided weak evidence of deterministic processes over succession. Stochastic assembly of forest succession suggests that post-disturbance restoration may be largely unpredictable and difficult to control in subtropical forests.
Weinberg, Seth H.; Smith, Gregory D.
2012-01-01
Cardiac myocyte calcium signaling is often modeled using deterministic ordinary differential equations (ODEs) and mass-action kinetics. However, spatially restricted “domains” associated with calcium influx are small enough (e.g., 10^-17 liters) that local signaling may involve 1–100 calcium ions. Is it appropriate to model the dynamics of subspace calcium using deterministic ODEs or, alternatively, do we require stochastic descriptions that account for the fundamentally discrete nature of these local calcium signals? To address this question, we constructed a minimal Markov model of a calcium-regulated calcium channel and associated subspace. We compared the expected value of fluctuating subspace calcium concentration (a result that accounts for the small subspace volume) with the corresponding deterministic model (an approximation that assumes large system size). When subspace calcium did not regulate calcium influx, the deterministic and stochastic descriptions agreed. However, when calcium binding altered channel activity in the model, the continuous deterministic description often deviated significantly from the discrete stochastic model, unless the subspace volume is unrealistically large and/or the kinetics of the calcium binding are sufficiently fast. This principle was also demonstrated using a physiologically realistic model of calmodulin regulation of L-type calcium channels introduced by Yue and coworkers. PMID:23509597
Kim, Hojin; Choi, In Ho; Lee, Sanghyun; Won, Dong-Joon; Oh, Yong Suk; Kwon, Donghoon; Sung, Hyung Jin; Jeon, Sangmin; Kim, Joonwon
2017-01-01
This paper presents a deterministic bead-in-droplet ejection (BIDE) technique that regulates the precise distribution of microbeads in an ejected droplet. The deterministic BIDE was realized through the effective integration of a microfluidic single-particle handling technique with a liquid dispensing system. The integrated bead dispenser facilitates the transfer of the desired number of beads into a dispensing volume and the on-demand ejection of bead-encapsulated droplets. Single bead–encapsulated droplets were ejected every 3 s without any failure. Multiple-bead dispensing with deterministic control of the number of beads was demonstrated to emphasize the originality and quality of the proposed dispensing technique. The dispenser was mounted using a plug-socket type connection, and the dispensing process was completely automated using a programmed sequence without any microscopic observation. To demonstrate a potential application of the technique, a bead-based streptavidin–biotin binding assay in an evaporating droplet was conducted using ultralow numbers of beads. The results evidenced that the number of beads in the droplet crucially influences the reliability of the assay. Therefore, the proposed deterministic bead-in-droplet technology can be utilized to deliver desired beads onto a reaction site, particularly to reliably and efficiently enrich and detect target biomolecules. PMID:28393911
NASA Astrophysics Data System (ADS)
Mukherjee, L.; Zhai, P.; Hu, Y.; Winker, D. M.
2016-12-01
Among the primary factors which determine the polarized radiation field of a turbid medium are the single scattering properties of the medium. When multiple types of scatterers are present, their single scattering properties need to be properly mixed in order to find solutions to the vector radiative transfer (VRT) theory. The VRT solvers can be divided into two types: deterministic and stochastic. A deterministic solver can only accept one set of single scattering properties in its smallest discretized spatial volume; when the medium contains more than one kind of scatterer, their single scattering properties are averaged and then used as input for the deterministic solver. A stochastic solver can work with different kinds of scatterers explicitly. In this work, two different mixing schemes are studied using the Successive Order of Scattering (SOS) method and Monte Carlo (MC) methods: one scheme is used for the deterministic solver and the other for the stochastic Monte Carlo method. It is found that the solutions from the two VRT solvers using the two different mixing schemes agree with each other extremely well. This confirms the equivalence of the two mixing schemes and also provides a benchmark for the VRT solution for the medium studied.
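A minimal sketch of the averaging scheme used for the deterministic solver, with illustrative numbers: the bulk extinction coefficient adds, while the bulk single-scattering albedo and phase function are scattering-coefficient-weighted averages of the species.

    import numpy as np

    def mix(ext, ssa, phase):
        # ext, ssa: per-species extinction and single-scattering albedo;
        # phase: one row of coarse phase-function samples per species.
        scat = ext * ssa                                  # scattering coefficients
        ext_mix = ext.sum()
        ssa_mix = scat.sum() / ext_mix
        phase_mix = (scat[:, None] * phase).sum(0) / scat.sum()
        return ext_mix, ssa_mix, phase_mix

    ext = np.array([0.10, 0.05])                          # two scatterer types
    ssa = np.array([0.95, 0.80])
    phase = np.array([[1.2, 1.0, 0.8],
                      [1.5, 1.0, 0.5]])
    print(mix(ext, ssa, phase))

A stochastic Monte Carlo solver can instead keep the species separate and pick the scatterer for each collision with probability proportional to its scattering coefficient; the agreement of the two approaches is what the abstract reports.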
Surface-Modified Nanocarriers for Nose-to-Brain Delivery: From Bioadhesion to Targeting
Clementino, Adryana; Buttini, Francesca; Colombo, Gaia; Pescina, Silvia; Stanisçuaski Guterres, Silvia; Nicoli, Sara
2018-01-01
In the field of nasal drug delivery, nose-to-brain delivery is among the most fascinating applications, directly targeting the central nervous system, bypassing the blood brain barrier. Its benefits include dose lowering and direct brain distribution of potent drugs, ultimately reducing systemic side effects. Recently, nasal administration of insulin showed promising results in clinical trials for the treatment of Alzheimer’s disease. Nanomedicines could further contribute to making nose-to-brain delivery a reality. While not disregarding the need for devices enabling a formulation deposition in the nose’s upper part, surface modification of nanomedicines appears the key strategy to optimize drug delivery from the nasal cavity to the brain. In this review, nanomedicine delivery based on particle engineering exploiting surface electrostatic charges, mucoadhesive polymers, or chemical moieties targeting the nasal epithelium will be discussed and critically evaluated in relation to nose-to-brain delivery. PMID:29543755
'Am I being over-sensitive?' Women's experience of sexual harassment during medical training.
Hinze, Susan W
2004-01-01
Despite larger numbers of women in medicine and strong statements against gender discrimination in written policies and the medical literature, sexual harassment persists in medical training. This study examines the everyday lives of women and men resident physicians to understand the context within which harassment unfolds. The narratives explored here reveal how attention is deflected from the problem of sexual harassment through a focus on women's 'sensitivity'. Women resist by refusing to name sexual harassment as problematic, and by defining sexual harassment as 'small stuff' in the context of a rigorous training program. Ultimately, both tactics of resistance fail. Closer examination of the relations shaping everyday actions is key, as is viewing the rigid hierarchy of authority and power in medical training through a gender lens. I conclude with a discussion of how reforms in medical education must tend to the gendered, everyday realities of women and men in training.
Humanlike robots: the upcoming revolution in robotics
NASA Astrophysics Data System (ADS)
Bar-Cohen, Yoseph
2009-08-01
Humans have always sought to imitate the human appearance, functions and intelligence. Human-like robots, which for many years have been science fiction, are increasingly becoming an engineering reality resulting from the many advances in biologically inspired technologies. These biomimetic technologies include artificial intelligence, artificial vision and hearing, as well as artificial muscles, also known as electroactive polymers (EAP). Robots that don't have human shape, such as the robotic vacuum cleaner Roomba and the robotic lawnmower, are already finding growing use in homes worldwide. As opposed to other human-made machines and devices, this technology also raises various questions and concerns, and these need to be addressed as the technology advances. They include the need to prevent accidents, deliberate harm, or use in crime. In this paper, the state of the art of the ultimate goal of biomimetics, the development of humanlike robots, and the associated potentials and challenges are reviewed.
The Tie That Binds:. A Fundamental Unit of `Change' in Space and Time
NASA Astrophysics Data System (ADS)
Beichler, James E.
2013-09-01
Why, despite all efforts to the contrary, have attempts at unification based on the supposedly more fundamental quantum theory failed miserably? The truth is that the essential idea or concept of the quantum itself has never been fully understood. What is the quantum, or rather, what is its ultimate nature? Science may be able to work adequately with the quantum; in a sense science is quite articulate in the language of the quantum, i.e., the mathematical interpretation of quantum mechanics, but science has no idea of the true physical nature of the quantum. Scientists and philosophers have wasted energy and effort on irrelevant issues such as the debate over determinism and indeterminism instead of carefully analyzing the physical source of the quantum. Only with a true understanding of the physical nature of the quantum will the unification of the quantum and relativity ever become a reality.
Ingham, Karen
2010-01-01
The correspondences and disparities between how artists and anatomists view the body have historically been a source of creative collaboration, but how is this imaginative interdisciplinarity sustained and expressed in a contemporary context? In this review I suggest that contemporary artists engaging with the body, and the corresponding biomedical and architectural spaces where the body is investigated, are engendering innovative and challenging artworks that stimulate new relationships between art and anatomy. Citing a number of examples from key artists and referencing some of my own practice-based research, I posit that creative cross-fertilization provokes a discourse between mediated public perceptions of disease, death and the disposal of morbid remains, and the contemporary reality of biomedical practice. This is a dialogue that is complex, rich and diverse, and ultimately rewarding for both art and anatomy. PMID:19929908
A theoretical framework for psychiatric nursing practice.
Onega, L L
1991-01-01
Traditionally, specific theoretical frameworks which are congruent with psychiatric nursing practice have been poorly articulated. The purpose of this paper is to identify and discuss a philosophical base, a theoretical framework, application to psychiatric nursing, and issues related to psychiatric nursing knowledge development and practice. A philosophical framework that is likely to be congruent with psychiatric nursing, which is based on the nature of human beings, health, psychiatric nursing and reality, is identified. Aaron Antonovsky's Salutogenic Model is discussed and applied to psychiatric nursing. This model provides a helpful way for psychiatric nurses to organize their thinking processes and ultimately improve the health care services that they offer to their clients. Goal setting and nursing interventions using this model are discussed. Additionally, application of the use of Antonovsky's model is made to nursing research areas such as hardiness, uncertainty, suffering, empathy and literary works. Finally, specific issues related to psychiatric nursing are addressed.
The dreaming brain/mind, consciousness and psychosis.
Limosani, Ivan; D'Agostino, Armando; Manzone, Maria Laura; Scarone, Silvio
2011-12-01
Several independent lines of research in neurobiology seem to support the phenomenologically-grounded view of the dreaming brain/mind as a useful model for psychosis. Hallucinatory phenomena and thought disorders found in psychosis share several peculiarities with dreaming, where internally generated, vivid sensorimotor imagery along with often heightened and incongruous emotion are paired with a decrease in ego functions which ultimately leads to a severe impairment in reality testing. Contemporary conceptualizations of severe mental disorders view psychosis as one psychopathological dimension that may be found across several diagnostic categories. Some experimental data have shown cognitive bizarreness to be equally elevated in dreams and in the waking cognition of acutely psychotic subjects and in patients treated with pro-dopaminergic drugs, independent of the underlying disorder. Further studies into the neurofunctional underpinnings of both conditions will help to clarify the use and validity of this model. Copyright © 2010 Elsevier Inc. All rights reserved.
Making Patient Engagement a Reality.
Pushparajah, Daphnee S
2018-02-01
Patients are increasingly recognised as the true customers of healthcare. By providing insights and perspectives, patients can help the wider healthcare community better understand their needs and ultimately enhance the value of healthcare solutions being developed. In the development of new medicines, for example, meaningful patient engagement can enable the pharmaceutical industry, healthcare providers and other stakeholders to achieve more meaningful health outcomes. While both the pharmaceutical industry and regulators have achieved some progress in incorporating patient perspectives into their activities, the lack of standardised best practices and metrics has made it challenging to achieve consistency and measure success in patient engagement. Practical guidance for patient engagement can facilitate better interactions between patients or patient groups and other collaborators, e.g. industry, regulators and other healthcare stakeholders. Accordingly, UCB has developed an internal model for Patient Group Engagement incorporating four key principles, based on shared ambition, transparency, accountability and respect, essential for effective collaborations.
Did I Say Cosmology? On Modern Cosmologies and Ancient World-views
NASA Astrophysics Data System (ADS)
Iwaniszewski, S.
2009-08-01
The modern cosmology that emerged from observational astronomy in 16th century Europe meant a radical break-away from earlier conceptions of the world. While all ancient and nonwestern worldviews usually describe a multidimensional reality in which diverse environmental, economic, sociopolitical and ideological factors intersect, modern cosmologies espouse the vision of a radically different universe which is completely dehumanized, ethically indifferent and universally valid. Despite these differences, cosmology and worldview tend to be used interchangeably to depict ancient and nonwestern worldviews. Any correspondences which can be found between different parts of ancient and/or nonwestern worldviews and modern cosmologies tend to transfer modern conceptions to the premodern world. Ignoring ancient cultural contexts, we risk imposing modern cosmological concepts on past worldview categories. While we have to describe ancient astronomies in our own terms, our ultimate goal is to understand them on their own terms.
Somatic cell cloning: the ultimate form of nuclear reprogramming?
Piedrahita, Jorge A; Mir, Bashir; Dindot, Scott; Walker, Shawn
2004-05-01
With the increasing difficulties associated with meeting the required needs for organs used in transplantation, alternative approaches need to be considered. These include the use of stem cells as potential sources of specialized cells, the ability to transdifferentiate cell types in culture, and the development of complete organs that can be used in humans. All of the above goals will require a complete understanding of the factors affecting cell differentiation and nuclear reprogramming. To make this a reality, however, techniques associated with cloning and genetic modification in somatic cells need to continue to be developed and optimized. This includes not only an enhancement of the rate of homologous recombination in somatic cells, but also a thorough understanding of the nuclear reprogramming process taking place during nuclear transfer. The understanding of this process is likely to have an effect beyond the area of nuclear transfer and assist with better methods for transdifferentiation of mammalian cells.
Humanlike Robots - The Upcoming Revolution in Robotics
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2009-01-01
Humans have always sought to imitate the human appearance, functions and intelligence. Human-like robots, which for many years were the stuff of science fiction, are increasingly becoming an engineering reality resulting from the many advances in biologically inspired technologies. These biomimetic technologies include artificial intelligence, artificial vision and hearing as well as artificial muscles, also known as electroactive polymers (EAP). Robots that do not have human shape, such as the Roomba vacuum cleaner and the robotic lawnmower, are already finding growing use in homes worldwide. As opposed to other human-made machines and devices, this technology also raises various questions and concerns that need to be addressed as the technology advances. These include the need to prevent accidents, deliberate harm, or their use in crime. In this paper, the state of the art of the ultimate goal of biomimetics, the development of humanlike robots, is reviewed, along with the potentials and the challenges.
Evolution of the antipsychiatry movement into mental health consumerism.
Rissmiller, David J; Rissmiller, Joshua H
2006-06-01
This essay reviews the history and evolution of the antipsychiatry movement. Radical antipsychiatry over several decades has changed from an antiestablishment campus-based movement to a patient-based consumerist movement. The antecedents of the movement are traced to a crisis in self-conception between biological and psychoanalytic psychiatry occurring during a decade characterized by other radical movements. It was promoted through the efforts of its four seminal thinkers: Michel Foucault in France, R. D. Laing in Great Britain, Thomas Szasz in the United States, and Franco Basaglia in Italy. They championed the concept that personal reality and freedom were independent of any definition of normalcy that organized psychiatry tried to impose. The original antipsychiatry movement made major contributions but also had significant weaknesses that ultimately undermined it. Today, antipsychiatry adherents have a broader base and no longer focus on dismantling organized psychiatry but look to promote radical consumerist reform.
Measuring the qualities of nurses: development and testing of the Qualities of Nurses Scale.
Johnson, Maree; Cowin, Leanne
2013-01-01
This paper reports on the creation, development and testing of a new instrument to measure qualities of nurses, known as the Qualities of Nurses (QON) scale, applicable to student nurses. High attrition rates within nursing programs and during early postgraduate years are an international phenomenon. Mismatches between idealized perceptions of nursing and the realities of education and clinical experiences have been identified as contributing factors. A survey method was used to elicit responses to scale items from 678 first-year nursing students at a large university. A one-factor 12-item solution explaining 47 percent of variance in the construct was demonstrated. The QON can assist in the initial assessment and ongoing monitoring of changes in students' perceptions of nurses. Using the QON, researchers and educators can identify initial student nurses' perceptions and any changes associated with educational or other events that ultimately could be manipulated to reduce attrition.
Long-range PV R&D and the electric utilities
NASA Astrophysics Data System (ADS)
Peterson, Terry M.
1997-04-01
In the short term, photovoltaics will probably continue to enjoy great success in niche markets and non-utility businesses, but see relatively little use within utilities. Deregulation is driving major restructuring of the electric-utility sector, causing great uncertainty among its planners and executives, and leading them to favor cost-cutting over other corporate strategies. However, the competitive motives at the root of that restructuring will ultimately induce resourceful utility executives to seek novel non-commodity energy-service businesses to sustain their companies' success in the deregulated industry of the future. In that industry, technology innovation will play a very important role. Specifically, photovoltaics will be highly valued in light of its unsurpassed modularity, extreme siting ease, very low operation and maintenance costs, and public popularity. The eventual leaders in wielding that powerful technology likely will be among those who recognize those assets earliest and strive to bring its promises to reality through innovative applications.
Tensile Fracture of Ductile Materials. M.S. Thesis
NASA Technical Reports Server (NTRS)
Pai, D. M.
1984-01-01
For brittle materials, circular voids play an important role relative to fracture, intensifying both tensile and compressive stresses. A maximum intensified tensile stress failure criterion applies quite well to brittle materials. An attempt was made to explore the possibility of extending the approach to the tensile fracture of ductile materials. The three dimensional voids that exist in reality are modelled by circular holes in sheet metal. Mathematical relationships are sought between the shape and size of the hole, after the material is plastically deformed, and the amount of deformation induced. Then, the effect of hole shape, size and orientation on the mechanical properties is considered experimentally. The presence of the voids does not affect the ultimate tensile strength of the ductile materials because plastic flow wipes out the stress intensification caused by them. However, the shape and orientation of the defect are found to play an important role in affecting the strain at fracture.
Ingham, Karen
2010-02-01
The correspondences and disparities between how artists and anatomists view the body have historically been a source of creative collaboration, but how is this imaginative interdisciplinarity sustained and expressed in a contemporary context? In this review I suggest that contemporary artists engaging with the body, and the corresponding biomedical and architectural spaces where the body is investigated, are engendering innovative and challenging artworks that stimulate new relationships between art and anatomy. Citing a number of examples from key artists and referencing some of my own practice-based research, I posit that creative cross-fertilization provokes a discourse between mediated public perceptions of disease, death and the disposal of morbid remains, and the contemporary reality of biomedical practice. This is a dialogue that is complex, rich and diverse, and ultimately rewarding for both art and anatomy.
Evolutionary perspectives on wildlife disease: concepts and applications
Vander Wal, Eric; Garant, Dany; Pelletier, Fanie
2014-01-01
Wildlife disease has the potential to cause significant ecological, socioeconomic, and health impacts. As a result, all tools available need to be employed when host–pathogen dynamics merit conservation or management interventions. Evolutionary principles, such as evolutionary history, phenotypic and genetic variation, and selection, have the potential to unravel many of the complex ecological realities of infectious disease in the wild. Despite this, their application to wildlife disease ecology and management remains in its infancy. In this article, we outline the impetus behind applying evolutionary principles to disease ecology and management issues in the wild. We then introduce articles from this special issue on Evolutionary Perspectives on Wildlife Disease: Concepts and Applications, outlining how each is exemplar of a practical wildlife disease challenge that can be enlightened by applied evolution. Ultimately, we aim to bring new insights to wildlife disease ecology and its management using tools and techniques commonly employed in evolutionary ecology. PMID:25469154
Three-dimensional computer visualization of forensic pathology data.
March, Jack; Schofield, Damian; Evison, Martin; Woodford, Noel
2004-03-01
Despite a decade of use in US courtrooms, it is only recently that forensic computer animations have become an increasingly important form of communication in legal spheres within the United Kingdom. Aims Research at the University of Nottingham has been influential in the critical investigation of forensic computer graphics reconstruction methodologies and techniques and in raising the profile of this novel form of data visualization within the United Kingdom. The case study presented demonstrates research undertaken by Aims Research and the Department of Forensic Pathology at the University of Sheffield, which aims to apply, evaluate, and develop novel 3-dimensional computer graphics (CG) visualization and virtual reality (VR) techniques in the presentation and investigation of forensic information concerning the human body. The inclusion of such visualizations within other CG or VR environments may ultimately provide the potential for alternative exploratory directions, processes, and results within forensic pathology investigations.
Kiviniemi, Marc T.; Mackenzie, Sara L. C.
2017-01-01
The rapid development of the undergraduate major in public health over the past 15 years has led to a debate about the most appropriate framing for the degree. Should it be viewed as a liberal education degree (akin to academic disciplines such as psychology and political science) or as a professional training degree (akin to disciplines such as nursing and management)? This paper presents an overview of both the liberal education and the professional training degree approaches to the undergraduate public health degree. The reality of public health work in the modern era and the constraints on undergraduate-level training lead to our conclusion that the liberal education framing is a more optimal way to design the degree program. Such a framework optimizes career opportunities, especially long-term opportunities, for graduates, acknowledges the reality of the complex and diverse career paths that one can take under the general umbrella of public health, and accounts for the important role of critical thinking skills in undergraduate education. Ultimately, the distinction between liberal education and professional training may be fuzzier than the debate often highlights—an intentional, well-designed, and thoughtfully implemented undergraduate public health curriculum can address the range of student needs underlying both the liberal education and professional training approaches to the degree, thus optimizing both learning goals and career outcomes for undergraduate public health students. PMID:28239603
Deterministic models for traffic jams
NASA Astrophysics Data System (ADS)
Nagel, Kai; Herrmann, Hans J.
1993-10-01
We study several deterministic one-dimensional traffic models. For integer positions and velocities we find the typical high and low density phases separated by a simple transition. If positions and velocities are continuous variables the model shows self-organized criticality driven by the slowest car.
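A minimal sketch in Python of a deterministic one-dimensional traffic cellular automaton in the integer-valued spirit described above; it corresponds to the deterministic limit of the Nagel-Schreckenberg rules rather than the authors' exact models, and the road length, car count and v_max are assumed for illustration.

    import numpy as np

    def step(pos, vel, road_length, v_max=5):
        """One parallel update of a deterministic 1-D traffic model (no random
        braking): each car accelerates by one unit up to v_max, but never moves
        further than the gap to the car ahead, so cars cannot collide."""
        order = np.argsort(pos)                  # keep cars sorted along the ring
        pos, vel = pos[order], vel[order]
        gaps = (np.roll(pos, -1) - pos - 1) % road_length   # empty cells ahead
        vel = np.minimum(np.minimum(vel + 1, v_max), gaps)
        return (pos + vel) % road_length, vel

    # 30 cars on a ring of 100 cells (an assumed density above the jam transition)
    rng = np.random.default_rng(0)
    pos = np.sort(rng.choice(100, size=30, replace=False))
    vel = np.zeros(30, dtype=int)
    for _ in range(500):
        pos, vel = step(pos, vel, 100)
    print("mean velocity:", vel.mean())   # ~ (1 - density) / density in the jammed phase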
Soil pH mediates the balance between stochastic and deterministic assembly of bacteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tripathi, Binu M.; Stegen, James C.; Kim, Mincheol
Little is known about the factors affecting the relative influence of stochastic and deterministic processes that govern the assembly of microbial communities in successional soils. Here, we conducted a meta-analysis of bacterial communities using six different successional soils data sets, scattered across different regions, with different pH conditions in early and late successional soils. We found that soil pH was the best predictor of bacterial community assembly and the relative importance of stochastic and deterministic processes along successional soils. Extreme acidic or alkaline pH conditions lead to assembly of phylogenetically more clustered bacterial communities through deterministic processes, whereas pH conditions close to neutral lead to phylogenetically less clustered bacterial communities with more stochasticity. We suggest that the influence of pH, rather than successional age, is the main driving force in producing trends in phylogenetic assembly of bacteria, and that pH also influences the relative balance of stochastic and deterministic processes along successional soils. Given that pH had a much stronger association with community assembly than did successional age, we evaluated whether the inferred influence of pH was maintained when studying globally-distributed samples collected without regard for successional age. This dataset confirmed the strong influence of pH, suggesting that the influence of soil pH on community assembly processes occurs globally. Extreme pH conditions likely exert more stringent limits on survival and fitness, imposing strong selective pressures through ecological and evolutionary time. Taken together, these findings suggest that the degree to which stochastic vs. deterministic processes shape soil bacterial community assembly is a consequence of soil pH rather than successional age.
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts, in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize available forecasting information. The Bayesian Processor of Forecast (BPF), a new statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
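A minimal sketch, in Python, of the normal-linear BPF special case mentioned above, assuming a linear-Gaussian likelihood fit by least squares; the meta-Gaussian version would additionally pass predictand and forecast through a normal quantile transform, which is omitted here, and the sample data are invented.

    import numpy as np

    def normal_linear_bpf(w_hist, x_hist, x_new):
        """Normal-linear Bayesian Processor of Forecast (sketch).
        w_hist: historical observations of the predictand W;
        x_hist: the matching deterministic forecasts X;
        returns the posterior mean and std of W given a new forecast x_new."""
        # climatological prior W ~ N(M, S2)
        M, S2 = w_hist.mean(), w_hist.var(ddof=1)
        # likelihood X | W = w ~ N(a + b*w, sigma2), fit by least squares
        b, a = np.polyfit(w_hist, x_hist, 1)
        sigma2 = np.var(x_hist - (a + b * w_hist), ddof=2)
        # conjugate normal update (Bayes theorem for two Gaussian densities)
        post_var = 1.0 / (b * b / sigma2 + 1.0 / S2)
        post_mean = post_var * (b * (x_new - a) / sigma2 + M / S2)
        return post_mean, np.sqrt(post_var)

    rng = np.random.default_rng(1)
    w = rng.normal(10.0, 4.0, 300)                  # invented "observed" temperatures
    x = 1.5 + 0.9 * w + rng.normal(0.0, 2.0, 300)   # biased, noisy forecasts of them
    print(normal_linear_bpf(w, x, x_new=14.0))      # sharper than climatology alone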
Benedetti-Cecchi, Lisandro; Canepa, Antonio; Fuentes, Veronica; Tamburello, Laura; Purcell, Jennifer E; Piraino, Stefano; Roberts, Jason; Boero, Ferdinando; Halpin, Patrick
2015-01-01
Jellyfish outbreaks are increasingly viewed as a deterministic response to escalating levels of environmental degradation and climate extremes. However, a comprehensive understanding of the influence of deterministic drivers and stochastic environmental variations favouring population renewal processes has remained elusive. This study quantifies the deterministic and stochastic components of environmental change that lead to outbreaks of the jellyfish Pelagia noctiluca in the Mediterranean Sea. Using data of jellyfish abundance collected at 241 sites along the Catalan coast from 2007 to 2010, we: (1) tested hypotheses about the influence of time-varying and spatial predictors of jellyfish outbreaks; (2) evaluated the relative importance of stochastic vs. deterministic forcing of outbreaks through the environmental bootstrap method; and (3) quantified return times of extreme events. Outbreaks were common in May and June and less likely in other summer months, which resulted in a negative relationship between outbreaks and SST. Cross- and along-shore advection by geostrophic flow were important concentrating forces of jellyfish, but most outbreaks occurred in the proximity of two canyons in the northern part of the study area. This result supported the recent hypothesis that canyons can funnel P. noctiluca blooms towards shore during upwelling. This can be a general, yet unappreciated mechanism leading to outbreaks of holoplanktonic jellyfish species. The environmental bootstrap indicated that stochastic environmental fluctuations have negligible effects on return times of outbreaks. Our analysis emphasized the importance of deterministic processes leading to jellyfish outbreaks compared to the stochastic component of environmental variation. A better understanding of how environmental drivers affect demographic and population processes in jellyfish species will increase the ability to anticipate jellyfish outbreaks in the future.
Cognitive Diagnostic Analysis Using Hierarchically Structured Skills
ERIC Educational Resources Information Center
Su, Yu-Lan
2013-01-01
This dissertation proposes two modified cognitive diagnostic models (CDMs), the deterministic, inputs, noisy, "and" gate with hierarchy (DINA-H) model and the deterministic, inputs, noisy, "or" gate with hierarchy (DINO-H) model. Both models incorporate the hierarchical structures of the cognitive skills in the model estimation…
Deterministic Mean-Field Ensemble Kalman Filtering
Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul
2016-05-03
The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d
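For contrast with the deterministic mean-field construction, here is a minimal sketch (Python) of the standard stochastic EnKF analysis step with perturbed observations on a linear-Gaussian toy problem; the observation operator, noise covariance and ensemble size are assumed, and the DMFEnKF's PDE-solver-plus-quadrature machinery is not shown.

    import numpy as np

    def enkf_analysis(ensemble, y_obs, H, R, rng):
        """One stochastic EnKF analysis step (perturbed observations).
        ensemble: (n_ens, d) forecast states; H: (m, d) observation operator;
        R: (m, m) observation-noise covariance; y_obs: (m,) observation."""
        n_ens = ensemble.shape[0]
        X = ensemble - ensemble.mean(axis=0)            # ensemble anomalies
        P = X.T @ X / (n_ens - 1)                       # sample covariance
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
        # perturb the observation per member to keep the correct posterior spread
        y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R, n_ens)
        return ensemble + (y_pert - ensemble @ H.T) @ K.T

    rng = np.random.default_rng(2)
    ens = rng.normal(0.0, 1.0, size=(100, 1))    # scalar state, 100 members
    H, R = np.array([[1.0]]), np.array([[0.25]])
    ens_a = enkf_analysis(ens, np.array([1.0]), H, R, rng)
    print(ens_a.mean(), ens_a.std())             # mean pulled toward the observation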
Active temporal multiplexing of indistinguishable heralded single photons
Xiong, C.; Zhang, X.; Liu, Z.; Collins, M. J.; Mahendra, A.; Helt, L. G.; Steel, M. J.; Choi, D. -Y.; Chae, C. J.; Leong, P. H. W.; Eggleton, B. J.
2016-01-01
It is a fundamental challenge in quantum optics to deterministically generate indistinguishable single photons through non-deterministic nonlinear optical processes, due to the intrinsic coupling of single- and multi-photon-generation probabilities in these processes. Actively multiplexing photons generated in many temporal modes can decouple these probabilities, but key issues are to minimize resource requirements to allow scalability, and to ensure indistinguishability of the generated photons. Here we demonstrate the multiplexing of photons from four temporal modes solely using fibre-integrated optics and off-the-shelf electronic components. We show a 100% enhancement to the single-photon output probability without introducing additional multi-photon noise. Photon indistinguishability is confirmed by a fourfold Hong–Ou–Mandel quantum interference with a 91±16% visibility after subtracting multi-photon noise due to high pump power. Our demonstration paves the way for scalable multiplexing of many non-deterministic photon sources to a single near-deterministic source, which will be of benefit to future quantum photonic technologies. PMID:26996317
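The gain from temporal multiplexing can be sketched with idealized, lossless combinatorics: if each of n modes independently heralds a photon with probability p, at least one mode heralds with probability 1 - (1 - p)^n. In the Python sketch below the per-mode probability is invented, and real switching losses make the measured enhancement smaller than this bound.

    def multiplexed_success(p: float, n_modes: int) -> float:
        """Probability that at least one of n_modes independent temporal modes
        heralds a photon, assuming lossless switching (an idealization)."""
        return 1.0 - (1.0 - p) ** n_modes

    p = 0.05                                   # invented per-mode heralding probability
    for n in (1, 2, 4):
        print(n, round(multiplexed_success(p, n), 4))
    # 1 -> 0.05, 2 -> 0.0975, 4 -> 0.1855: four lossless modes nearly quadruple the output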
Frisenda, Riccardo; Navarro-Moratalla, Efrén; Gant, Patricia; Pérez De Lara, David; Jarillo-Herrero, Pablo; Gorbachev, Roman V; Castellanos-Gomez, Andres
2018-01-02
Designer heterostructures can now be assembled layer-by-layer with unmatched precision thanks to the recently developed deterministic placement methods to transfer two-dimensional (2D) materials. This possibility constitutes the birth of a very active research field on the so-called van der Waals heterostructures. Moreover, these deterministic placement methods also open the door to fabricate complex devices, which would be otherwise very difficult to achieve by conventional bottom-up nanofabrication approaches, and to fabricate fully-encapsulated devices with exquisite electronic properties. The integration of 2D materials with existing technologies such as photonic and superconducting waveguides and fiber optics is another exciting possibility. Here, we review the state-of-the-art of the deterministic placement methods, describing and comparing the different alternative methods available in the literature, and we illustrate their potential to fabricate van der Waals heterostructures, to integrate 2D materials into complex devices and to fabricate artificial bilayer structures where the layers present a user-defined rotational twisting angle.
First-order reliability application and verification methods for semistatic structures
NASA Astrophysics Data System (ADS)
Verderaime, V.
1994-11-01
Escalating risks of aerostructures, driven by increasing size, complexity, and cost, should no longer be ignored in conventional deterministic safety design methods. The deterministic pass-fail concept is incompatible with probability and risk assessments; stress audits are shown to be arbitrary and incomplete, and the concept compromises the performance of high-strength materials. A reliability method is proposed that combines first-order reliability principles with deterministic design variables and conventional test techniques to surmount current deterministic stress design and audit deficiencies. Accumulative and propagation design uncertainty errors are defined and appropriately implemented into the classical safety-index expression. The application is reduced to solving for a design factor that satisfies the specified reliability and compensates for uncertainty errors, and then using this design factor as, and instead of, the conventional safety factor in stress analyses. The resulting method is consistent with current analytical skills and verification practices, the culture of most designers, and the development of semistatic structural designs.
Yin, Shen; Gao, Huijun; Qiu, Jianbin; Kaynak, Okyay
2017-11-01
Data-driven fault detection plays an important role in industrial systems due to its applicability in case of unknown physical models. In fault detection, disturbances must be taken into account as an inherent characteristic of processes. Nevertheless, fault detection for nonlinear processes with deterministic disturbances still receives little attention, especially in the data-driven field. To solve this problem, a just-in-time learning-based data-driven (JITL-DD) fault detection method for nonlinear processes with deterministic disturbances is proposed in this paper. JITL-DD employs a JITL scheme for process description with local model structures to cope with process dynamics and nonlinearity. The proposed method provides a data-driven fault detection solution for nonlinear processes with deterministic disturbances, and offers inherent online adaptation and high fault detection accuracy. Two nonlinear systems, i.e., a numerical example and a sewage treatment process benchmark, are employed to show the effectiveness of the proposed method.
NASA Astrophysics Data System (ADS)
Wang, Fengyu
Traditional deterministic reserve requirements rely on ad-hoc, rule-of-thumb methods to determine adequate reserve in order to ensure a reliable unit commitment. Since congestion and uncertainties exist in the system, both the quantity and the location of reserves are essential to ensure system reliability and market efficiency. Existing deterministic reserve requirements acquire operating reserves on a zonal basis and do not fully capture the impact of congestion. The purpose of a reserve zone is to ensure that operating reserves are spread across the network. Operating reserves are shared inside each reserve zone, but intra-zonal congestion may block the deliverability of operating reserves within a zone. Thus, improving reserve policies such as reserve zones may improve the location and deliverability of reserve. As more non-dispatchable renewable resources are integrated into the grid, it will become increasingly difficult to predict the transfer capabilities and the network congestion. At the same time, renewable resources require operators to acquire more operating reserves. With existing deterministic reserve requirements unable to ensure optimal reserve locations, the importance of reserve location and reserve deliverability will increase. While stochastic programming can be used to determine reserves by explicitly modelling uncertainties, there are still scalability and pricing issues. Therefore, new methods to improve existing deterministic reserve requirements are desired. One key barrier to improving existing deterministic reserve requirements is their potential market impacts. A metric, quality of service, is proposed in this thesis to evaluate the price signal and market impacts of proposed hourly reserve zones. Three main goals of this thesis are: 1) to develop a theoretical and mathematical model to better locate reserves while maintaining the deterministic unit commitment and economic dispatch structure, especially with the consideration of renewables, 2) to develop a market settlement scheme for the proposed dynamic reserve policies such that market efficiency is improved, 3) to evaluate the market impacts and price signal of the proposed dynamic reserve policies.
Parameter Estimation in Epidemiology: from Simple to Complex Dynamics
NASA Astrophysics Data System (ADS)
Aguiar, Maíra; Ballesteros, Sebastién; Boto, João Pedro; Kooi, Bob W.; Mateus, Luís; Stollenwerk, Nico
2011-09-01
We revisit the parameter estimation framework for population biological dynamical systems, and apply it to calibrate various models in epidemiology with empirical time series, namely influenza and dengue fever. When it comes to more complex models, like the multi-strain dynamics describing the virus-host interaction in dengue fever, even recently developed parameter estimation techniques, like maximum likelihood iterated filtering, reach their computational limits. However, the first results of parameter estimation with data on dengue fever from Thailand indicate a subtle interplay between stochasticity and the deterministic skeleton. The deterministic system on its own already displays complex dynamics, up to deterministic chaos and coexistence of multiple attractors.
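As the simplest instance of the calibration framework discussed, here is a sketch (Python with SciPy) of least-squares fitting of a deterministic SIR model to a synthetic incidence series; the parameters and data are invented, and the influenza and dengue analyses referred to above rely on likelihood-based methods such as iterated filtering applied to far richer models.

    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import minimize

    def sir(y, t, beta, gamma):
        """Deterministic SIR model with normalized compartments."""
        S, I, R = y
        return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

    t = np.arange(0.0, 60.0, 1.0)
    true_I = odeint(sir, [0.99, 0.01, 0.0], t, args=(0.5, 0.2))[:, 1]
    data = true_I + np.random.default_rng(3).normal(0.0, 0.002, t.size)  # synthetic data

    def loss(params):
        beta, gamma = params
        I = odeint(sir, [0.99, 0.01, 0.0], t, args=(beta, gamma))[:, 1]
        return ((I - data) ** 2).sum()

    fit = minimize(loss, x0=[0.4, 0.1], method="Nelder-Mead")
    print(fit.x)   # recovers approximately (0.5, 0.2)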
Inherent Conservatism in Deterministic Quasi-Static Structural Analysis
NASA Technical Reports Server (NTRS)
Verderaime, V.
1997-01-01
The cause of the long-suspected excessive conservatism in the prevailing structural deterministic safety factor has been identified as an inherent violation of the error propagation laws when reducing statistical data to deterministic values and then combining them algebraically through successive structural computational processes. These errors are restricted to the applied stress computations, and because mean and variations of the tolerance limit format are added, the errors are positive, serially cumulative, and excessively conservative. Reliability methods circumvent these errors and provide more efficient and uniform safe structures. The document is a tutorial on the deficiencies and nature of the current safety factor and of its improvement and transition to absolute reliability.
Advanced Aero-Propulsive Mid-Lift-to-Drag Ratio Entry Vehicle for Future Exploration Missions
NASA Technical Reports Server (NTRS)
Campbell, C. H.; Stosaric, R. R; Cerimele, C. J.; Wong, K. A.; Valle, G. D.; Garcia, J. A.; Melton, J. E.; Munk, M. M.; Blades, E.; Kuruvila, G.;
2012-01-01
NASA is currently looking well into the future toward realizing Exploration mission possibilities to destinations including the Earth-Moon Lagrange points, Near-Earth Asteroids (NEAs) and the Moon. These are stepping stones to our ultimate destination Mars. New ideas will be required to conquer the significant challenges that await us, some just conceptions and others beginning to be realized. Bringing these ideas to fruition and enabling further expansion into space will require varying degrees of change, from engineering and integration approaches used in spacecraft design and operations, to high-level architectural capabilities bounded only by the limits of our ideas. The most profound change will be realized by paradigm change, thus enabling our ultimate goals to be achieved. Inherent to achieving these goals, higher entry, descent, and landing (EDL) performance has been identified as a high priority. Increased EDL performance will be enabled by highly-capable thermal protection systems (TPS), the ability to deliver larger and heavier payloads, increased surface access, and tighter landing footprints to accommodate multiple asset, single-site staging. In addition, realizing reduced cost access to space will demand more efficient approaches and reusable launch vehicle systems. Current operational spacecraft and launch vehicles do not incorporate the technologies required for these far-reaching missions and goals, nor what is needed to achieve the desired launch vehicle cost savings. To facilitate these missions and provide for safe and more reliable capabilities, NASA and its partners will need to make ideas reality by gaining knowledge through the design, development, manufacturing, implementation and flight testing of robotic and human spacecraft. To accomplish these goals, an approach is recommended for integrated development and implementation of three paradigm-shifting capabilities into an advanced entry vehicle system with additional application to launch vehicle stage return, thus making ideas reality. These paradigm shifts include the technology maturation of advanced flexible thermal protection materials onto mid lift-to-drag ratio entry vehicles, the development of integrated supersonic aero-propulsive maneuvering, and the implementation of advanced asymmetric launch shrouds. These paradigms have significant overlap with launch vehicle stage return already being developed by the Air Force and several commercial space efforts. Completing the realization of these combined paradigms holds the key to a high-performing entry vehicle system capability that fully leverages multiple technology benefits to accomplish NASA's Exploration missions to atmospheric planetary destinations.
Sung, Ji Ho; Heo, Hoseok; Hwang, Inchan; Lim, Myungsoo; Lee, Donghun; Kang, Kibum; Choi, Hee Cheul; Park, Jae-Hoon; Jhi, Seung-Hoon; Jo, Moon-Ho
2014-07-09
Material design for direct heat-to-electricity conversion with substantial efficiency essentially requires cooperative control of electrical and thermal transport. Bismuth telluride (Bi2Te3) and antimony telluride (Sb2Te3), displaying the highest thermoelectric power at room temperature, are also known as topological insulators (TIs) whose electronic structures are modified by electronic confinement and strong spin-orbit interaction in the few-monolayer thickness regime, thus possibly providing another degree of freedom for electron and phonon transport at surfaces. Here, we explore novel thermoelectric conversion at the atomic monolayer steps of few-layer topological insulating Bi2Te3 (n-type) and Sb2Te3 (p-type). Specifically, by scanning photoinduced thermoelectric current imaging at the monolayer steps, we show that efficient thermoelectric conversion is accomplished by optothermal motion of hot electrons (Bi2Te3) and holes (Sb2Te3) through 2D subbands and topologically protected surface states in a geometrically deterministic manner. Our discovery suggests that thermoelectric conversion can be achieved interiorly at the atomic steps of a homogeneous medium by directly exploiting the quantum nature of TIs, thus providing a new design rule for compact thermoelectric circuitry at the ultimate size limit.
Observation of entanglement between a quantum dot spin and a single photon.
Gao, W B; Fallahi, P; Togan, E; Miguel-Sanchez, J; Imamoglu, A
2012-11-15
Entanglement has a central role in fundamental tests of quantum mechanics as well as in the burgeoning field of quantum information processing. Particularly in the context of quantum networks and communication, a main challenge is the efficient generation of entanglement between stationary (spin) and propagating (photon) quantum bits. Here we report the observation of quantum entanglement between a semiconductor quantum dot spin and the colour of a propagating optical photon. The demonstration of entanglement relies on the use of fast, single-photon detection, which allows us to project the photon into a superposition of red and blue frequency components. Our results extend the previous demonstrations of single-spin/single-photon entanglement in trapped ions, neutral atoms and nitrogen-vacancy centres to the domain of artificial atoms in semiconductor nanostructures that allow for on-chip integration of electronic and photonic elements. As a result of its fast optical transitions and favourable selection rules, the scheme we implement could in principle generate nearly deterministic entangled spin-photon pairs at a rate determined ultimately by the high spontaneous emission rate. Our observation constitutes a first step towards implementation of a quantum network with nodes consisting of semiconductor spin quantum bits.
Reaction-diffusion controlled growth of complex structures
NASA Astrophysics Data System (ADS)
Noorduin, Willem; Mahadevan, L.; Aizenberg, Joanna
2013-03-01
Understanding how the emergence of complex forms and shapes in biominerals came about is of both fundamental and practical interest. Although biomineralization processes and organization strategies that give higher order architectures have been studied extensively, synthetic approaches to mimic these self-assembled structures are highly complex and have been difficult to emulate, let alone replicate. The emergence of solution patterns has been found in reaction-diffusion systems such as Turing patterns and the BZ reaction. Intrigued by this spontaneous formation of complexity, we explored whether similar processes can lead to patterns in the solid state. We here identify a reaction-diffusion system in which the shape of the solidified products is a direct readout of the environmental conditions. Based on insights into the underlying mechanism, we developed a toolbox of engineering strategies to deterministically sculpt patterns and shapes, and combine different morphologies to create a landscape of hierarchical, multiscale, complex tectonic architectures with unprecedented levels of complexity. These findings may hold profound implications for understanding, mimicking and ultimately expanding upon nature's morphogenesis strategies, allowing the synthesis of advanced highly complex microscale materials and devices. WLN acknowledges the Netherlands Organization for Scientific Research for financial support.
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.
2018-05-01
Discrete time series or mappings are proposed for describing the dynamics of a nonlinear system. The article considers the problems of forecasting the dynamics of the system from the time series generated by it. In particular, the commercial rate of drilling oil and gas wells can be considered a series where each next value depends on the previous one. The main parameter here is the technical drilling speed. With the aim of eliminating measurement error and estimating the commercial speed of the object with good accuracy at the current, future, or any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, the use of ensemble modeling is proposed. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase in their competitiveness.
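A minimal sketch of the suggested filtering step: a scalar random-walk Kalman filter smoothing a noisy commercial-speed series (Python; the process and measurement variances and the data are invented, and the ensemble extension is not shown).

    import numpy as np

    def kalman_filter_1d(z, q=0.01, r=1.0):
        """Scalar random-walk Kalman filter: x_k = x_{k-1} + w_k, z_k = x_k + v_k,
        with process variance q and measurement variance r; returns filtered states."""
        x, p = z[0], 1.0
        out = []
        for zk in z:
            p = p + q                   # predict: uncertainty grows
            k = p / (p + r)             # Kalman gain
            x = x + k * (zk - x)        # update with the new measurement
            p = (1.0 - k) * p
            out.append(x)
        return np.array(out)

    rng = np.random.default_rng(4)
    truth = 50.0 + np.cumsum(rng.normal(0.0, 0.1, 120))   # slowly drifting true speed
    measured = truth + rng.normal(0.0, 1.0, 120)          # noisy periodic estimates
    print(kalman_filter_1d(measured)[-5:])                # tracks the drift with less noise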
Sampled-Data Consensus of Linear Multi-agent Systems With Packet Losses.
Zhang, Wenbing; Tang, Yang; Huang, Tingwen; Kurths, Jurgen
In this paper, the consensus problem is studied for a class of multi-agent systems with sampled data and packet losses, where random and deterministic packet losses are considered, respectively. For random packet losses, a Bernoulli-distributed white sequence is used to describe packet dropouts among agents in a stochastic way. For deterministic packet losses, a switched system with stable and unstable subsystems is employed to model packet dropouts in a deterministic way. The purpose of this paper is to derive consensus criteria, such that linear multi-agent systems with sampled-data and packet losses can reach consensus. By means of the Lyapunov function approach and the decomposition method, the design problem of a distributed controller is solved in terms of convex optimization. The interplay among the allowable bound of the sampling interval, the probability of random packet losses, and the rate of deterministic packet losses is explicitly derived to characterize consensus conditions. The obtained criteria are closely related to the maximum eigenvalue of the Laplacian matrix versus the second minimum eigenvalue of the Laplacian matrix, which reveals the intrinsic effect of communication topologies on consensus performance. Finally, simulations are given to show the effectiveness of the proposed results.
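To make the random-packet-loss setting concrete, here is a small simulation sketch in Python: each directed link fails independently with a Bernoulli probability at every sampling instant, and agents run a sampled-data consensus update on the surviving graph. The topology, gain and loss rate are assumed for illustration; the paper's actual consensus criteria come from Lyapunov analysis and convex optimization rather than simulation.

    import numpy as np

    def consensus_step(x, A, drop_prob, eps, rng):
        """One sampled-data consensus update in which every directed link fails
        independently with probability drop_prob (Bernoulli packet losses)."""
        mask = rng.random(A.shape) >= drop_prob        # links surviving this sample
        W = A * mask
        L = np.diag(W.sum(axis=1)) - W                 # Laplacian of surviving graph
        return x - eps * (L @ x)

    rng = np.random.default_rng(5)
    A = np.ones((5, 5)) - np.eye(5)                    # complete graph on 5 agents
    x = rng.normal(0.0, 10.0, 5)
    for _ in range(300):
        x = consensus_step(x, A, drop_prob=0.3, eps=0.1, rng=rng)
    print(x)                                           # states cluster around a common value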
Resilience and vulnerability to a natural hazard: A mathematical framework based on viability theory
NASA Astrophysics Data System (ADS)
Rougé, Charles; Mathias, Jean-Denis; Deffuant, Guillaume
2013-04-01
This paper deals with the response of a coupled human and natural system (CHANS) to a natural hazard by using the concepts of resilience and vulnerability within the mathematical framework of viability theory. This theory applies to time-evolving systems such as CHANS and assumes that their desirable properties can be defined as a subset of their state space. Policies can also be applied to influence the dynamics of such systems: viability theory aims at finding the policies which keep the properties of a controlled dynamical system for as long as no disturbance hits it. The states of the system for which the properties are guaranteed constitute what is called the viability kernel. This viability framework has been extended to describe the response to a perturbation such as a natural hazard. Resilience describes the capacity of the CHANS to recover by getting back into the viability kernel, where its properties are guaranteed until the onset of the next major event. Defined for a given controlled trajectory that the system may take after the event ends, resilience is (a) whether the system comes back to the viability kernel within a given budget, such as a time constraint, and (b) a decreasing function of vulnerability. Computed for a given trajectory as well, vulnerability is a measure of the consequence of violating a property. We propose a family of functions from which cost functions and other vulnerability indicators can be derived for a given trajectory. There can be several vulnerability functions, representing for instance social, economic or ecological vulnerability, each associated with the violation of a corresponding property, but these functions need to be ultimately aggregated into a single indicator. Computing the resilience and vulnerability of a trajectory enables the viability framework to describe the response of both deterministic and stochastic systems to hazards. In the deterministic case, there is only one response trajectory for a given action policy, and methods exist to find the actions which yield the most resilient trajectory, namely the least vulnerable trajectory for which recovery is complete. In the stochastic case, however, there is a range of possible trajectories. Statistics can be derived from the probability distribution of the resilience and vulnerability of the trajectories. Dynamic programming methods can then yield either the policies that maximize the probability of being resilient by achieving recovery within a given time horizon, or those which minimize a given vulnerability statistic. These objectives are different and can be in contradiction, so trade-offs may have to be considered between them. The approach is illustrated in both the deterministic and stochastic cases through a simple model of lake eutrophication, in which the desirable ecological properties of the lake conflict with the economic interest of neighboring farmers.
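A minimal numerical sketch (Python) of the viability-kernel computation on a one-dimensional lake-phosphorus model; the dynamics, parameters and the restriction to the two extreme controls are hypothetical simplifications of the eutrophication example mentioned above.

    import numpy as np

    # lake phosphorus P' = -b*P + u + r*P^2/(1 + P^2), control u in [0, u_max];
    # the "good" set is P <= P_bad; the kernel keeps the states from which some
    # control keeps P good forever (extreme controls suffice in this monotone model)
    b, r, u_max, P_bad, dt = 0.4, 1.0, 0.05, 1.5, 0.05
    grid = np.linspace(0.0, P_bad, 601)
    viable = np.ones(grid.size, dtype=bool)

    def step(P, u):
        return P + dt * (-b * P + u + r * P**2 / (1.0 + P**2))

    def stays_viable(P_next):
        idx = np.clip(np.round(P_next / P_bad * (grid.size - 1)).astype(int),
                      0, grid.size - 1)
        return (P_next >= 0.0) & (P_next <= P_bad) & viable[idx]

    while True:   # remove states until the viable set reaches a fixed point
        ok = stays_viable(step(grid, 0.0)) | stays_viable(step(grid, u_max))
        new_viable = viable & ok
        if np.array_equal(new_viable, viable):
            break
        viable = new_viable

    print("viability kernel approx.: P in [0, %.2f]" % grid[viable].max())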
Comparison of space radiation calculations for deterministic and Monte Carlo transport codes
NASA Astrophysics Data System (ADS)
Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo
For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit and HETC-HEDS. The deterministic codes and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results in the same given space radiation environments, shielding geometry and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.
NASA Astrophysics Data System (ADS)
Delvecchio, S.; Antoni, J.
2012-02-01
This paper addresses the use of a cyclostationary blind source separation algorithm (namely RRCR) to extract angle deterministic signals from mechanical rotating machines in presence of stationary speed fluctuations. This means that only phase fluctuations while the machine is running in steady-state conditions are considered, while run-up or run-down speed variations are not taken into account. The machine is also supposed to run in idle conditions, so non-stationary phenomena due to the load are not considered. It is theoretically assessed that in such operating conditions the deterministic (periodic) signal in the angle domain becomes cyclostationary at first and second orders in the time domain. This fact justifies the use of the RRCR algorithm, which is able to directly extract the angle deterministic signal from the time domain without performing any kind of interpolation. This is particularly valuable when angular resampling fails because of uncontrolled speed fluctuations. The capability of the proposed approach is verified by means of simulated and actual vibration signals captured on a pneumatic screwdriver handle. In this particular case not only the extraction of the angle deterministic part can be performed but also the separation of the main sources of excitation (i.e. motor shaft imbalance, epicycloidal gear meshing and air pressure forces) affecting the user hand during operations.
Northern Hemisphere glaciation and the evolution of Plio-Pleistocene climate noise
NASA Astrophysics Data System (ADS)
Meyers, Stephen R.; Hinnov, Linda A.
2010-08-01
Deterministic orbital controls on climate variability are commonly inferred to dominate across timescales of 10^4-10^6 years, although some studies have suggested that stochastic processes may be of equal or greater importance. Here we explicitly quantify changes in deterministic orbital processes (forcing and/or pacing) versus stochastic climate processes during the Plio-Pleistocene, via time-frequency analysis of two prominent foraminifera oxygen isotopic stacks. Our results indicate that development of the Northern Hemisphere ice sheet is paralleled by an overall amplification of both deterministic and stochastic climate energy, but their relative dominance is variable. The progression from a more stochastic early Pliocene to a strongly deterministic late Pleistocene is primarily accommodated during two transitory phases of Northern Hemisphere ice sheet growth. This long-term trend is punctuated by “stochastic events,” which we interpret as evidence for abrupt reorganization of the climate system at the initiation and termination of the mid-Pleistocene transition and at the onset of Northern Hemisphere glaciation. In addition to highlighting a complex interplay between deterministic and stochastic climate change during the Plio-Pleistocene, our results support an early onset for Northern Hemisphere glaciation (between 3.5 and 3.7 Ma) and reveal some new characteristics of the orbital signal response, such as the puzzling emergence of 100 ka and 400 ka cyclic climate variability during theoretical eccentricity nodes.
Tag-mediated cooperation with non-deterministic genotype-phenotype mapping
NASA Astrophysics Data System (ADS)
Zhang, Hong; Chen, Shu
2016-01-01
Tag-mediated cooperation provides a helpful framework for resolving evolutionary social dilemmas. However, most of the previous studies have not taken into account genotype-phenotype distinction in tags, which may play an important role in the process of evolution. To take this into consideration, we introduce non-deterministic genotype-phenotype mapping into a tag-based model with spatial prisoner's dilemma. By our definition, the similarity between genotypic tags does not directly imply the similarity between phenotypic tags. We find that the non-deterministic mapping from genotypic tag to phenotypic tag has non-trivial effects on tag-mediated cooperation. Although we observe that high levels of cooperation can be established under a wide variety of conditions, especially when the decisiveness is moderate, the uncertainty in the determination of phenotypic tags may have a detrimental effect on the tag mechanism by disturbing the homophilic interaction structure which can explain the promotion of cooperation in tag systems. Furthermore, the non-deterministic mapping may undermine the robustness of the tag mechanism with respect to various factors such as the structure of the tag space and the tag flexibility. This observation warns us about the danger of applying the classical tag-based models to the analysis of empirical phenomena if genotype-phenotype distinction is significant in the real world. Non-deterministic genotype-phenotype mapping thus provides a new perspective to the understanding of tag-mediated cooperation.
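A minimal agent-based sketch (Python) of non-deterministic genotype-phenotype mapping in a tag-based game: genotypic tags evolve by imitation and mutation, while interactions compare noisy phenotypic tags. This is an illustrative well-mixed donation game with invented parameters, not the authors' spatial prisoner's dilemma model.

    import numpy as np

    rng = np.random.default_rng(6)
    N, rounds, sigma, tol = 200, 2000, 0.05, 0.1   # agents, rounds, mapping noise, tolerance
    benefit, cost = 1.0, 0.4                       # donation-game payoffs
    geno = rng.random(N)                           # genotypic tags in [0, 1]

    for _ in range(rounds):
        # non-deterministic genotype-phenotype mapping: the expressed tag is noisy
        pheno = np.clip(geno + rng.normal(0.0, sigma, N), 0.0, 1.0)
        payoff = np.zeros(N)
        donors, recips = rng.integers(0, N, N), rng.integers(0, N, N)
        match = np.abs(pheno[donors] - pheno[recips]) < tol   # phenotypes are compared
        np.add.at(payoff, donors[match], -cost)               # donors pay the cost
        np.add.at(payoff, recips[match], benefit)             # recipients gain
        # imitation dynamics: copy the genotype of a random higher-scoring agent
        models = rng.integers(0, N, N)
        better = payoff[models] > payoff
        geno[better] = np.clip(geno[models[better]]
                               + rng.normal(0.0, 0.01, better.sum()), 0.0, 1.0)

    print("final-round cooperation rate:", match.mean())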
A Unit on Deterministic Chaos for Student Teachers
ERIC Educational Resources Information Center
Stavrou, D.; Assimopoulos, S.; Skordoulis, C.
2013-01-01
A unit aiming to introduce pre-service teachers of primary education to the limited predictability of deterministic chaotic systems is presented. The unit is based on a commercial chaotic pendulum system connected with a data acquisition interface. The capabilities and difficulties in understanding the notion of limited predictability of 18…
A Deterministic Annealing Approach to Clustering AIRS Data
NASA Technical Reports Server (NTRS)
Guillaume, Alexandre; Braverman, Amy; Ruzmaikin, Alexander
2012-01-01
We will examine the validity of means and standard deviations as a basis for climate data products. We will explore the conditions under which these two simple statistics are inadequate summaries of the underlying empirical probability distributions by contrasting them with a nonparametric method called the Deterministic Annealing technique.
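A minimal sketch of Deterministic Annealing clustering in Python, on invented two-dimensional toy data: soft Gibbs memberships at temperature T are annealed toward hard assignments as T is lowered, with an assumed geometric cooling schedule.

    import numpy as np

    def deterministic_annealing(X, k=3, T0=10.0, T_min=0.01, cooling=0.9, seed=0):
        """Deterministic annealing clustering: at temperature T each point gets a
        soft Gibbs membership p(c|x) ~ exp(-||x - mu_c||^2 / T); the centers are
        weighted means, and lowering T anneals toward hard (k-means-like) clusters."""
        rng = np.random.default_rng(seed)
        centers = X.mean(axis=0) + 1e-3 * rng.normal(size=(k, X.shape[1]))
        T = T0
        while T > T_min:
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            d2 -= d2.min(axis=1, keepdims=True)       # stabilize the exponentials
            p = np.exp(-d2 / T)
            p /= p.sum(axis=1, keepdims=True)         # soft memberships
            centers = (p.T @ X) / p.sum(axis=0)[:, None]
            T *= cooling
        return centers, p

    rng = np.random.default_rng(7)
    X = np.vstack([rng.normal(m, 0.3, (50, 2)) for m in (0.0, 2.0, 4.0)])
    centers, _ = deterministic_annealing(X)
    print(np.round(centers[np.argsort(centers[:, 0])], 2))   # near (0,0), (2,2), (4,4)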
The Total Exposure Model (TEM) uses deterministic and stochastic methods to estimate the exposure of a person performing daily activities of eating, drinking, showering, and bathing. There were 250 time histories generated, by subject with activities, for the three exposure ro...
Integrability and Chaos: The Classical Uncertainty
ERIC Educational Resources Information Center
Masoliver, Jaume; Ros, Ana
2011-01-01
In recent years there has been a considerable increase in the publishing of textbooks and monographs covering what was formerly known as random or irregular deterministic motion, now referred to as deterministic chaos. There is still substantial interest in a matter that is included in many graduate and even undergraduate courses on classical…
The development of the deterministic nonlinear PDEs in particle physics to stochastic case
NASA Astrophysics Data System (ADS)
Abdelrahman, Mahmoud A. E.; Sohaly, M. A.
2018-06-01
In the present work, an accurate method, the Riccati-Bernoulli Sub-ODE technique, is used for solving the deterministic and stochastic cases of the Phi-4 equation and the nonlinear Foam Drainage equation. The control of the randomness input is also studied for the stability of the stochastic process solution.
Contemporary Genetics for Gender Researchers: Not Your Grandma's Genetics Anymore
ERIC Educational Resources Information Center
Salk, Rachel H.; Hyde, Janet S.
2012-01-01
Over the past century, much of genetics was deterministic, and feminist researchers framed justified criticisms of genetics research. However, over the past two decades, genetics research has evolved remarkably and has moved far from earlier deterministic approaches. Our article provides a brief primer on modern genetics, emphasizing contemporary…
ERIC Educational Resources Information Center
Rambe, Patient; Nel, Liezel
2015-01-01
The discourse of social media adoption in higher education has often been funnelled through utopian and dystopian perspectives, which are polarised but determinist theorisations of human engagement with educational technologies. Consequently, these determinist approaches have obscured a broadened grasp of the situated, socially constructed nature…
Boyd, A L
2000-07-01
The philosophical and ethical concept of autonomy is herein examined, ex post facto, through an existential lens applied to the process of a personal friend's dying. Anagogy, defined as interpretation of a word, passage, or text that finds beyond the literal, allegorical, and moral senses a fourth and ultimate spiritual or mystical sense, is intended to enlarge the understanding of the use of autonomy in this case. The idea of personhood linked inextricably to reason is, therefore, understood as empowering an individual to choose among various actions, to define and redefine life goals, and to give priority to selected values and moral tenets, which reveal a moral hermeneutic. Conditions and circumstances, existentially exposed, limit choice in unexpected ways, such that the predicted value of autonomy is vulnerable to misuse or misunderstanding. The intent to respect the dignity of every person is central to the philosophy of Respect for Persons ethics, and assumes that autonomy, as freedom of the moral agent, is a moral duty. The implicit reality of freedom is, in a practical sense, essential to being rational agents who can thereby exercise informed choice. The moral law, the law of freedom, involves the autonomy of the will and an ultimate end to which all action is directed. Defined as the highest good, morality unites virtue and happiness by ascribing the ultimate end sought as God. The freedom to use rational will finds principles within its own rational nature. The ability to create maxims is autonomy of the will, which equates with the dignity of persons. My recent experience as a companion to a personal friend with a terminal illness inspired me to re-evaluate the concept of autonomy as it is too often interpreted in modern ethical discourse, as an individualistic right of choice as opposed to the hermeneutic of dignity of person. This paper describes a shift of position in understanding the paradox of autonomy in this existential context.
Deterministic chaos in an ytterbium-doped mode-locked fiber laser
NASA Astrophysics Data System (ADS)
Mélo, Lucas B. A.; Palacios, Guillermo F. R.; Carelli, Pedro V.; Acioli, Lúcio H.; Rios Leite, José R.; de Miranda, Marcio H. G.
2018-05-01
We experimentally study the nonlinear dynamics of a femtosecond ytterbium-doped mode-locked fiber laser. With the laser operating in the pulsed regime, a route to chaos is presented, passing from stable mode locking through period-two, period-four, chaotic, and period-three regimes. Return maps and bifurcation diagrams were extracted from time series for each regime. The analysis of the time series with the laser operating in the quasi-mode-locked regime reveals deterministic chaos described by a one-dimensional Rössler map. A positive Lyapunov exponent $\lambda = 0.14$ confirms the deterministic chaos of the system. We suggest an explanation for the observed map relating gain saturation and intra-cavity loss.
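For a scalar time series governed by a one-dimensional return map x_{n+1} = f(x_n), the Lyapunov exponent can be estimated as the orbit average of ln|f'(x_n)|. A minimal sketch follows, with logistic-map data standing in for the measured pulse energies (the laser's actual map is not reproduced):

```python
# Sketch: estimate the Lyapunov exponent of a scalar time series
# from its return map (x_n, x_{n+1}). Logistic-map data stand in
# for measured pulse energies.
import numpy as np

def series(n=5000, r=3.9, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for i in range(n - 1):
        x[i + 1] = r * x[i] * (1.0 - x[i])
    return x

x = series()[500:]                 # drop the transient
xn, xn1 = x[:-1], x[1:]

# Fit the return map with a low-order polynomial and average the
# log of its derivative: lambda = mean(log |f'(x_n)|).
coef = np.polyfit(xn, xn1, deg=4)
fprime = np.polyval(np.polyder(coef), xn)
lam = np.mean(np.log(np.abs(fprime)))
print(f"estimated Lyapunov exponent: {lam:.3f}")   # > 0 indicates chaos
```

A positive value, as reported for the quasi-mode-locked regime, is the standard signature of deterministic chaos in a one-dimensional map.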
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques for generating high-resolution synthetic radiographic images are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was therefore investigated, with the aim of quantifying the computational time saved relative to the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. Using the ADVANTG deterministic code to simulate radiography images, computation time was found to decrease by a factor of 10 to 13 compared with the MCNP stochastic approach while retaining image quality.
Changing the choice architecture of ageing: live different and 'catch old'.
Gale, Deborah
2014-01-01
Physical ageing and being old are broadly feared or denied, particularly by the young (Chittister 2008: 53). The future is viewed in terms of vague, looming disabilities, despite the fact that no one can know their personal ageing fate. As physical and functional limitations become more apparent over time, expected validation occurs in support of the conventional narrative of decline. It is necessary to understand the traditional negative perceptions about ageing if we are to alter them. At present, they do not match the unfolding realities of what it means to grow old in the early twenty-first century. The challenge of the new longevity is learning to navigate the unexplored life terrain between middle and extreme age. How, then, can we redefine this life stage and navigate new pathways for growing old in order to maximize the untapped contributions of the largest and longest-ever-living cohort? The baby boomers (born 1946-1964) are not homogeneous, but they are in a position to become the standard bearers for a new narrative and an alternative way to live differently, while ageing. This will require changes to choice architecture and decision making about personal ageing that will challenge long-held attitudes, perceptions and mindsets. Hence, changing the narrative about living long and well is the void addressed here. Life is ultimately terminal. In the interim, the process of catching old by living different is the ultimate life-enhancing skill. It is all in the choosing.
Universal Cosmic Absolute and Modern Science
NASA Astrophysics Data System (ADS)
Kostro, Ludwik
The official sciences, especially all the natural sciences, respect in their research the principle of methodic naturalism, i.e., they consider all phenomena as entirely natural and therefore never adduce or cite supernatural entities and forces in their scientific explanations. The purpose of this paper is to show that modern science has its own self-existent, self-acting, and self-sufficient Natural All-in Being or Omni-Being, i.e., the entire Nature as a Whole, that justifies the scientific methodic naturalism. Since this Natural All-in Being is one and only, It should be considered the scientifically justified Natural Absolute of science and should be called, in my opinion, the Universal Cosmic Absolute of Modern Science. It will also be shown that the Universal Cosmic Absolute is ontologically enormously stratified and is, in its ultimate, i.e., most fundamental, stratum trans-reistic and trans-personal. This means that in its basic stratum It is neither a Thing nor a Person, although It contains in Itself all things and persons, with all other sentient and conscious individuals as well. At the turn of the 20th century science began to look for a theory of everything, a final theory, a master theory. In my opinion, the natural Universal Cosmic Absolute will constitute in such a theory the radical, all-penetrating Ultimate Basic Reality and will substitute, step by step, for the traditional supernatural personal Absolute.
Pediatrics in the year 2020 and beyond: preparing for plausible futures.
Starmer, Amy J; Duby, John C; Slaw, Kenneth M; Edwards, Anne; Leslie, Laurel K
2010-11-01
Although the future of pediatrics is uncertain, the organizations that lead pediatrics, and the professionals who practice within it, have embraced the notion that the pediatric community must anticipate and lead change to ultimately improve the health of children and adolescents. In an attempt to proactively prepare for a variety of conceivable futures, the board of directors of the American Academy of Pediatrics established the Vision of Pediatrics 2020 Task Force in 2008. This group was charged to think broadly about the future of pediatrics, to gather input on key trends that are influencing the future, to create likely scenarios of the future, and to recommend strategies to best prepare pediatric clinicians and pediatric organizations for a range of potential futures. The work of this task force led to the development of 8 "megatrends" that were identified as highly likely to have a profound influence on the future of pediatrics. A separate list of "wild-card" scenarios was created of trends with the potential to have a substantial influence but are less likely to occur. The process of scenario-planning was used to consider the effects of the 8 megatrends on pediatrics in the year 2020 and beyond. Consideration of these possible scenarios affords the opportunity to determine potential future pediatric needs, to identify potential solutions to address those needs, and, ultimately, to proactively prepare the profession to thrive if these or other future scenarios become realities.
In an earlier study, Puente and Obregón [Water Resour. Res. 32(1996)2825] reported on the usage of a deterministic fractal–multifractal (FM) methodology to faithfully describe an 8.3 h high-resolution rainfall time series in Boston, gathered every 15 s ...
Seed availability constrains plant species sorting along a soil fertility gradient
Bryan L. Foster; Erin J. Questad; Cathy D. Collins; Cheryl A. Murphy; Timothy L. Dickson; Val H. Smith
2011-01-01
1. Spatial variation in species composition within and among communities may be caused by deterministic, niche-based species sorting in response to underlying environmental heterogeneity as well as by stochastic factors such as dispersal limitation and variable species pools. An important goal in ecology is to reconcile deterministic and stochastic perspectives of...
The Role of Probability and Intentionality in Preschoolers' Causal Generalizations
ERIC Educational Resources Information Center
Sobel, David M.; Sommerville, Jessica A.; Travers, Lea V.; Blumenthal, Emily J.; Stoddard, Emily
2009-01-01
Three experiments examined whether preschoolers recognize that the causal properties of objects generalize to new members of the same set given either deterministic or probabilistic data. Experiment 1 found that 3- and 4-year-olds were able to make such a generalization given deterministic data but were at chance when they observed probabilistic…
ERIC Educational Resources Information Center
Moreland, James D., Jr
2013-01-01
This research investigates the instantiation of a Service-Oriented Architecture (SOA) within a hard real-time (stringent time constraints), deterministic (maximum predictability) combat system (CS) environment. There are numerous stakeholders across the U.S. Department of the Navy who are affected by this development, and therefore the system…
CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.
2006-01-01
This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories were compiled and back analyzed, and the data then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment to CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.
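The difference between the two output formats can be illustrated with a toy limit-state calculation: the deterministic format reports a single factor of safety, while the probabilistic format propagates the uncertainty of resistance and demand through the limit state. The sketch below uses assumed lognormal moments, not the paper's Bayesian correlations:

```python
# Sketch: deterministic factor of safety vs. probability of
# liquefaction for one CPT sounding. The lognormal moments are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.stats import norm

ln_CRR_mean, ln_CRR_sd = np.log(0.25), 0.20   # cyclic resistance ratio
ln_CSR_mean, ln_CSR_sd = np.log(0.22), 0.15   # cyclic stress ratio (demand)

# Deterministic format: a single factor of safety.
FS = np.exp(ln_CRR_mean) / np.exp(ln_CSR_mean)

# Probabilistic format: P(liquefaction) = P(ln CRR - ln CSR < 0),
# assuming independent normal terms in log space.
g_mean = ln_CRR_mean - ln_CSR_mean
g_sd = np.hypot(ln_CRR_sd, ln_CSR_sd)
P_liq = norm.cdf(-g_mean / g_sd)

print(f"FS = {FS:.2f}, P(liquefaction) = {P_liq:.2f}")
```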
Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow
NASA Astrophysics Data System (ADS)
Gupta, Atma Ram; Kumar, Ashwani
2017-12-01
Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial and residential sectors. The pattern of load is highly dependent on consumer behavior and on temporal factors such as the season of the year, the day of the week or the time of day. In deterministic radial distribution load flow studies the load is taken as constant; in reality, load varies continually and with a high degree of uncertainty, so there is a need to model a probable realistic load. Monte-Carlo simulation is used to model the probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and a deterministic radial load flow is solved for each set of values. The probabilistic solution is then reconstructed from the deterministic data obtained in each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding the impact of probable realistic ZIP load modeling on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
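A minimal sketch of the procedure follows, with a linearized two-bus voltage drop standing in for the full radial load flow; the impedances, load statistics, and ZIP coefficients are illustrative assumptions:

```python
# Sketch: Monte-Carlo load flow on a linearized two-bus stand-in
# for a radial feeder (the paper solves a full radial network).
# Loads follow a ZIP model; P and Q are sampled from normal
# distributions built from the load mean and standard deviation.
import numpy as np

rng = np.random.default_rng(1)
R, X = 0.05, 0.03                 # per-unit feeder impedance (assumed)
P0, Q0 = 0.8, 0.4                 # mean per-unit load (assumed)
sigma = 0.1                       # relative load uncertainty (assumed)
az, ai, ap = 0.3, 0.3, 0.4        # ZIP coefficients, sum to 1 (assumed)

def bus_voltage(P, Q, iters=20):
    """Fixed-point solve of the linearized voltage drop with ZIP load."""
    V = 1.0
    for _ in range(iters):
        zip_factor = az * V**2 + ai * V + ap   # applied to P and Q alike
        V = 1.0 - (R * P * zip_factor + X * Q * zip_factor)
    return V

# Deterministic run: constant load at its mean value.
V_det = bus_voltage(P0, Q0)

# Probabilistic run: reconstruct the voltage distribution from samples.
P = rng.normal(P0, sigma * P0, 10_000)
Q = rng.normal(Q0, sigma * Q0, 10_000)
V_mc = np.array([bus_voltage(p, q) for p, q in zip(P, Q)])

print(f"deterministic V = {V_det:.4f} pu")
print(f"Monte-Carlo V   = {V_mc.mean():.4f} +/- {V_mc.std():.4f} pu")
```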
NASA Technical Reports Server (NTRS)
Hathaway, Michael D.
1986-01-01
Measurements of the unsteady velocity field within the stator row of a transonic axial-flow fan were acquired using a laser anemometer. Measurements were obtained on axisymmetric surfaces located at 10 and 50 percent span from the shroud, with the fan operating at maximum efficiency at design speed. The ensemble-average and variance of the measured velocities are used to identify rotor-wake-generated (deterministic) unsteadiness and turbulence, respectively. Correlations of both deterministic and turbulent velocity fluctuations provide information on the characteristics of unsteady interactions within the stator row. These correlations are derived from the Navier-Stokes equation in a manner similar to deriving the Reynolds stress terms, whereby various averaging operators are used to average the aperiodic, deterministic, and turbulent velocity fluctuations which are known to be present in multistage turbomachines. The correlations of deterministic and turbulent velocity fluctuations throughout the axial fan stator row are presented. In particular, amplification and attenuation of both types of unsteadiness are shown to occur within the stator blade passage.
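The averaging operators can be illustrated directly: binning velocities by rotor position, the ensemble average over revolutions isolates the rotor-locked (deterministic) fluctuation, and the ensemble variance estimates the turbulence. A numpy sketch of this triple decomposition on synthetic data (the measured fan data are not reproduced):

```python
# Sketch: triple decomposition u = U + u_det + u_turb from
# laser-anemometer-style data binned by rotor phase. Synthetic
# data stand in for the measured velocities.
import numpy as np

rng = np.random.default_rng(0)
n_phase, n_rev = 50, 200                        # rotor positions x revolutions
phase = np.linspace(0.0, 2.0 * np.pi, n_phase, endpoint=False)

# Synthetic signal: mean flow + rotor-locked wake + random turbulence.
u = (100.0
     + 5.0 * np.sin(phase)[:, None]             # deterministic (wake-locked)
     + rng.normal(0.0, 2.0, (n_phase, n_rev)))  # turbulent

U = u.mean()                                    # time-mean velocity
u_ens = u.mean(axis=1)                          # ensemble average per phase
u_det = u_ens - U                               # deterministic fluctuation
u_turb_var = u.var(axis=1)                      # turbulence from ensemble variance

print("deterministic stress <u_det u_det>:", np.mean(u_det**2))  # ~ 12.5
print("turbulent stress    <u'u'>        :", u_turb_var.mean())  # ~ 4.0
```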
Precision production: enabling deterministic throughput for precision aspheres with MRF
NASA Astrophysics Data System (ADS)
Maloney, Chris; Entezarian, Navid; Dumas, Paul
2017-10-01
Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer being used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.
Down to the roughness scale assessment of piston-ring/liner contacts
NASA Astrophysics Data System (ADS)
Checo, H. M.; Jaramillo, A.; Ausas, R. F.; Jai, M.; Buscaglia, G. C.
2017-02-01
The effects of surface roughness in hydrodynamic bearings have been accounted for through several approaches, the most widely used being averaging or stochastic techniques. With these, the surface is not treated "as it is" but by means of an assumed probability distribution for the roughness. The so-called direct, deterministic, or measured-surface simulations solve the lubrication problem with realistic surfaces down to the roughness scale. This leads to expensive computational problems. Most researchers have tackled this problem considering non-moving surfaces and neglecting the ring dynamics to reduce the computational burden. What is proposed here is to solve the fully deterministic simulation both in space and in time, so that the actual movement of the surfaces and the ring dynamics are taken into account. This simulation is much more complex than previous ones, as it is intrinsically transient. The feasibility of these fully deterministic simulations is illustrated in two cases: simulation of liner surfaces with diverse finishes (honed and coated bores) under constant piston velocity and ring load, and also under real engine conditions.
Discrete-Time Deterministic $Q$-Learning: A Novel Convergence Analysis.
Wei, Qinglai; Lewis, Frank L; Sun, Qiuye; Yan, Pengfei; Song, Ruizhuo
2017-05-01
In this paper, a novel discrete-time deterministic Q-learning algorithm is developed. In each iteration of the developed Q-learning algorithm, the iterative Q function is updated for all of the state and control spaces, instead of for a single state and a single control as in the traditional Q-learning algorithm. A new convergence criterion is established to guarantee that the iterative Q function converges to the optimum, and the convergence criterion on the learning rates required by traditional Q-learning algorithms is simplified. During the convergence analysis, the upper and lower bounds of the iterative Q function are analyzed to obtain the convergence criterion, instead of analyzing the iterative Q function itself. For convenience of analysis, the convergence properties for the undiscounted case of the deterministic Q-learning algorithm are developed first. Then, considering the discount factor, the convergence criterion for the discounted case is established. Neural networks are used to approximate the iterative Q function and to compute the iterative control law, respectively, facilitating the implementation of the deterministic Q-learning algorithm. Finally, simulation results and comparisons are given to illustrate the performance of the developed algorithm.
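The full-sweep update that distinguishes this scheme from traditional Q-learning is easy to state in tabular form. The sketch below uses a toy deterministic chain problem; the paper itself approximates Q with neural networks, so the states, rewards, and discount factor here are illustrative assumptions:

```python
# Sketch: discrete-time deterministic Q-learning with a full sweep
# over the state and control spaces each iteration (tabular form).
import numpy as np

n_states, n_actions, gamma = 6, 2, 0.9

def step(s, a):
    """Deterministic system: action 0 moves left, action 1 moves right."""
    s_next = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
    reward = 1.0 if s_next == n_states - 1 else 0.0
    return s_next, reward

Q = np.zeros((n_states, n_actions))
for _ in range(500):
    Q_new = np.empty_like(Q)
    for s in range(n_states):          # update EVERY state ...
        for a in range(n_actions):     # ... and EVERY control
            s_next, r = step(s, a)
            Q_new[s, a] = r + gamma * Q[s_next].max()
    if np.abs(Q_new - Q).max() < 1e-8: # convergence of the iterative Q
        break
    Q = Q_new

print("greedy policy:", Q.argmax(axis=1))   # all 1s: always move right
```

With a known deterministic transition model, the sweep reduces to a value-iteration-like recursion on Q, which is what makes the simplified convergence criterion possible.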
Spatial scaling patterns and functional redundancies in a changing boreal lake landscape
Angeler, David G.; Allen, Craig R.; Uden, Daniel R.; Johnson, Richard K.
2015-01-01
Global transformations extend beyond local habitats; therefore, larger-scale approaches are needed to assess community-level responses and resilience to unfolding environmental changes. Using long-term data (1996–2011), we evaluated spatial patterns and functional redundancies in the littoral invertebrate communities of 85 Swedish lakes, with the objective of assessing their potential resilience to environmental change at regional scales (that is, spatial resilience). Multivariate spatial modeling was used to differentiate groups of invertebrate species exhibiting spatial patterns in composition and abundance (that is, deterministic species) from those lacking spatial patterns (that is, stochastic species). We then determined the functional feeding attributes of the deterministic and stochastic invertebrate species, to infer resilience. Between one and three distinct spatial patterns in invertebrate composition and abundance were identified in approximately one-third of the species; the remainder were stochastic. We observed substantial differences in metrics between deterministic and stochastic species. Functional richness and diversity decreased over time in the deterministic group, suggesting a loss of resilience in regional invertebrate communities. However, taxon richness and redundancy increased monotonically in the stochastic group, indicating the capacity of regional invertebrate communities to adapt to change. Our results suggest that a refined picture of spatial resilience emerges if patterns of both the deterministic and stochastic species are accounted for. Spatially extensive monitoring may help increase our mechanistic understanding of community-level responses and resilience to regional environmental change, insights that are critical for developing management and conservation agendas in this current period of rapid environmental transformation.
Chao, Lin; Rang, Camilla Ulla; Proenca, Audrey Menegaz; Chao, Jasper Ubirajara
2016-01-01
Non-genetic phenotypic variation is common in biological organisms. The variation is potentially beneficial if the environment is changing. If the benefit is large, selection can favor the evolution of genetic assimilation, the process by which the expression of a trait is transferred from environmental to genetic control. Genetic assimilation is an important evolutionary transition, but it is poorly understood because the fitness costs and benefits of variation are often unknown. Here we show that the partitioning of damage by a mother bacterium to its two daughters can evolve through genetic assimilation. Bacterial phenotypes are also highly variable. Because gene-regulating elements can have low copy numbers, the variation is attributed to stochastic sampling. Extant Escherichia coli partition asymmetrically and deterministically more damage to the old daughter, the one receiving the mother’s old pole. By modeling in silico damage partitioning in a population, we show that deterministic asymmetry is advantageous because it increases fitness variance and hence the efficiency of natural selection. However, we find that symmetrical but stochastic partitioning can be similarly beneficial. To examine why bacteria evolved deterministic asymmetry, we modeled the effect of damage anchored to the mother’s old pole. While anchored damage strengthens selection for asymmetry by creating additional fitness variance, it has the opposite effect on symmetry. The difference results because anchored damage reinforces the polarization of partitioning in asymmetric bacteria. In symmetric bacteria, it dilutes the polarization. Thus, stochasticity alone may have protected early bacteria from damage, but deterministic asymmetry has evolved to be equally important in extant bacteria. We estimate that 47% of damage partitioning is deterministic in E. coli. We suggest that the evolution of deterministic asymmetry from stochasticity offers an example of Waddington’s genetic assimilation. Our model is able to quantify the evolution of the assimilation because it characterizes the fitness consequences of variation. PMID:26761487
Impact of refining the assessment of dietary exposure to cadmium in the European adult population.
Ferrari, Pietro; Arcella, Davide; Heraud, Fanny; Cappé, Stefano; Fabiansson, Stefan
2013-01-01
Exposure assessment constitutes an important step in any risk assessment of potentially harmful substances present in food. The European Food Safety Authority (EFSA) first assessed dietary exposure to cadmium in Europe using a deterministic framework, resulting in mean values of exposure in the range of health-based guidance values. Since then, the characterisation of foods has been refined to better match occurrence and consumption data, and a new strategy to handle left-censoring in occurrence data was devised. A probabilistic assessment was performed and compared with deterministic estimates, using occurrence values at the European level and consumption data from 14 national dietary surveys. Mean estimates in the probabilistic assessment ranged from 1.38 (95% CI = 1.35-1.44) to 2.08 (1.99-2.23) µg kg⁻¹ bodyweight (bw) week⁻¹ across the different surveys, which were less than 10% lower than deterministic (middle bound) mean values that ranged from 1.50 to 2.20 µg kg⁻¹ bw week⁻¹. Probabilistic 95th percentile estimates of dietary exposure ranged from 2.65 (2.57-2.72) to 4.99 (4.62-5.38) µg kg⁻¹ bw week⁻¹, which were, with the exception of one survey, between 3% and 17% higher than middle-bound deterministic estimates. Overall, the proportion of subjects exceeding the tolerable weekly intake of 2.5 µg kg⁻¹ bw ranged from 14.8% (13.6-16.0%) to 31.2% (29.7-32.5%) according to the probabilistic assessment. The results of this work indicate that mean values of dietary exposure to cadmium in the European population were of similar magnitude using deterministic or probabilistic assessments. For higher exposure levels, probabilistic estimates were almost consistently larger than their deterministic counterparts, thus reflecting the impact of using the full distribution of occurrence values to determine exposure levels. It is considered prudent to use probabilistic methodology should exposure estimates be close to or exceed health-based guidance values.
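The deterministic and probabilistic pipelines differ only in whether point values or full distributions are pushed through the exposure equation (intake × occurrence / body weight). A sketch with illustrative distribution parameters, not EFSA's occurrence or consumption data:

```python
# Sketch: deterministic vs. probabilistic weekly dietary exposure.
# All distribution parameters are illustrative, not EFSA values.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
bw = 70.0                                     # body weight, kg (assumed)
twi = 2.5                                     # tolerable weekly intake, ug/kg bw

# Weekly food intake (kg) and cadmium occurrence (ug/kg food).
intake = rng.lognormal(mean=np.log(7.0), sigma=0.4, size=n)
occurrence = rng.lognormal(mean=np.log(15.0), sigma=0.5, size=n)

exposure = intake * occurrence / bw           # ug per kg bw per week

# Deterministic counterpart: point estimates in, one number out.
det = 7.0 * 15.0 / bw

print(f"deterministic estimate : {det:.2f}")
print(f"probabilistic mean/P95 : {exposure.mean():.2f} / "
      f"{np.percentile(exposure, 95):.2f}")
print(f"fraction above TWI     : {(exposure > twi).mean():.1%}")
```

As in the paper, the two approaches agree closely on the mean, while the upper percentiles of the sampled distribution exceed the deterministic point estimate.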
Martinez, Alexander S.; Faist, Akasha M.
2016-01-01
Background Understanding patterns of biodiversity is a longstanding challenge in ecology. Similar to other biotic groups, arthropod community structure can be shaped by deterministic and stochastic processes, with limited understanding of what moderates the relative influence of these processes. Disturbances have been noted to alter the relative influence of deterministic and stochastic processes on community assembly in various study systems, implicating ecological disturbances as a potential moderator of these forces. Methods Using a disturbance gradient along a 5-year chronosequence of insect-induced tree mortality in a subalpine forest of the southern Rocky Mountains, Colorado, USA, we examined changes in community structure and relative influences of deterministic and stochastic processes in the assembly of aboveground (surface and litter-active species) and belowground (species active in organic and mineral soil layers) arthropod communities. Arthropods were sampled for all years of the chronosequence via pitfall traps (aboveground community) and modified Winkler funnels (belowground community) and sorted to morphospecies. Community structure of both communities were assessed via comparisons of morphospecies abundance, diversity, and composition. Assembly processes were inferred from a mixture of linear models and matrix correlations testing for community associations with environmental properties, and from null-deviation models comparing observed vs. expected levels of species turnover (Beta diversity) among samples. Results Tree mortality altered community structure in both aboveground and belowground arthropod communities, but null models suggested that aboveground communities experienced greater relative influences of deterministic processes, while the relative influence of stochastic processes increased for belowground communities. Additionally, Mantel tests and linear regression models revealed significant associations between the aboveground arthropod communities and vegetation and soil properties, but no significant association among belowground arthropod communities and environmental factors. Discussion Our results suggest context-dependent influences of stochastic and deterministic community assembly processes across different fractions of a spatially co-occurring ground-dwelling arthropod community following disturbance. This variation in assembly may be linked to contrasting ecological strategies and dispersal rates within above- and below-ground communities. Our findings add to a growing body of evidence indicating concurrent influences of stochastic and deterministic processes in community assembly, and highlight the need to consider potential variation across different fractions of biotic communities when testing community ecology theory and considering conservation strategies. PMID:27761333
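Null-deviation models of the kind used here compare an observed beta-diversity statistic against a randomized null expectation. A minimal sketch, with Bray-Curtis dissimilarity and a simple within-species shuffle as the assumed null model:

```python
# Sketch: null-deviation test of community assembly. Observed mean
# pairwise Bray-Curtis dissimilarity is compared with a null built
# by shuffling abundances within species (illustrative null model).
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(7)
community = rng.poisson(3.0, size=(20, 40))    # samples x morphospecies

def mean_bray_curtis(m):
    return pdist(m, metric="braycurtis").mean()

obs = mean_bray_curtis(community)
null = np.array([
    mean_bray_curtis(rng.permuted(community, axis=0))  # shuffle per species
    for _ in range(999)
])

null_dev = (obs - null.mean()) / null.std()
# |null_dev| well above 0 suggests deterministic structuring; values
# near 0 are consistent with stochastic assembly (as with this
# random synthetic matrix).
print(f"beta-diversity null deviation: {null_dev:.2f}")
```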
Comparison of probabilistic and deterministic fiber tracking of cranial nerves.
Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H
2017-09-01
OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false-positive error of the last obtained depiction was also significantly lower in probabilistic than in deterministic tracking (p < 0.001). The HCP data yielded significantly better results in terms of the Dice coefficient in probabilistic tracking (p < 0.001, Mann-Whitney U-test) and in deterministic tracking (p = 0.02). The false-positive errors were smaller in HCP data in deterministic tracking (p < 0.001) and showed a strong trend toward significance in probabilistic tracking (p = 0.06). In the clinical cases, the probabilistic method visualized 7 of 10 attempted CNs accurately, compared with 3 correct depictions with deterministic tracking. CONCLUSIONS High angular resolution DTI scans are preferable for the DTI-based depiction of the cranial nerves. Probabilistic tracking with a gradual PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.
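The gradual threshold increase is the same loop in both methods; only the thresholded quantity differs (FA for deterministic, PICo for probabilistic tracking). A sketch on synthetic voxel scores, with the Dice coefficient and false-positive error computed against a toy ground-truth mask:

```python
# Sketch: gradual threshold increase against a ground-truth mask.
# Voxel scores play the role of FA (deterministic) or PICo
# (probabilistic) values; all data here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
truth = np.zeros((32, 32, 32), dtype=bool)
truth[10:22, 14:18, 14:18] = True             # toy "nerve"
score = rng.random(truth.shape) * 0.4
score[truth] += 0.5                           # nerve voxels score higher

def dice(a, b):
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def false_positive_error(mask, truth):
    return np.logical_and(mask, ~truth).sum() / max(mask.sum(), 1)

for thr in np.arange(0.1, 0.9, 0.1):          # gradual threshold increase
    mask = score > thr
    if not mask.any():
        break
    print(f"thr={thr:.1f}  Dice={dice(mask, truth):.2f}  "
          f"FP error={false_positive_error(mask, truth):.2f}")
```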
An Evidence-Based Objection to Retributive Justice.
Mammarella, Brian T M
Advancements in neuroscience and related fields are beginning to show, with increasing clarity, that certain human behaviors stem from uncontrolled, mechanistic causes. These discoveries beg the question: If a given behavior results from some combination of biological predispositions, neurological circumstances, and environmental influences, is that action unwilled and therefore absolved of all attributions of credit, blame, and responsibility? A number of scholars in law and neuroscience who answer "yes" have considered how the absence of free will should impact criminal law's willingness to justify punishments on the basis of retribution, with some arguing that criminal law ought to dispense with retributive justice because the concept of blameworthiness is out of touch with scientific reality. This Note posits a more practical reason for reform by reviewing available empirics on the way people perceive human agency. The research suggests that as the science of human agency becomes increasingly vivid and reductionistic, laypeople will become proportionally less willing to attribute blame, and these shifting societal intuitions will ultimately diminish criminal law's moral credibility. The practical effects of low moral credibility might include diminished compliance, cooperation, and acquiescence with criminal laws, as well as increased general deviance. Importantly, this Note observes that these effects will likely manifest even if people retain a belief in free will. Further, ontological reality plays no part in this Note's argument; whether we in fact have free will is irrelevant. This Note instead contributes to the discourse by highlighting the implications of oncoming shifts in lay conceptions of both particular behaviors and the natural world writ large.
The dramaturgical perspective in relation to self and culture.
Sullivan, Daniel; Landau, Mark J; Young, Isaac F; Stewart, Sheridan A
2014-11-01
Social scientists have studied human behavior from the dramaturgical perspective (DP), through which society is viewed as an elaborate play or game in which individuals enact different roles. The DP is more than a theoretical construct; members of individualist, secular societies occasionally adopt the DP with relation to their own lives. The current research examined the consequences of adopting the DP for evaluations of the self and conceptions of reality at large. Study 1 examined the attitudinal correlates of DP endorsement to test our claim that the DP is situated in an ideological context of individualism and secular modernism. Supporting our claim that the DP invalidates external information about the self's value, in Studies 2A and 2B individuals endorsed the DP to a greater extent after a self-esteem threat, and Studies 2C and 3 showed that exposure to the DP (but not a direct system threat) buffered self-esteem threats. Examining moderators of the DP's influence on self-esteem, Study 4 showed that taking the DP with regard to the ultimate value (vs. concrete experience) of a social role decreased self-esteem and investment in that role. Studies 5A and 5B examined the DP's consequences for perceived moral objectivism. Adopting the DP decreased moral objectivism and moralization of various behaviors but not when the intrinsic self was dispositionally or situationally salient. The latter finding suggests that although contemporary individuals can and occasionally do adopt a reflective stance toward their place within social reality, they nevertheless continue to believe in a true, core self that transcends that precarious drama.
Chen, Karen B; Ponto, Kevin; Tredinnick, Ross D; Radwin, Robert G
2015-06-01
This study was a proof of concept for virtual exertions, a novel method that involves the use of body tracking and electromyography for grasping and moving projections of objects in virtual reality (VR). The user views objects in his or her hands during rehearsed co-contractions of the same agonist-antagonist muscles normally used for the desired activities to suggest exerting forces. Unlike physical objects, virtual objects are images and lack mass. There is currently no practical physically demanding way to interact with virtual objects to simulate strenuous activities. Eleven participants grasped and lifted similar physical and virtual objects of various weights in an immersive 3-D Cave Automatic Virtual Environment. Muscle activity, localized muscle fatigue, ratings of perceived exertions, and NASA Task Load Index were measured. Additionally, the relationship between levels of immersion (2-D vs. 3-D) was studied. Although the overall magnitude of biceps activity and workload were greater in VR, muscle activity trends and fatigue patterns for varying weights within VR and physical conditions were the same. Perceived exertions for varying weights were not significantly different between VR and physical conditions. Perceived exertion levels and muscle activity patterns corresponded to the assigned virtual loads, which supported the hypothesis that the method evoked the perception of physical exertions and showed that the method was promising. Ultimately this approach may offer opportunities for research and training individuals to perform strenuous activities under potentially safer conditions that mimic situations while seeing their own body and hands relative to the scene. © 2014, Human Factors and Ergonomics Society.
Tambone, V; Alessi, A; Macchi, I; Milighetti, S; Muzii, L
2009-01-01
The main difference between a virtual reality and a generic representation is being directly involved in the action you are performing. As a matter of fact, in the shift from the real to the virtual world, our biological physique does not mutate but is amplified and connected to the virtual world by technological interfaces. Training using a virtual reality simulator is an option to supplement (or replace) standard training. The first of the two main goals of our study is to test how familiar students enrolled in the Faculty of Medicine at "University Campus Bio-Medico of Rome" are with synthetic worlds, how long they have been using them, and what they would like their Avatar to look like. The second aim is to collect students' opinions about the use of virtual, interactive environments to enable learning and participation in dynamic, problem-based, clinical, virtual simulations. Simulations might be used to allow learners to make mistakes safely in lieu of real-life situations, learn from those mistakes, and ultimately improve performance by subsequent avoidance of those mistakes. The selected approach is based on a semi-structured questionnaire of 14 questions administered to all the medical students. Most of the students appear not to be very confident with virtual worlds, mostly because of a lack of interest. However, a large majority of them are likely to use a virtual world for fun or for escaping from reality. Students would select and customize their Avatar by giving her/him the same sexual identity, same figure, and same social class, but a different employment. It is important to notice that a wide majority of the students are interested in practicing in a virtual world in order to manage new experiences and be able to face them; they wish to benefit from the ability to make mistakes in a safe environment, as well as to see a positive impact on their understanding.
Fantasy and Reality in the History of Weather and Climate Control
NASA Astrophysics Data System (ADS)
Fleming, J. R.
2005-12-01
This presentation examines the history of large-scale weather and climate engineering since 1840, with special reference to imaginative and speculative literature and with special relevance to ethical and policy issues. Ultimate control of the weather and climate embodies both our wildest fantasies and our greatest fears. Fantasy often informs reality (and vice-versa). NASA managers know this well, as do Trekkies. The best science fiction authors typically build from the current state of a field to construct futuristic scenarios that reveal and explore the human condition. Scientists as well often venture into flights of fancy. Though not widely documented, the fantasy-reality axis is also a prominent aspect of the history of the geosciences. James Espy's proposal in the 1840s to enhance precipitation by lighting huge fires, thus stimulating convective updrafts, preceded the widespread charlatanism of the rain-makers, or so-called "pluviculturalists," in the western U.S. One hundred years later, promising discoveries in "cloud seeding" by Irving Langmuir and his associates at the General Electric Corporation rapidly devolved into unsupportable proposals and questionable practices by military and commercial rain-makers seeking to control the weather. During the Cold War, Soviet engineers also promoted a chilling vision (to Westerners) of global climate control. Recently, rather immodest proposals to "fix" a climate system perceived to be out of control have received wide circulation. In 2003 the U.S. Pentagon released a report recommending that the government should "explore geo-engineering options that control the climate." In 2004 a symposium in Cambridge, England set out to "identify, debate, and evaluate" possible, but highly controversial options for the design and construction of engineering projects for the management and mitigation of global climate change. This talk will locate the history of weather and climate modification within a long tradition of imaginative and speculative literature involving "control" of nature. The goal is the articulation of a perspective fully informed by history and the initiation of a dialogue, stimulated by this approach, that uncovers otherwise hidden values, ethical implications, social tensions, and public apprehensions.
Refinements to the Graves and Pitarka (2010) Broadband Ground Motion Simulation Method
Graves, Robert; Pitarka, Arben
2015-01-01
This brief article describes refinements to the Graves and Pitarka (2010) broadband ground motion simulation methodology (GP2010 hereafter) that have been implemented in version 14.3 of the SCEC Broadband Platform (BBP). The updated version of our method on the current SCEC BBP is referred to as GP14.3. Our simulation technique is a hybrid approach that combines low-frequency and high-frequency motions computed with different methods into a single broadband response. The separate low- and high-frequency components have traditionally been called “deterministic” and “stochastic”, respectively; however, this nomenclature is an oversimplification. In reality, the low-frequency approach includes many stochastic elements, and likewise, the high-frequency approach includes many deterministic elements (e.g., Pulido and Kubo, 2004; Hartzell et al., 2005; Liu et al., 2006; Frankel, 2009; Graves and Pitarka, 2010; Mai et al., 2010). While the traditional terminology will likely remain in use by the broader modeling community, in this paper we will refer to these using the generic terminology “low-frequency” and “high-frequency” approaches. Furthermore, one of the primary goals in refining our methodology is to provide a smoother and more consistent transition between the low- and high-frequency calculations, with the ultimate objective being the development of a single unified modeling approach that can be applied over a broad frequency band. GP2010 was validated by modeling recorded strong motions from four California earthquakes. While the method performed well overall, several issues were identified, including the tendency to over-predict the level of longer period (2-5 sec) motions and the effects of rupture directivity. The refinements incorporated in GP14.3 are aimed at addressing these issues with application to the simulation of earthquakes in the Western US (WUS). These refinements include the addition of a deep weak zone (details in following section) to the rupture characterization and allowing perturbations in the correlation of rise time and rupture speed with the specified slip distribution. Additionally, we have extended the parameterization of GP14.3 so that it is also applicable for simulating Eastern North America (ENA) earthquakes. This work has been guided by the comprehensive set of validation studies described in Goulet and Abrahamson (2014) and Dreger et al. (2014). The GP14.3 method shows improved performance relative to GP2010, and we direct the interested reader to Dreger et al. (2014) for a detailed assessment of the current methodology. In this paper, we concentrate on describing the modifications in more detail, and also discussing additional refinements that are currently being developed.
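The hybrid merge itself is conceptually simple: complementary low- and high-pass filters applied at a crossover frequency, then summed. The sketch below is a generic illustration with a 1 Hz crossover and stand-in traces, not the GP14.3 code:

```python
# Sketch: merging low- and high-frequency synthetics into one
# broadband trace with complementary Butterworth filters at a
# 1 Hz crossover (generic hybrid scheme).
import numpy as np
from scipy.signal import butter, filtfilt

dt, f_cross = 0.01, 1.0                       # 100 Hz samples, 1 Hz crossover
t = np.arange(0, 40.0, dt)

# Stand-ins for the two component calculations.
low_freq = np.sin(2 * np.pi * 0.3 * t) * np.exp(-0.05 * t)
high_freq = np.random.default_rng(0).normal(0, 0.2, t.size)

b_lo, a_lo = butter(4, f_cross, btype="low", fs=1.0 / dt)
b_hi, a_hi = butter(4, f_cross, btype="high", fs=1.0 / dt)

broadband = filtfilt(b_lo, a_lo, low_freq) + filtfilt(b_hi, a_hi, high_freq)
print("broadband samples:", broadband[:5])
```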
Achieving Presence through Evoked Reality
Pillai, Jayesh S.; Schmidt, Colin; Richir, Simon
2013-01-01
The sense of “Presence” (evolving from “telepresence”) has always been associated with virtual reality research and is still an exceptionally mystifying constituent. Now the study of presence clearly spans over various disciplines associated with cognition. This paper attempts to put forth a concept that argues that it’s an experience of an “Evoked Reality (ER)” (illusion of reality) that triggers an “Evoked Presence (EP)” (sense of presence) in our minds. A Three Pole Reality Model is proposed to explain this phenomenon. The poles range from Dream Reality to Simulated Reality with Primary (Physical) Reality at the center. To demonstrate the relationship between ER and EP, a Reality-Presence Map is developed. We believe that this concept of ER and the proposed model may have significant applications in the study of presence, and in exploring the possibilities of not just virtual reality but also what we call “reality.” PMID:23550234
Mixed Reality with HoloLens: Where Virtual Reality Meets Augmented Reality in the Operating Room.
Tepper, Oren M; Rudy, Hayeem L; Lefkowitz, Aaron; Weimer, Katie A; Marks, Shelby M; Stern, Carrie S; Garfein, Evan S
2017-11-01
Virtual reality and augmented reality devices have recently been described in the surgical literature. The authors have previously explored various iterations of these devices, and although they show promise, it has become clear that virtual reality and/or augmented reality devices alone do not adequately meet the demands of surgeons. The solution may lie in a hybrid technology known as mixed reality, which merges many virtual reality and augmented realty features. Microsoft's HoloLens, the first commercially available mixed reality device, provides surgeons intraoperative hands-free access to complex data, the real environment, and bidirectional communication. This report describes the use of HoloLens in the operating room to improve decision-making and surgical workflow. The pace of mixed reality-related technological development will undoubtedly be rapid in the coming years, and plastic surgeons are ideally suited to both lead and benefit from this advance.
Stochastic simulation of reaction-diffusion systems: A fluctuating-hydrodynamics approach
NASA Astrophysics Data System (ADS)
Kim, Changho; Nonaka, Andy; Bell, John B.; Garcia, Alejandro L.; Donev, Aleksandar
2017-03-01
We develop numerical methods for stochastic reaction-diffusion systems based on approaches used for fluctuating hydrodynamics (FHD). For hydrodynamic systems, the FHD formulation is formally described by stochastic partial differential equations (SPDEs). In the reaction-diffusion systems we consider, our model becomes similar to the reaction-diffusion master equation (RDME) description when our SPDEs are spatially discretized and reactions are modeled as a source term having Poisson fluctuations. However, unlike the RDME, which becomes prohibitively expensive for an increasing number of molecules, our FHD-based description naturally extends from the regime where fluctuations are strong, i.e., each mesoscopic cell has few (reactive) molecules, to regimes with moderate or weak fluctuations, and ultimately to the deterministic limit. By treating diffusion implicitly, we avoid the severe restriction on time step size that limits all methods based on explicit treatments of diffusion and construct numerical methods that are more efficient than RDME methods, without compromising accuracy. Guided by an analysis of the accuracy of the distribution of steady-state fluctuations for the linearized reaction-diffusion model, we construct several two-stage (predictor-corrector) schemes, where diffusion is treated using a stochastic Crank-Nicolson method, and reactions are handled by the stochastic simulation algorithm of Gillespie or a weakly second-order tau leaping method. We find that an implicit midpoint tau leaping scheme attains second-order weak accuracy in the linearized setting and gives an accurate and stable structure factor for a time step size of an order of magnitude larger than the hopping time scale of diffusing molecules. We study the numerical accuracy of our methods for the Schlögl reaction-diffusion model both in and out of thermodynamic equilibrium. We demonstrate and quantify the importance of thermodynamic fluctuations to the formation of a two-dimensional Turing-like pattern and examine the effect of fluctuations on three-dimensional chemical front propagation. By comparing stochastic simulations to deterministic reaction-diffusion simulations, we show that fluctuations accelerate pattern formation in spatially homogeneous systems and lead to a qualitatively different disordered pattern behind a traveling wave.
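The simplest member of this family of methods is explicit tau leaping for the well-mixed Schlögl model (no diffusion, no implicit treatment); the paper's schemes are spatially resolved and weakly second order. A sketch with a commonly used illustrative parameter set:

```python
# Sketch: explicit tau leaping for the well-mixed Schlogl model.
# Parameter values are a common illustrative set, not the paper's.
import numpy as np

rng = np.random.default_rng(5)
c1, c2, c3, c4 = 3e-7, 1e-4, 1e-3, 3.5
N1, N2 = 1e5, 2e5                        # abundant species held fixed

def propensities(x):
    return np.array([
        c1 * N1 * x * (x - 1) / 2.0,       # B1 + 2X -> 3X
        c2 * x * (x - 1) * (x - 2) / 6.0,  # 3X -> B1 + 2X
        c3 * N2,                           # B2 -> X
        c4 * x,                            # X -> B2
    ])

nu = np.array([+1, -1, +1, -1])          # stoichiometric changes in X

x, t, tau = 250, 0.0, 0.01
while t < 10.0:
    a = propensities(x)
    k = rng.poisson(a * tau)             # Poisson reaction counts per leap
    x = max(x + int(nu @ k), 0)
    t += tau

print("X(t=10) =", x)                    # settles near one bistable branch
```

The implicit midpoint variant described in the paper replaces the explicit Poisson step with an implicit treatment of the drift, which is what permits the much larger stable time steps.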
Killeen, Gerry F; Govella, Nicodem J; Lwetoijera, Dickson W; Okumu, Fredros O
2016-04-19
Anopheles arabiensis is stereotypical of diverse vectors that mediate residual malaria transmission globally, because it can feed outdoors upon humans or cattle, or enter but then rapidly exit houses without fatal exposure to insecticidal nets or sprays. Life histories of a well-characterized An. arabiensis population were simulated with a simple but process-explicit deterministic model and relevance to other vectors examined through sensitivity analysis. Where most humans use bed nets, two thirds of An. arabiensis blood feeds and half of malaria transmission events were estimated to occur outdoors. However, it was also estimated that most successful feeds and almost all (>98 %) transmission events are preceded by unsuccessful attempts to attack humans indoors. The estimated proportion of vector blood meals ultimately obtained from humans indoors is dramatically attenuated by availability of alternative hosts, or partial ability to attack humans outdoors. However, the estimated proportion of mosquitoes old enough to transmit malaria, and which have previously entered a house at least once, is far less sensitive to both variables. For vectors with similarly modest preference for cattle over humans and similar ability to evade fatal indoor insecticide exposure once indoors, >80 % of predicted feeding events by mosquitoes old enough to transmit malaria are preceded by at least one house entry event, so long as ≥40 % of attempts to attack humans occur indoors and humans outnumber cattle ≥4-fold. While the exact numerical results predicted by such a simple deterministic model should be considered only approximate and illustrative, the derived conclusions are remarkably insensitive to substantive deviations from the input parameter values measured for this particular An. arabiensis population. This life-history analysis, therefore, identifies a clear, broadly-important opportunity for more effective suppression of residual malaria transmission by An. arabiensis in Africa and other important vectors of residual transmission across the tropics. Improved control of predominantly outdoor residual transmission by An. arabiensis, and other modestly zoophagic vectors like Anopheles darlingi, which frequently enter but then rapidly exit from houses, may be readily achieved by improving existing technology for killing mosquitoes indoors.
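The headline statistic, that almost all transmission events are preceded by at least one house entry, can be reproduced qualitatively with a toy attempt-by-attempt simulation. All probabilities below are illustrative assumptions, not the fitted An. arabiensis parameters:

```python
# Sketch: toy simulation of repeated host-attack attempts, tracking
# how often a successful feed was preceded by at least one house
# entry. Probabilities are illustrative, not fitted values.
import numpy as np

rng = np.random.default_rng(11)
p_indoor = 0.7        # attempt is on a human indoors (assumed)
p_succeed_in = 0.05   # indoor success; nets block most attempts (assumed)
p_succeed_out = 0.5   # outdoor success (assumed)
n = 100_000

fed_after_entry = fed_total = 0
for _ in range(n):
    entered = False
    for _attempt in range(50):               # cap the attempt count
        indoors = rng.random() < p_indoor
        entered = entered or indoors
        p_success = p_succeed_in if indoors else p_succeed_out
        if rng.random() < p_success:
            fed_total += 1
            fed_after_entry += entered
            break

print(f"feeds preceded by a house entry: {fed_after_entry / fed_total:.1%}")
```

With mostly-indoor attempts and low indoor success, most feeds end up preceded by a house entry, which is the opportunity for indoor control measures that the paper identifies.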
Investigation of HZETRN 2010 as a Tool for Single Event Effect Qualification of Avionics Systems
NASA Technical Reports Server (NTRS)
Rojdev, Kristina; Koontz, Steve; Atwell, William; Boeder, Paul
2014-01-01
NASA's future missions focus on long-duration deep space missions for human exploration, which offer no option for a quick emergency return to Earth. The combination of long mission duration with no quick emergency return option leads to unprecedented spacecraft system safety and reliability requirements. It is important that spacecraft avionics systems for human deep space missions are not susceptible to Single Event Effect (SEE) failures caused by space radiation (primarily the continuous galactic cosmic ray background and the occasional solar particle event) interactions with electronic components and systems. SEE effects are typically managed during the design, development, and test (DD&T) phase of spacecraft development by using heritage hardware (if possible) and through extensive component-level testing, followed by system-level failure analysis tasks that are both time consuming and costly. The ultimate product of the SEE DD&T program is a prediction of spacecraft avionics reliability in the flight environment, produced using various nuclear reaction and transport codes in combination with the component and subsystem level radiation test data. Previous work by Koontz, et al. [1] utilized FLUKA, a Monte Carlo nuclear reaction and transport code, to calculate SEE and single event upset (SEU) rates. This code was then validated against in-flight data for a variety of spacecraft and space flight environments. However, FLUKA has a long run-time (on the order of days). CREME96 [2], an easy-to-use deterministic code offering short run times, was also compared with FLUKA predictions and in-flight data. CREME96, though fast and easy to use, has not been updated in several years and underestimates secondary particle shower effects in spacecraft structural shielding mass. Thus, this paper will investigate the use of HZETRN 2010 [3], a fast and easy-to-use deterministic transport code, similar to CREME96, that was developed at NASA Langley Research Center primarily for flight crew ionizing radiation dose assessments. HZETRN 2010 includes updates to address secondary particle shower effects more accurately, and might be used as another tool to verify spacecraft avionics system reliability in space flight SEE environments.
Aspen succession in the Intermountain West: A deterministic model
Dale L. Bartos; Frederick R. Ward; George S. Innis
1983-01-01
A deterministic model of succession in aspen forests was developed using existing data and intuition. The degree of uncertainty, which was determined by allowing the parameter values to vary at random within limits, was larger than desired. This report presents results of an analysis of model sensitivity to changes in parameter values. These results have indicated...
Using stochastic models to incorporate spatial and temporal variability [Exercise 14]
Carolyn Hull Sieg; Rudy M. King; Fred Van Dyke
2003-01-01
To this point, our analysis of population processes and viability in the western prairie fringed orchid has used only deterministic models. In this exercise, we conduct a similar analysis, using a stochastic model instead. This distinction is of great importance to population biology in general and to conservation biology in particular. In deterministic models,...
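To make the distinction drawn in this exercise concrete, the toy sketch below contrasts a deterministic projection with a stochastic one for the same mean growth rate; the numbers and quasi-extinction threshold are illustrative assumptions, not the orchid data.

```python
import numpy as np

rng = np.random.default_rng(1)

lam, sigma = 1.02, 0.15          # mean growth rate and environmental SD (assumed)
n0, years, reps = 50, 50, 10_000
quasi_ext = 10                   # assumed quasi-extinction threshold

det = n0 * lam ** np.arange(years + 1)          # deterministic projection

# Stochastic projection: lognormal annual growth rates with mean lam.
log_lam = rng.normal(np.log(lam) - 0.5 * sigma**2, sigma, size=(reps, years))
traj = n0 * np.exp(np.cumsum(log_lam, axis=1))
p_ext = np.mean((traj < quasi_ext).any(axis=1))

print(f"deterministic final N: {det[-1]:.0f}")
print(f"stochastic quasi-extinction probability: {p_ext:.3f}")
```

The deterministic model never dips below n0 when lam > 1, while the stochastic model assigns a measurable extinction risk to the very same mean dynamics, which is the exercise's core point.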
Taking Control: Stealth Assessment of Deterministic Behaviors within a Game-Based System
ERIC Educational Resources Information Center
Snow, Erica L.; Likens, Aaron D.; Allen, Laura K.; McNamara, Danielle S.
2016-01-01
Game-based environments frequently afford students the opportunity to exert agency over their learning paths by making various choices within the environment. The combination of log data from these systems and dynamic methodologies may serve as a stealth means to assess how students behave (i.e., deterministic or random) within these learning…
Guidelines 13 and 14—Prediction uncertainty
Hill, Mary C.; Tiedeman, Claire
2005-01-01
An advantage of using optimization for model development and calibration is that optimization provides methods for evaluating and quantifying prediction uncertainty. Both deterministic and statistical methods can be used. Guideline 13 discusses using regression and post-audits, which we classify as deterministic methods. Guideline 14 discusses inferential statistics and Monte Carlo methods, which we classify as statistical methods.
Deterministic switching of hierarchy during wrinkling in quasi-planar bilayers
Saha, Sourabh K.; Culpepper, Martin L.
2016-04-25
Emergence of hierarchy during compression of quasi-planar bilayers is preceded by a mode-locked state during which the quasi-planar form persists. Transition to hierarchy is determined entirely by geometrically observable parameters. This results in a universal transition phase diagram that enables one to deterministically tune hierarchy even with limited knowledge about material properties.
Stochastic and deterministic models for agricultural production networks.
Bai, P; Banks, H T; Dediu, S; Govan, A Y; Last, M; Lloyd, A L; Nguyen, H K; Olufsen, M S; Rempala, G; Slenning, B D
2007-07-01
An approach to modeling the impact of disturbances in an agricultural production network is presented. A stochastic model and its approximate deterministic model for averages over sample paths of the stochastic system are developed. Simulations, sensitivity and generalized sensitivity analyses are given. Finally, it is shown how diseases may be introduced into the network and corresponding simulations are discussed.
Probabilistic direct counterfactual quantum communication
NASA Astrophysics Data System (ADS)
Zhang, Sheng
2017-02-01
It is striking that the quantum Zeno effect can be used to launch direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. First, the transmission takes much longer than a classical transmission would. Second, the chained-cycle structure makes these protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol could evolve into a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).
Li, Longxiang; Xue, Donglin; Deng, Weijie; Wang, Xu; Bai, Yang; Zhang, Feng; Zhang, Xuejun
2017-11-10
In deterministic computer-controlled optical surfacing, accurate dwell time execution by computer numeric control machines is crucial in guaranteeing a high-convergence ratio for the optical surface error. It is necessary to consider the machine dynamics limitations in the numerical dwell time algorithms. In this paper, these constraints on dwell time distribution are analyzed, and a model of the equal extra material removal is established. A positive dwell time algorithm with minimum equal extra material removal is developed. Results of simulations based on deterministic magnetorheological finishing demonstrate the necessity of considering machine dynamics performance and illustrate the validity of the proposed algorithm. Indeed, the algorithm effectively facilitates the determinacy of sub-aperture optical surfacing processes.
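A minimal sketch of the positivity-constrained dwell-time idea, assuming a 1-D removal profile and a Gaussian tool influence function (both invented here); the paper's machine-dynamics and equal-extra-removal constraints are not modeled, and non-negative least squares stands in for the positive dwell time algorithm.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import nnls

# 1-D toy problem: removal = influence (*) dwell, i.e. A @ t = r with t >= 0.
n = 200
x = np.linspace(-1.0, 1.0, n)
r = 0.9 + 0.4 * np.sin(3 * x)                          # target removal, strictly positive
dx = x[1] - x[0]
kern = np.exp(-0.5 * (np.arange(n) * dx / 0.05) ** 2)  # Gaussian influence function
A = toeplitz(kern)                                     # symmetric: A[i, j] = kern[|i-j|]

t, resid = nnls(A, r)                                  # positivity-constrained dwell times
print(f"max dwell {t.max():.3f}, fit residual {resid:.2e}")
```

Unconstrained deconvolution can demand negative dwell times, which no machine can execute; the non-negativity constraint is what forces the extra material removal that the paper then seeks to minimize and equalize.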
Kucza, Witold
2013-07-25
Stochastic and deterministic simulations of dispersion in cylindrical channels in Poiseuille flow have been presented. The random walk (stochastic) and the uniform dispersion (deterministic) models have been used for computations of flow injection analysis responses. These methods, coupled with the genetic algorithm and the Levenberg-Marquardt optimization method, respectively, have been applied for determination of diffusion coefficients. The diffusion coefficients of fluorescein sodium, potassium hexacyanoferrate, and potassium dichromate have been determined by means of the presented methods and FIA responses available in the literature. The best-fit results agree with each other and with experimental data, thus validating both presented approaches.
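A minimal random-walk (stochastic) sketch of dispersion in Poiseuille flow, checked against the classical Taylor-Aris prediction; the geometry and coefficients are invented, and the FIA response modeling and fitting steps of the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

R, U, D = 1.0, 1.0, 0.1          # tube radius, mean velocity, diffusivity (assumed)
n, dt, steps = 5_000, 2e-3, 25_000

theta = 2 * np.pi * rng.random(n)
rad = R * np.sqrt(rng.random(n))                    # uniform start over the cross-section
xy = np.column_stack([rad * np.cos(theta), rad * np.sin(theta)])
z = np.zeros(n)

for _ in range(steps):
    rad = np.hypot(xy[:, 0], xy[:, 1])
    z += 2 * U * (1 - (rad / R) ** 2) * dt          # parabolic axial advection
    z += np.sqrt(2 * D * dt) * rng.standard_normal(n)
    xy += np.sqrt(2 * D * dt) * rng.standard_normal((n, 2))
    out_r = np.hypot(xy[:, 0], xy[:, 1])
    out = out_r > R                                 # reflect walkers off the wall
    xy[out] *= ((2 * R - out_r[out]) / out_r[out])[:, None]

t = steps * dt                                      # t >> R^2/D: Taylor regime
print(f"simulated K: {z.var() / (2 * t):.3f}")
print(f"Taylor-Aris K = D + U^2 R^2 / (48 D): {D + U**2 * R**2 / (48 * D):.3f}")
```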
Di Maio, Francesco; Zio, Enrico; Smith, Curtis; ...
2015-07-06
The present special issue contains an overview of the research in the field of Integrated Deterministic and Probabilistic Safety Assessment (IDPSA) of Nuclear Power Plants (NPPs). Traditionally, safety regulation for NPPs design and operation has been based on Deterministic Safety Assessment (DSA) methods to verify criteria that assure plant safety in a number of postulated Design Basis Accident (DBA) scenarios. Referring to such criteria, it is also possible to identify those plant Structures, Systems, and Components (SSCs) and activities that are most important for safety within those postulated scenarios. Then, the design, operation, and maintenance of these “safety-related” SSCs and activities are controlled through regulatory requirements and supported by Probabilistic Safety Assessment (PSA).
Multiple Realities and Hybrid Objects: A Creative Approach of Schizophrenic Delusion
Cermolacce, Michel; Despax, Katherine; Richieri, Raphaëlle; Naudin, Jean
2018-01-01
Delusion is usually considered in DSM 5 as a false belief based on incorrect inference about external reality, but the issue of delusion raises crucial concerns, especially that of a possible (or absent) continuity between delusional and normal experiences, and the understanding of delusional experience. In the present study, we first aim to consider delusion from a perspectivist angle, according to the Multiple Reality Theory (MRT). In this model inherited from Alfred Schütz and recently addressed by Gallagher, we are not confronting one reality only, but several (such as the reality of everyday life, of imaginary life, of work, of delusion, etc.). In other terms, the MRT states that our own experience is not drawing its meaning from one reality identified as the outer reality but rather from a multiplicity of realities, each with their own logic and style. Two clinical cases illustrate how the Multiple Realities Theory (MRT) may help address the reality of delusion. Everyday reality and the reality of delusion may be articulated under a few conditions, such as compossibility [i.e., Double Book-Keeping (DBK), in Bleulerian terms] or flexibility. There are indeed possible bridges between them. Possible links with neuroscience or psychoanalysis are evoked. As the subject is confronting different realities, so do the objects among and toward which a subject is evolving. We call such objects Hybrid Objects (HO) due to their multiple belonging. They can operate as shifters, i.e., as some functional operators letting one switch from one reality to another. In the final section, we will emphasize how delusion flexibility, as a dynamic interaction between Multiple Realities, may offer psychotherapeutic possibilities within some reality shared with others, entailing relocation of the present subjects in regained access to some flexibility via Multiple Realities and perspectivism. PMID:29487553
ERIC Educational Resources Information Center
Taçgin, Zeynep; Arslan, Ahmet
2017-01-01
The purpose of this study is to determine perception of postgraduate Computer Education and Instructional Technologies (CEIT) students regarding the concepts of Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), Augmented Virtuality (AV) and Mirror Reality; and to offer a table that includes differences and similarities between…
ERIC Educational Resources Information Center
Auld, Lawrence W. S.; Pantelidis, Veronica S.
1994-01-01
Describes the Virtual Reality and Education Lab (VREL) established at East Carolina University to study the implications of virtual reality for elementary and secondary education. Highlights include virtual reality software evaluation; hardware evaluation; computer-based curriculum objectives which could use virtual reality; and keeping current…
Hybrid Reality Lab Capabilities - Video 2
NASA Technical Reports Server (NTRS)
Delgado, Francisco J.; Noyes, Matthew
2016-01-01
Our Hybrid Reality and Advanced Operations Lab is developing incredibly realistic and immersive systems that could be used to provide training, support engineering analysis, and augment data collection for various human performance metrics at NASA. To get a better understanding of what Hybrid Reality is, let's go through the two most commonly known types of immersive realities: Virtual Reality and Augmented Reality. Virtual Reality creates immersive scenes that are completely made up of digital information. This technology has been used to train astronauts at NASA, during teleoperation of remote assets (arms, rovers, robots, etc.), and in other activities. One challenge with Virtual Reality is that if you are using it for real-time applications (like landing an airplane), the information used to create the virtual scenes can be old (i.e., visualized long after physical objects moved in the scene) and not accurate enough to land the airplane safely. This is where Augmented Reality comes in. Augmented Reality takes real-time environment information (from a camera or a see-through window) and places digitally created information into the scene so that it matches the live view. Augmented Reality enhances real environment information collected with a live sensor or viewport (e.g., camera, window, etc.) with the information-rich visualization provided by Virtual Reality. Hybrid Reality takes Augmented Reality even further, by creating a higher level of immersion where interactivity can take place. Hybrid Reality takes Virtual Reality objects and a trackable, physical representation of those objects, places them in the same coordinate system, and allows people to interact with both objects' representations (virtual and physical) simultaneously. After a short period of adjustment, individuals begin to interact with all the objects in the scene as if they were real-life objects. The ability to physically touch and interact with digitally created objects that have the same shape, size, and location as their physical counterparts in the virtual reality environment can be a game changer when it comes to training, planning, engineering analysis, science, entertainment, etc. Our project is developing such capabilities for various types of environments. The video outlined with this abstract is a representation of an ISS Hybrid Reality experience. In the video you can see various Hybrid Reality elements that provide immersion beyond just standard Virtual Reality or Augmented Reality.
Finney, Charles E.; Kaul, Brian C.; Daw, C. Stuart; ...
2015-02-18
Here we review developments in the understanding of cycle-to-cycle variability in internal combustion engines, with a focus on spark-ignited and premixed combustion conditions. Much of the research on cyclic variability has focused on stochastic aspects, that is, features that can be modeled as inherently random with no short-term predictability. In some cases, models of this type appear to work very well at describing experimental observations, but the lack of predictability limits control options. Also, even when the statistical properties of the stochastic variations are known, it can be very difficult to discern their underlying physical causes and thus mitigate them. Some recent studies have demonstrated that under some conditions, cyclic combustion variations can have a relatively high degree of low-dimensional deterministic structure, which implies some degree of predictability and potential for real-time control. These deterministic effects are typically more pronounced near critical stability limits (e.g., near tipping points associated with ignition or flame propagation), such as during highly dilute fueling or near the onset of homogeneous charge compression ignition. We review recent progress in experimental and analytical characterization of cyclic variability where low-dimensional, deterministic effects have been observed. We describe some theories about the sources of these dynamical features and discuss prospects for interactive control and improved engine designs. In conclusion, taken as a whole, the research summarized here implies that the deterministic component of cyclic variability will become a pivotal issue (and potential opportunity) as engine manufacturers strive to meet aggressive emissions and fuel economy regulations in the coming decades.
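To illustrate the kind of analysis that separates low-dimensional deterministic structure from randomness, the sketch below applies a nearest-neighbour prediction test on a delay embedding to a noisy chaotic map standing in for cycle-resolved combustion metrics; a clear gap between the data and shuffled surrogates suggests determinism. The test signal and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(12)

def nn_error(x, dim=2):
    """One-step nearest-neighbour prediction error on a delay embedding."""
    n = len(x)
    emb = np.column_stack([x[i: n - dim + i] for i in range(dim)])
    y = x[dim:]
    err = np.empty(len(y))
    for i in range(len(y)):
        d = np.abs(emb - emb[i]).sum(axis=1)
        d[i] = np.inf                        # exclude the point itself
        err[i] = (y[np.argmin(d)] - y[i]) ** 2
    return err.mean()

# Noisy logistic map as a stand-in for cycle-resolved combustion metrics.
x = np.empty(2000)
x[0] = 0.3
for i in range(len(x) - 1):
    x[i + 1] = np.clip(3.9 * x[i] * (1 - x[i]) + 0.01 * rng.standard_normal(), 0.0, 1.0)

e_data = nn_error(x)
e_surr = np.mean([nn_error(rng.permutation(x)) for _ in range(5)])
print(f"NN error on data: {e_data:.4f}; on shuffled surrogates: {e_surr:.4f}")
```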
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independently of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages--which provide a larger spatiotemporal scale relative to within-stage analyses--revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended--and experimentally testable--conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
Efficient Algorithms for Handling Nondeterministic Automata
NASA Astrophysics Data System (ADS)
Vojnar, Tomáš
Finite (word, tree, or omega) automata play an important role in different areas of computer science, including, for instance, formal verification. Often, deterministic automata are used, for which traditional algorithms for important operations such as minimisation and inclusion checking are available. However, the use of deterministic automata implies a need to determinise nondeterministic automata that often arise during various computations, even when the computations start with deterministic automata. Unfortunately, determinisation is a very expensive step, since deterministic automata may be exponentially bigger than the original nondeterministic automata. That is why it appears advantageous to avoid determinisation and work directly with nondeterministic automata. This, however, brings a need to be able to implement operations traditionally done on deterministic automata on nondeterministic automata instead. In particular, this is the case for inclusion checking and minimisation (or rather reduction of the size of automata). In the talk, we review several recently proposed techniques for inclusion checking on nondeterministic finite word and tree automata as well as Büchi automata. These techniques are based on using so-called antichains, possibly combined with the use of suitable simulation relations (and, in the case of Büchi automata, so-called Ramsey-based or rank-based approaches). Further, we discuss techniques for reducing the size of nondeterministic word and tree automata using quotienting based on the recently proposed notion of mediated equivalences. The talk is based on several common works with Parosh Aziz Abdulla, Ahmed Bouajjani, Yu-Fang Chen, Peter Habermehl, Lisa Kaati, Richard Mayr, Tayssir Touili, Lorenzo Clemente, Lukáš Holík, and Chih-Duo Hong.
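As a concrete, simplified rendering of the antichain idea for word automata (not the talk's algorithms verbatim, and without the simulation-relation refinements), the sketch below searches for a counterexample to L(A) ⊆ L(B) while pruning macrostate pairs subsumed by at-least-as-strong ones.

```python
from collections import deque

def nfa_included(A, B, alphabet):
    """Antichain-style check of L(A) subseteq L(B) for NFAs (a sketch).

    Each automaton is (initials, finals, delta) with delta[state][symbol]
    giving a set of successor states.
    """
    ia, fa, da = A
    ib, fb, db = B

    def post(S, sym, delta):
        return frozenset(q for s in S for q in delta.get(s, {}).get(sym, ()))

    antichain = {}   # A-state -> minimal B-macrostates seen so far
    work = deque()

    def add(q, S):
        mins = antichain.setdefault(q, [])
        if any(T <= S for T in mins):
            return                        # subsumed: an at-least-as-strong pair exists
        mins[:] = [T for T in mins if not S <= T] + [S]
        work.append((q, S))

    for q in ia:
        add(q, frozenset(ib))
    while work:
        q, S = work.popleft()
        if q in fa and not (S & fb):      # word in L(A) but not in L(B)
            return False
        for sym in alphabet:
            for q2 in da.get(q, {}).get(sym, ()):
                add(q2, post(S, sym, db))
    return True

# Usage: L(A) = {'a'} and L(B) = {'a', 'aa'}, so inclusion holds.
A = ({0}, {1}, {0: {'a': {1}}})
B = ({0}, {1, 2}, {0: {'a': {1}}, 1: {'a': {2}}})
print(nfa_included(A, B, {'a'}))          # True
```

The pruning is sound because a smaller B-macrostate can only make a counterexample easier to reach, so larger macrostates for the same A-state never need to be explored.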
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Emily B.; Crump, Alex R.; Resch, Charles T.
2017-03-28
Subsurface zones of groundwater and surface water mixing (hyporheic zones) are regions of enhanced rates of biogeochemical cycling, yet ecological processes governing hyporheic microbiome composition and function through space and time remain unknown. We sampled attached and planktonic microbiomes in the Columbia River hyporheic zone across seasonal hydrologic change, and employed statistical null models to infer mechanisms generating temporal changes in microbiomes within three hydrologically-connected, physicochemically-distinct geographic zones (inland, nearshore, river). We reveal that microbiomes remain dissimilar through time across all zones and habitat types (attached vs. planktonic) and that deterministic assembly processes regulate microbiome composition in all data subsets. The consistent presence of heterotrophic taxa and members of the Planctomycetes-Verrucomicrobia-Chlamydiae (PVC) superphylum nonetheless suggests common selective pressures for physiologies represented in these groups. Further, co-occurrence networks were used to provide insight into taxa most affected by deterministic assembly processes. We identified network clusters to represent groups of organisms that correlated with seasonal and physicochemical change. Extended network analyses identified keystone taxa within each cluster that we propose are central in microbiome composition and function. Finally, the abundance of one network cluster of nearshore organisms exhibited a seasonal shift from heterotrophic to autotrophic metabolisms and correlated with microbial metabolism, possibly indicating an ecological role for these organisms as foundational species in driving biogeochemical reactions within the hyporheic zone. Taken together, our research demonstrates a predominant role for deterministic assembly across highly-connected environments and provides insight into niche dynamics associated with seasonal changes in hyporheic microbiome composition and metabolism.
Theory and applications of a deterministic approximation to the coalescent model
Jewett, Ethan M.; Rosenberg, Noah A.
2014-01-01
Under the coalescent model, the random number n_t of lineages ancestral to a sample is nearly deterministic as a function of time when n_t is moderate to large in value, and it is well approximated by its expectation E[n_t]. In turn, this expectation is well approximated by simple deterministic functions that are easy to compute. Such deterministic functions have been applied to estimate allele age, effective population size, and genetic diversity, and they have been used to study properties of models of infectious disease dynamics. Although a number of simple approximations of E[n_t] have been derived and applied to problems of population-genetic inference, the theoretical accuracy of the formulas and the inferences obtained using these approximations is not known, and the range of problems to which they can be applied is not well understood. Here, we demonstrate general procedures by which the approximation n_t ≈ E[n_t] can be used to reduce the computational complexity of coalescent formulas, and we show that the resulting approximations converge to their true values under simple assumptions. Such approximations provide alternatives to exact formulas that are computationally intractable or numerically unstable when the number of sampled lineages is moderate or large. We also extend an existing class of approximations of E[n_t] to the case of multiple populations of time-varying size with migration among them. Our results facilitate the use of the deterministic approximation n_t ≈ E[n_t] for deriving functionally simple, computationally efficient, and numerically stable approximations of coalescent formulas under complicated demographic scenarios. PMID:24412419
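A minimal sketch of the central approximation: compare Monte Carlo coalescent lineage counts with the deterministic solution of dn/dt = -n(n-1)/2 (time in coalescent units). This is the basic single-population version, not the paper's multi-population extension, and the ODE is itself an approximation of E[n_t].

```python
import numpy as np

rng = np.random.default_rng(3)

def n_deterministic(n0, t):
    """Solution of dn/dt = -n(n-1)/2 with n(0) = n0 (time in coalescent units)."""
    a = (n0 - 1) / n0
    return 1.0 / (1.0 - a * np.exp(-t / 2.0))

def n_simulated(n0, t):
    """Monte Carlo lineage count: k -> k-1 at exponential rate k(k-1)/2."""
    k, elapsed = n0, 0.0
    while k > 1:
        elapsed += rng.exponential(2.0 / (k * (k - 1)))
        if elapsed > t:
            break
        k -= 1
    return k

n0, t = 100, 1.0
sims = [n_simulated(n0, t) for _ in range(2000)]
print(f"Monte Carlo mean n_t: {np.mean(sims):.2f}")
print(f"deterministic approximation: {n_deterministic(n0, t):.2f}")
```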
Dini-Andreote, Francisco; Stegen, James C.; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-01-01
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages—which provide a larger spatiotemporal scale relative to within stage analyses—revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended—and experimentally testable—conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems. PMID:25733885
Ordinal optimization and its application to complex deterministic problems
NASA Astrophysics Data System (ADS)
Yang, Mike Shang-Yu
1998-10-01
We present in this thesis a new perspective to approach a general class of optimization problems characterized by large deterministic complexities. Many problems of real-world concern today lack analyzable structures and almost always involve high levels of difficulty and complexity in the evaluation process. Advances in computer technology allow us to build computer models to simulate the evaluation process through numerical means, but the burden of high complexity remains, taxing the simulation with an exorbitant computing cost for each evaluation. Such a resource requirement makes local fine-tuning of a known design difficult under most circumstances, let alone global optimization. Kolmogorov equivalence of complexity and randomness in computation theory is introduced to resolve this difficulty by converting the complex deterministic model to a stochastic pseudo-model composed of a simple deterministic component and a white-noise-like stochastic term. The resulting randomness is then dealt with by a noise-robust approach called Ordinal Optimization. Ordinal Optimization utilizes Goal Softening and Ordinal Comparison to achieve an efficient and quantifiable selection of designs in the initial search process. The approach is substantiated by a case study in the turbine blade manufacturing process. The problem involves the optimization of the manufacturing process of the integrally bladed rotor in the turbine engines of U.S. Air Force fighter jets. The intertwining interactions among the material, thermomechanical, and geometrical changes make the current FEM approach prohibitively uneconomical in the optimization process. The generalized OO approach to complex deterministic problems is applied here with great success. Empirical results indicate a saving of nearly 95% in the computing cost.
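The flavor of ordinal comparison with goal softening is easy to convey numerically: rank designs with a cheap noisy evaluation, keep a softened top-s set, and estimate how often it captures a truly good design. All numbers below are illustrative, not from the thesis.

```python
import numpy as np

rng = np.random.default_rng(4)

N, g, s, noise = 1000, 10, 20, 1.0        # designs, true top-g, selected top-s (assumed)
true_perf = rng.standard_normal(N)        # smaller is better
true_top = set(np.argsort(true_perf)[:g])

trials, hits = 2000, 0
for _ in range(trials):
    noisy = true_perf + noise * rng.standard_normal(N)   # cheap, crude evaluation
    selected = set(np.argsort(noisy)[:s])                # ordinal comparison, softened goal
    hits += bool(selected & true_top)
print(f"P(selected top-{s} contains a true top-{g} design): {hits / trials:.3f}")
```

Even with evaluation noise comparable to the spread of true performance, order statistics keep the alignment probability high, which is the leverage the thesis exploits.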
Virtual reality, augmented reality…I call it i-Reality.
Grossmann, Rafael J
2015-01-01
The new term improved reality (i-Reality) is suggested to include virtual reality (VR) and augmented reality (AR). It refers to a real world that includes improved, enhanced and digitally created features that would offer an advantage on a particular occasion (i.e., a medical act). I-Reality may help us bridge the gap between the high demand for medical providers and the low supply of them by improving the interaction between providers and patients.
A stochastic model for correlated protein motions
NASA Astrophysics Data System (ADS)
Karain, Wael I.; Qaraeen, Nael I.; Ajarmah, Basem
2006-06-01
A one-dimensional Langevin-type stochastic difference equation is used to find the deterministic and Gaussian contributions of time series representing the projections of a Bovine Pancreatic Trypsin Inhibitor (BPTI) protein molecular dynamics simulation along different eigenvector directions determined using principal component analysis. The deterministic part shows a distinct nonlinear behavior only for eigenvectors contributing significantly to the collective protein motion.
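A sketch of one standard way to extract the deterministic part of such a series: estimate the drift as the binned conditional mean of increments. The synthetic Ornstein-Uhlenbeck signal below stands in for a principal-component projection of the MD trajectory; all names and numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic Ornstein-Uhlenbeck series (drift -2x, noise amplitude 0.5).
dt, n = 0.01, 200_000
noise = rng.standard_normal(n - 1)
x = np.zeros(n)
for i in range(n - 1):
    x[i + 1] = x[i] - 2.0 * x[i] * dt + 0.5 * np.sqrt(dt) * noise[i]

# Deterministic part: binned conditional mean of increments, <dx | x> / dt.
bins = np.linspace(-1.0, 1.0, 21)
idx = np.digitize(x[:-1], bins)
centers = 0.5 * (bins[:-1] + bins[1:])
drift = np.array([
    (x[1:][idx == k] - x[:-1][idx == k]).mean() / dt if (idx == k).any() else np.nan
    for k in range(1, len(bins))
])
k = np.argmin(np.abs(centers - 0.5))
print(f"estimated drift at x ~ 0.5: {drift[k]:.2f} (true value: {-2.0 * 0.5})")
```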
ERIC Educational Resources Information Center
Raveendran, Aswathy; Chunawala, Sugra
2015-01-01
Several educators have emphasized that students need to understand science as a human endeavor that is not value free. In the exploratory study reported here, we investigated how doctoral students of biology understand the intersection of values and science in the context of genetic determinism. Deterministic research claims have been critiqued…
The dual reading of general conditionals: The influence of abstract versus concrete contexts.
Wang, Moyun; Yao, Xinyun
2018-04-01
A central current issue concerning conditionals is whether the meaning of general conditionals (e.g., If a card is red, then it is round) is deterministic (exceptionless) or probabilistic (exception-tolerating). To resolve the issue, two experiments examined the influence of conditional contexts (with vs. without frequency information about truth table cases) on the reading of general conditionals. Experiment 1 examined the direct reading of general conditionals in the possibility judgment task. Experiment 2 examined the indirect reading of general conditionals in the truth judgment task. Both the direct and indirect readings of general conditionals exhibited the same duality: a predominant deterministic semantic reading of conditionals without frequency information, and a predominant probabilistic pragmatic reading of conditionals with frequency information. The context of a general conditional determined its predominant reading. There were obvious individual differences in reading general conditionals with frequency information. The meaning of general conditionals is relative, depending on conditional contexts. The reading of general conditionals is flexible and complex, such that neither simple deterministic nor probabilistic accounts can fully explain it.
Huttunen, K-L; Mykrä, H; Oksanen, J; Astorga, A; Paavola, R; Muotka, T
2017-05-03
One of the key challenges to understanding patterns of β diversity is to disentangle deterministic patterns from stochastic ones. Stochastic processes may mask the influence of deterministic factors on community dynamics, hindering identification of the mechanisms causing variation in community composition. We studied temporal β diversity (among-year dissimilarity) of macroinvertebrate communities in near-pristine boreal streams across 14 years. To assess whether the observed β diversity deviates from that expected by chance, and to identify processes (deterministic vs. stochastic) through which different explanatory factors affect community variability, we used a null model approach. We observed that at the majority of sites temporal β diversity was low indicating high community stability. When stochastic variation was unaccounted for, connectivity was the only variable explaining temporal β diversity, with weakly connected sites exhibiting higher community variability through time. After accounting for stochastic effects, connectivity lost importance, suggesting that it was related to temporal β diversity via random colonization processes. Instead, β diversity was best explained by in-stream vegetation, community variability decreasing with increasing bryophyte cover. These results highlight the potential of stochastic factors to dampen the influence of deterministic processes, affecting our ability to understand and predict changes in biological communities through time.
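To make the null-model logic concrete, here is a minimal sketch (on simulated count data, not the stream-invertebrate records) that compares an observed between-year Bray-Curtis dissimilarity against a permutation null and reports a standardized effect size; real analyses use more careful nulls such as Raup-Crick.

```python
import numpy as np

rng = np.random.default_rng(6)

def bray_curtis(a, b):
    return np.abs(a - b).sum() / (a + b).sum()

year1 = rng.poisson(5.0, 50).astype(float)   # 50 taxa in two sampling years (simulated)
year2 = rng.poisson(5.0, 50).astype(float)
obs = bray_curtis(year1, year2)

# Null: shuffle abundances across taxa within each year, recompute dissimilarity.
null = np.array([
    bray_curtis(rng.permutation(year1), rng.permutation(year2))
    for _ in range(999)
])
ses = (obs - null.mean()) / null.std()
print(f"observed BC = {obs:.3f}, SES = {ses:.2f}")   # |SES| >~ 2: beyond chance
```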
Characterizing Uncertainty and Variability in PBPK Models ...
Mode-of-action based risk and safety assessments can rely upon tissue dosimetry estimates in animals and humans obtained from physiologically-based pharmacokinetic (PBPK) modeling. However, risk assessment also increasingly requires characterization of uncertainty and variability; such characterization for PBPK model predictions represents a continuing challenge to both modelers and users. Current practices show significant progress in specifying deterministic biological models and the non-deterministic (often statistical) models, estimating their parameters using diverse data sets from multiple sources, and using them to make predictions and characterize uncertainty and variability. The International Workshop on Uncertainty and Variability in PBPK Models, held Oct 31-Nov 2, 2006, sought to identify the state-of-the-science in this area and recommend priorities for research and changes in practice and implementation. For the short term, these include: (1) multidisciplinary teams to integrate deterministic and non-deterministic/statistical models; (2) broader use of sensitivity analyses, including for structural and global (rather than local) parameter changes; and (3) enhanced transparency and reproducibility through more complete documentation of the model structure(s) and parameter values, the results of sensitivity and other analyses, and supporting, discrepant, or excluded data. Longer-term needs include: (1) theoretic and practical methodological improvements…
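As a concrete illustration of the uncertainty and variability characterization discussed above, the sketch below propagates assumed inter-individual parameter distributions through a one-compartment pharmacokinetic model (a deliberately tiny stand-in for a full PBPK model) by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(7)

n, dose, t = 10_000, 100.0, 6.0               # individuals, mg bolus, hours (assumed)
CL = rng.lognormal(np.log(10.0), 0.3, n)      # clearance, L/h, inter-individual spread
V = rng.lognormal(np.log(50.0), 0.2, n)       # volume of distribution, L

conc = dose / V * np.exp(-(CL / V) * t)       # C(t) = (D/V) exp(-k t), with k = CL/V

lo, med, hi = np.percentile(conc, [2.5, 50, 97.5])
print(f"C(6 h): median {med:.2f} mg/L, 95% interval [{lo:.2f}, {hi:.2f}]")
```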
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Xia, J.; Franseen, E.K.; Miller, R.D.; Weis, T.V.
2004-01-01
We successfully applied deterministic deconvolution to real ground-penetrating radar (GPR) data by using the source wavelet that was generated in and transmitted through air as the operator. The GPR data were collected with 400-MHz antennas on a bench adjacent to a cleanly exposed quarry face. The quarry site is characterized by horizontally bedded carbonate strata with shale partings. In order to provide ground truth for this deconvolution approach, 23 conductive rods were drilled into the quarry face at key locations. The steel rods provided critical information for: (1) correlation between reflections on GPR data and geologic features exposed in the quarry face, (2) GPR resolution limits, (3) accuracy of velocities calculated from common midpoint data and (4) identifying any multiples. Comparing the results of deconvolved data with non-deconvolved data demonstrates the effectiveness of deterministic deconvolution in low dielectric-loss media for increased accuracy of velocity models (improved at least 10-15% in our study after deterministic deconvolution), increased vertical and horizontal resolution of specific geologic features and more accurate representation of geologic features as confirmed from detailed study of the adjacent quarry wall.
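A sketch of deterministic (known-wavelet) deconvolution with water-level regularization on synthetic data; the toy Ricker wavelet stands in for the measured air-wave operator used in the study, and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(8)

def ricker(n, f, dt):
    """Zero-phase Ricker wavelet sampled on n points, centred at n//2."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

n, dt = 512, 1e-9                                   # 1 ns sampling (assumed)
w = ricker(n, 400e6, dt)                            # ~400 MHz toy source wavelet
refl = np.zeros(n)
refl[[150, 200, 210]] = [1.0, -0.6, 0.5]            # synthetic reflectivity spikes
trace = np.convolve(refl, w, mode="same") + 0.01 * rng.standard_normal(n)

W = np.fft.rfft(np.roll(w, -n // 2))                # align wavelet peak to t = 0
T = np.fft.rfft(trace)
level = 0.05 * np.abs(W).max()                      # water level stabilises the division
est = np.fft.irfft(T * np.conj(W) / np.maximum(np.abs(W) ** 2, level ** 2), n)
print("recovered amplitudes at the spikes:", est[[150, 200, 210]].round(2))
```

Because the wavelet is known deterministically rather than estimated statistically, the only tuning knob is the water level, which trades spike sharpness against noise amplification.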
NASA Technical Reports Server (NTRS)
Duffy, S. F.; Hu, J.; Hopkins, D. A.
1995-01-01
The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next the fundamental aspects of a probabilistic failure analysis are explored and it is shown that deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials) the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
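The two ingredients named here, a closed-form Weibull failure probability and a Monte Carlo reliability estimate, fit in a few lines; the modulus, scale, and load distribution below are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(9)

m, sigma0 = 10.0, 400.0                       # Weibull modulus and scale, MPa (assumed)
sigma_applied = 250.0
p_weibull = 1.0 - np.exp(-(sigma_applied / sigma0) ** m)   # closed-form failure probability

n = 1_000_000
strength = sigma0 * rng.weibull(m, n)         # sampled component strengths
load = rng.normal(250.0, 15.0, n)             # applied stress distribution
p_mc = np.mean(load > strength)               # Monte Carlo reliability estimate

print(f"Weibull P_f at {sigma_applied} MPa: {p_weibull:.4f}")
print(f"Monte Carlo P_f with random load:   {p_mc:.4f}")
```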
Implementation speed of deterministic population passages compared to that of Rabi pulses
NASA Astrophysics Data System (ADS)
Chen, Jingwei; Wei, L. F.
2015-02-01
The fast Rabi π-pulse technique has been widely applied to coherent quantum manipulations, although it requires precise design of the pulse areas. Relaxing this requirement, various rapid adiabatic passage (RAP) approaches have alternatively been utilized to implement deterministic population passages. However, the usual RAP protocols cannot be implemented arbitrarily fast, because the relevant adiabatic condition must remain robustly satisfied throughout the passage. Here, we propose a modified shortcut-to-adiabaticity (STA) technique to significantly accelerate the desired deterministic quantum state population passages. This transitionless technique goes beyond the usual rotating wave approximation (RWA) made in recent STA protocols, and thus can be applied to deliver fast quantum evolutions in which the relevant counter-rotating effects cannot be neglected. The proposal is demonstrated specifically with driven two- and three-level systems. Numerical results show that, with the present STA technique beyond the RWA, the usual Stark-chirped RAPs and stimulated Raman adiabatic passages can be significantly sped up; the deterministic population passages can be implemented as fast as the widely used Rabi π pulses, but are insensitive to the applied pulse areas.
NASA Astrophysics Data System (ADS)
Adams, Mike; Smalian, Silva
2017-09-01
For nuclear waste packages, the expected dose rates and nuclide inventory are calculated beforehand. Depending on the packaging of the nuclear waste, deterministic programs like MicroShield® provide a range of results for each type of packaging. Stochastic programs like the "Monte-Carlo N-Particle Transport Code System" (MCNP®), on the other hand, provide reliable results for complex geometries. However, this type of program requires a fully trained operator, and calculations are time consuming. The problem is therefore to choose an appropriate program for a specific geometry. We compared the results of deterministic programs like MicroShield® with those of stochastic programs like MCNP®. These comparisons enable us to make a statement about the applicability of the various programs for chosen types of containers. We found that for thin-walled geometries, deterministic programs like MicroShield® are well suited to calculate the dose rate. For cylindrical containers with inner shielding, however, deterministic programs hit their limits. Furthermore, we investigate the effect of an inhomogeneous material and activity distribution on the results. The calculations are still ongoing. Results will be presented in the final abstract.
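For orientation, the sketch below shows the kind of point-kernel estimate that deterministic shielding codes automate; the source strength, attenuation coefficient, and linear buildup factor are all illustrative assumptions, not MicroShield® or MCNP® output.

```python
import numpy as np

S = 1e9           # photons/s, assumed source strength
mu = 0.5          # 1/cm, assumed attenuation coefficient of the container wall
t = 5.0           # cm, wall thickness
d = 100.0         # cm, source-to-detector distance
B = 1.0 + mu * t  # crude linear buildup factor (a deliberate simplification)

# Point-kernel estimate: attenuated, buildup-corrected flux with 1/r^2 spreading.
flux = S * B * np.exp(-mu * t) / (4.0 * np.pi * d**2)
print(f"estimated flux at detector: {flux:.3e} photons/cm^2/s")
```

The geometry in this formula is a single point source behind a slab; it is exactly the kind of assumption that breaks down for cylindrical containers with inner shielding, where stochastic transport earns its cost.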
Hu, Xiao-Bing; Wang, Ming; Di Paolo, Ezequiel
2013-06-01
Searching the Pareto front for multiobjective optimization problems usually involves the use of a population-based search algorithm or of a deterministic method with a set of different single aggregate objective functions. The results are, in fact, only approximations of the real Pareto front. In this paper, we propose a new deterministic approach capable of fully determining the real Pareto front for those discrete problems for which it is possible to construct optimization algorithms to find the k best solutions to each of the single-objective problems. To this end, two theoretical conditions are given to guarantee the finding of the actual Pareto front rather than its approximation. Then, a general methodology for designing a deterministic search procedure is proposed. A case study is conducted, where by following the general methodology, a ripple-spreading algorithm is designed to calculate the complete exact Pareto front for multiobjective route optimization. When compared with traditional Pareto front search methods, the obvious advantage of the proposed approach is its unique capability of finding the complete Pareto front. This is illustrated by the simulation results in terms of both solution quality and computational efficiency.
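The final step shared by exact approaches of this kind, keeping exactly the non-dominated members of a discrete solution set, is easy to state in code; the sketch below does only that dominance filtering and does not reproduce the paper's ripple-spreading algorithm or its k-best construction.

```python
import numpy as np

rng = np.random.default_rng(10)

def pareto_front(costs):
    """costs: (n, m) array, all objectives minimised; True marks non-dominated rows."""
    dominated = np.array([
        ((costs <= c).all(axis=1) & (costs < c).any(axis=1)).any()
        for c in costs
    ])
    return ~dominated

costs = rng.random((200, 2))       # e.g. (travel time, distance) per candidate route
mask = pareto_front(costs)
print(f"{mask.sum()} non-dominated solutions out of {len(costs)}")
```

The paper's contribution is guaranteeing that the candidate set fed into this filter provably contains every Pareto-optimal solution, which is what population-based approximations cannot promise.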
Russo, Maria Teresa; Di Stefano, Nicola
2014-01-01
The article calls into question the very possibility of a post-human aesthetics, starting from the following premise: rather than post-human, it is more correct to speak of post-natural, indicating by this expression a reality produced through a new type of evolution, which does not simply change human nature, but de-natures it, radically transforming it into an artefact. This post-nature, which aspires to be perfect, immortal, and invulnerable, is entirely devoid of beauty. In fact, while there may be an aesthetics of the artificial and of the artefact in relation to objects, there is no aesthetics of the post-human body. This is because it is configured as a non-body and lacks the characteristics of what is commonly understood as beauty (harmony between matter and form, a reflection of inner life, uniqueness). In this case too, it is more correct to speak of post-beauty, which in its properties appears to be the mirror image of beauty and, ultimately, represents its complete dissolution.
An, Gary; Bartels, John; Vodovotz, Yoram
2011-03-01
The clinical translation of promising basic biomedical findings, whether derived from reductionist studies in academic laboratories or as the product of extensive high-throughput and -content screens in the biotechnology and pharmaceutical industries, has reached a period of stagnation in which ever higher research and development costs are yielding ever fewer new drugs. Systems biology and computational modeling have been touted as potential avenues by which to break through this logjam. However, few mechanistic computational approaches are utilized in a manner that is fully cognizant of the inherent clinical realities in which the drugs developed through this ostensibly rational process will be ultimately used. In this article, we present a Translational Systems Biology approach to inflammation. This approach is based on the use of mechanistic computational modeling centered on inherent clinical applicability, namely that a unified suite of models can be applied to generate in silico clinical trials, individualized computational models as tools for personalized medicine, and rational drug and device design based on disease mechanism.
Biomedical imaging ontologies: A survey and proposal for future work
Smith, Barry; Arabandi, Sivaram; Brochhausen, Mathias; Calhoun, Michael; Ciccarese, Paolo; Doyle, Scott; Gibaud, Bernard; Goldberg, Ilya; Kahn, Charles E.; Overton, James; Tomaszewski, John; Gurcan, Metin
2015-01-01
Background: Ontology is one strategy for promoting interoperability of heterogeneous data through consistent tagging. An ontology is a controlled structured vocabulary consisting of general terms (such as “cell” or “image” or “tissue” or “microscope”) that form the basis for such tagging. These terms are designed to represent the types of entities in the domain of reality that the ontology has been devised to capture; the terms are provided with logical definitions thereby also supporting reasoning over the tagged data. Aim: This paper provides a survey of the biomedical imaging ontologies that have been developed thus far. It outlines the challenges, particularly faced by ontologies in the fields of histopathological imaging and image analysis, and suggests a strategy for addressing these challenges in the example domain of quantitative histopathology imaging. Results and Conclusions: The ultimate goal is to support the multiscale understanding of disease that comes from using interoperable ontologies to integrate imaging data with clinical and genomics data. PMID:26167381
Newman, Constance; Kimeu, Anastasiah; Shamblin, Leigh; Penders, Christopher; McQuide, Pamela A; Bwonya, Judith
2011-01-01
IntraHealth International's USAID-funded Capacity Kenya project conducted a performance needs assessment of the Kenya health provider education system in 2010. Various stakeholders shared their understandings of the role played by gender and identified opportunities to improve gender equality in health provider education. Findings suggest that occupational segregation, sexual harassment and discrimination based on pregnancy and family responsibilities present problems, especially for female students and faculty. To grow and sustain its workforce over the long term, Kenyan human resource leaders and managers must act to eliminate gender-based obstacles by implementing existing non-discrimination and equal opportunity policies and laws to increase the entry, retention and productivity of students and faculty. Families and communities must support girls' schooling and defer early marriage. All this will result in a fuller pool of students, faculty and matriculated health workers and, ultimately, a more robust health workforce to meet Kenya's health challenges.
Toward the light field display: autostereoscopic rendering via a cluster of projectors.
Yang, Ruigang; Huang, Xinyu; Li, Sifang; Jaynes, Christopher
2008-01-01
Ultimately, a display device should be capable of reproducing the visual effects observed in reality. In this paper we introduce an autostereoscopic display that uses a scalable array of digital light projectors and a projection screen augmented with microlenses to simulate a light field for a given three-dimensional scene. Physical objects emit or reflect light in all directions to create a light field that can be approximated by the light field display. The display can simultaneously provide many viewers from different viewpoints a stereoscopic effect without head tracking or special viewing glasses. This work focuses on two important technical problems related to the light field display: calibration and rendering. We present a solution to automatically calibrate the light field display using a camera and introduce two efficient algorithms to render the special multi-view images by exploiting their spatial coherence. The effectiveness of our approach is demonstrated with a four-projector prototype that can display dynamic imagery with full parallax.
Assessment of Wind Parameter Sensitivity on Extreme and Fatigue Wind Turbine Loads
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Amy N; Sethuraman, Latha; Jonkman, Jason
Wind turbines are designed using a set of simulations to ascertain the structural loads that the turbine could encounter. While mean hub-height wind speed is considered to vary, other wind parameters such as turbulence spectra, shear, veer, spatial coherence, and component correlation are fixed or conditional values that, in reality, could have different characteristics at different sites and have a significant effect on the resulting loads. This paper therefore seeks to assess the sensitivity of the resulting ultimate and fatigue loads on the turbine to different wind parameters during normal operational conditions. Eighteen different wind parameters are screened using an Elementary Effects approach with radial points. As expected, the results show a high sensitivity of the loads to the turbulence standard deviation in the primary wind direction, but the sensitivity to wind shear is often much greater. To a lesser extent, other wind parameters that drive loads include the coherence in the primary wind direction and veer.
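A minimal sketch of Elementary Effects screening with radial points, using a three-parameter toy response in place of a turbine-load simulator; the study screens eighteen wind parameters, and the names and coefficients here are invented.

```python
import numpy as np

rng = np.random.default_rng(11)

def model(x):
    ti, shear, veer = x                    # hypothetical normalised wind parameters
    return 3.0 * ti + 2.0 * shear**2 + 0.1 * veer + ti * shear

k, r, delta = 3, 50, 0.1                   # inputs, trajectories, step size
effects = np.zeros((r, k))
for j in range(r):
    base = rng.random(k) * (1.0 - delta)   # radial design: one base point per trajectory,
    f0 = model(base)                       # then perturb one input at a time
    for i in range(k):
        pert = base.copy()
        pert[i] += delta
        effects[j, i] = (model(pert) - f0) / delta

mu_star = np.abs(effects).mean(axis=0)     # overall influence
sigma = effects.std(axis=0)                # nonlinearity / interaction
for name, m_, s_ in zip(["TI", "shear", "veer"], mu_star, sigma):
    print(f"{name}: mu* = {m_:.2f}, sigma = {s_:.2f}")
```

A large mu* flags an input worth modeling carefully; a large sigma relative to mu* flags nonlinearity or interactions, the pattern the paper reports for shear.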
Nichols, Charles D; Sanders-Bush, Elaine
2002-05-01
Hallucinogenic drugs such as lysergic acid diethylamide (LSD) have profound effects on humans including hallucinations and detachment from reality. These remarkable behavioral effects have many similarities to the debilitating symptoms of neuropsychiatric disorders such as schizophrenia. The effects of hallucinogens are thought to be mediated by serotonin receptor activation; however, how these drugs elicit the unusual behavioral effects remains largely a mystery, despite much research. We have undertaken the first comprehensive analysis of gene expression influenced by acute LSD administration in the mammalian brain. These studies represent a novel approach to elucidate the mechanism of action of this class of drugs. We have identified a number of genes that are predicted to be involved in the processes of synaptic plasticity, glutamatergic signaling and cytoskeletal architecture. Understanding these molecular events will lead to new insights into the etiology of disorders whose behavioral symptoms resemble the temporary effects of hallucinogenic drugs, and also may ultimately result in new therapies.
Cybersecurity through Real-Time Distributed Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kisner, Roger A; Manges, Wayne W; MacIntyre, Lawrence Paul
2010-04-01
Critical infrastructure sites and facilities are becoming increasingly dependent on interconnected physical and cyber-based real-time distributed control systems (RTDCSs). A mounting cybersecurity threat results from the nature of these ubiquitous and sometimes unrestrained communications interconnections. Much work is under way in numerous organizations to characterize the cyber threat, determine means to minimize risk, and develop mitigation strategies to address potential consequences. While it seems natural that a simple application of cyber-protection methods derived from the corporate business information technology (IT) domain would lead to an acceptable solution, the reality is that the characteristics of RTDCSs make many of those methods inadequate and unsatisfactory or even harmful. A solution lies in developing a defense-in-depth approach that ranges from protection at communications interconnect levels ultimately to the control system's functional characteristics that are designed to maintain control in the face of malicious intrusion. This paper summarizes the nature of RTDCSs from a cybersecurity perspective and discusses issues, vulnerabilities, candidate mitigation approaches, and metrics.
The moral sense of humanitarian actors: an empirical exploration.
Rességuier, Anaïs
2018-01-01
This paper examines humanitarianism's moral positioning above private and political interests to save lives and alleviate suffering. It does not aim to assess the legitimacy of this stance, but rather to probe the way in which humanitarian actors relate to this moral dimension in their everyday work. It investigates empirically humanitarian ethics from the perspective of humanitarian actors, drawing on interviews conducted in Beirut, Lebanon, in 2014. As it is exploratory, three key conceptual innovations were required. The first of these is the introduction of the tools developed to consider a neglected reality: humanitarian actors' 'moral sense' vis-à-vis the humanitarian sector's 'moral culture'. Second, the study shows how the sector's moral culture is structured around the notion of 'concern for persons in need'. Third, it analyses the way in which the sector and its actors handle the asymmetrical relationships encountered daily. Ultimately this paper seeks to valorise humanitarian actors' creativity in their common practices and explore potential challenges to it. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.
Receptor-ligand binding sites and virtual screening.
Hattotuwagama, Channa K; Davies, Matthew N; Flower, Darren R
2006-01-01
Within the pharmaceutical industry, the ultimate source of continuing profitability is the unremitting process of drug discovery. To be profitable, drugs must be marketable: legally novel, safe and relatively free of side effects, efficacious, and ideally inexpensive to produce. While drug discovery was once typified by a haphazard and empirical process, it is now increasingly driven by both knowledge of the receptor-mediated basis of disease and how drug molecules interact with receptors and the wider physiome. Medicinal chemistry postulates that to understand a congeneric ligand series, or set thereof, is to understand the nature and requirements of a ligand binding site. Likewise, structural molecular biology posits that to understand a binding site is to understand the nature of ligands bound therein. Reality sits somewhere between these extremes, yet subsumes them both. Complementary to rules of ligand design, arising through decades of medicinal chemistry, structural biology and computational chemistry are able to elucidate the nature of binding site-ligand interactions, facilitating, at both pragmatic and conceptual levels, the drug discovery process.
Evolving MEMS Resonator Designs for Fabrication
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.; Kraus, William F.; Lohn, Jason D.
2008-01-01
Because of their small size and high reliability, microelectromechanical (MEMS) devices have the potential to revolutionize many areas of engineering. As with conventionally sized engineering design, there is likely to be a demand for the automated design of MEMS devices. This paper describes our current status as we progress toward our ultimate goal of using an evolutionary algorithm and a generative representation to produce designs of a MEMS device and successfully demonstrate its transfer to an actual chip. To produce designs that are likely to transfer to reality, we present two ways to modify the evaluation of designs. The first is to add location noise, differences between the actual dimensions of the design and the design blueprint, a technique we have used in our work on evolving antennas and robots. The second method is to add prestress to model the warping that occurs during the extreme heat of fabrication. In the future we expect to fabricate and test some MEMS resonators that are evolved in this way.
Information and the Nature of Reality
NASA Astrophysics Data System (ADS)
Davies, Paul; Gregersen, Niels Henrik
2014-05-01
1. Introduction: does information matter?; Paul Davies and Niels Henrik Gregersen; Part I. History: 2. From matter to materialism and (almost) back Ernan McMullin; 3. Unsolved dilemmas: the concept of matter in the history of philosophy and in contemporary physics Philip Clayton; Part II. Physics: 4. Universe from bit Paul Davies; 5. The computational universe Seth Lloyd; 6. Minds and values in the quantum universe Henry Pierce Stapp; Part III. Biology: 7. The concept of information in biology John Maynard Smith; 8. Levels of information: Shannon-Boltzmann-Darwin Terrence W. Deacon; 9. Information and communication in living matter Bernd-Olaf Küppers; 10. Semiotic freedom: an emerging force Jesper Hoffmeyer; 11. Care on earth: generating informed concern Holmes Rolston; Part IV. Philosophy and Theology: 12. The sciences of complexity - a new theological resource? Arthur Peacocke; 13. God as the ultimate informational principle Keith Ward; 14. Information, theology and the universe John F. Haught; 15. God, matter, and information: towards a Stoicizing Logos christology Niels Henrik Gregersen; 16. What is the 'spiritual body'? Michael Welker; Index.
Simulation training in neurosurgery: advances in education and practice
Konakondla, Sanjay; Fong, Reginald; Schirmer, Clemens M
2017-01-01
The current simulation technology used for neurosurgical training leaves much to be desired. Significant effort has been expended in hopes of developing simulations that give learners a “real-life” feel. Though a respectable goal, this may not be necessary, as simulation in neurosurgical training may be most useful for early learners. These investments are ultimately driven by the universally agreed-upon endpoints of improved outcomes and patient safety. We explore the development, availability, educational task forces, cost burdens and simulation advancements in neurosurgical training. The technologies can be directed at achieving the early resident milestones set by the Accreditation Council for Graduate Medical Education. We discuss various neurosurgical disciplines alongside specific technological advances in simulation software. An overview of the scholarly landscape of recent publications in the realm of medical simulation and virtual reality pertaining to neurologic surgery is provided. We analyze concept overlap between PubMed headings and provide a graphical overview of the associations between these terms. PMID:28765716
The remarkable life of Erwin Schrödinger's What Is Life?
Margo, Curtis E; Harman, Lynn E
2015-01-01
In the seven decades since Schrödinger's book was published, it has gone through stages of differing appraisal, starting with guarded approbation in the 1940s. When several luminaries in molecular biology described the work as influencing their careers, the book's renown increased. In What Is Life?, Schrödinger examined genetics from the perspective of a theoretical physicist, and conjured up ideas that dilettantes admired and experts slighted. Schrödinger sowed his most important ideas in terms of metaphors, allowing readers considerable latitude for interpretation. Some found nothing worthwhile in the book, only chemical naivete and ignorance of work that had already been done. Others found deep inspiration and a desire to understand biological reproduction, even if it required new paradigms of physical science. What Is Life?--like the ancient parable of the blind men and an elephant--is an example of the ineffable nature of truth, pitting subjective experience against the totality of reality. The legacy of What Is Life? may ultimately be respect for different opinions.
Wanted, an Anthrax vaccine: Dead or Alive?
Smith, Kendall A
2005-01-01
It has been more than 100 years since the realization that microbes are capable of causing disease. In that time, we have learned a great deal as to how each organism has adapted to the immune system so as to avoid elimination. We have also learned an immense amount since Louis Pasteur first proposed that the solution to infectious diseases was to culture the microbes and attenuate their virulence, so as to use them as vaccines. From the optimism and promise of the 19th century, with immunization as the ultimate answer to invasion by the microbial world, to the scientific realities of the 21st century, it is of interest to retrace the steps of the earliest microbiologists cum immunologists, to realize how far we've come, as well as how far we yet have to go. This editorial focuses on the history of anthrax as a microbial disease, and the earliest efforts at producing a vaccine for its prevention. PMID:15836780
Completing advance directives for health care decisions: getting to yes.
Shewchuk, T R
1998-09-01
The concept of advance directives for health care decision making has been judicially condoned, legislatively promoted, and systematically implemented by health care institutions, yet the execution rate of advance directives remains low. Physicians should discuss with their patients advance care planning generally and end-of-life issues specifically, preferably when patients are in good health and not when they face an acute medical crisis. The physician-hospital relationship poses particular challenges for the optimal implementation of advance directives that must be addressed. Hospital administrators must improve education of patients and physicians on the value of such documents as well as internal mechanisms to ensure better implementation of directives. Health insurance plans may be better able to ensure optimal gathering and implementation of directives. Patients must become more familiar and more comfortable with advance care planning and the reality of death and dying issues. Full acceptance of the value of directives ultimately rests on achieving full participation of all involved--providers, patients, families, and payors--in this most profound process.
The gap between: being and knowing in Zen Buddhism and psychoanalysis.
Cooper, P C
2001-12-01
The author discusses various relationships derived from the images of gap, precipice, and abyss, with specific emphasis on the interacting dynamics between being and knowing as explicated in the Zen Buddhist teachings of Hui-neng and in the psychoanalytic writings of Wilfred Bion. While of significant value to psychoanalysis, symbolic meanings, it is argued, can occlude the actuality of the analysand's or the spiritual seeker's affective experiencing, particularly concerning the human tendency to concretize experiential states engendered through meditation and/or the psychoanalytic encounter. The author draws from Matte-Blanco's explication of symmetrical and asymmetrical perceptual modalities to discuss the fluid nature of spiritual experiencing, the paradoxical coexistence of ultimate and relative realities, and the reciprocal dynamics and identities between states of experiencing that might otherwise appear opposed. The primacy of experiencing for both disciplines, particularly concerning the experiencing subject's momentary state of consciousness, forms a central theme for both Zen and psychoanalysis. Brief clinical vignettes support and illuminate the author's points.
Humanlike Robots - Synthetically Mimicking Humans
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2012-01-01
Nature has inspired many inventions, and the field of technology based on mimicking or drawing inspiration from nature, widely known as biomimetics, is increasingly leading to many new capabilities. There are numerous examples of biomimetic successes, including the copying of fins for swimming and the inspiration of insect and bird flight. More and more commercial implementations of biomimetics are appearing and behaving lifelike, and applications are emerging that are important to our daily life. Making humanlike robots is the ultimate challenge of biomimetics and, for many years, it was considered science fiction, but such robots are becoming an engineering reality. Advances in producing such robots are allowing them to perform impressive functions and tasks. The development of such robots involves addressing many challenges, and it raises concerns related to fear of the implications of their application and to potential ethical issues. In this paper, the state of the art of humanlike robots, potential applications and challenges will be reviewed.
Ganier, Franck; Hoareau, Charlotte; Tisseau, Jacques
2014-01-01
Virtual reality opens new opportunities for operator training in complex tasks. It lowers costs and has fewer constraints than traditional training. The ultimate goal of virtual training is to transfer knowledge gained in a virtual environment to an actual real-world setting. This study tested whether a maintenance procedure could be learnt equally well by virtual-environment and conventional training. Forty-two adults were divided into three equally sized groups: virtual training (GVT® [generic virtual training]), conventional training (using a real tank suspension and preparation station) and control (no training). Participants then performed the procedure individually in the real environment. Both training types (conventional and virtual) produced similar levels of performance when the procedure was carried out in real conditions. Performance level for the two trained groups was better in terms of success and time taken to complete the task, time spent consulting job instructions and number of times the instructor provided guidance.
Galderisi, Alfonso; Schlissel, Elise; Cengiz, Eda
2017-09-23
Decades after the invention of the insulin pump, diabetes management has encountered a technology revolution with the introduction of continuous glucose monitoring, sensor-augmented insulin pump therapy and closed-loop/artificial pancreas systems. In this review, we discuss the significance of the 2016 Endocrine Society Guidelines for insulin pump therapy and continuous glucose monitoring and summarize findings from relevant diabetes technology studies conducted after the publication of the 2016 Endocrine Society Guidelines. The 2016 Endocrine Society Guidelines have been a great resource for clinicians managing diabetes in this new era of diabetes technology. There is a good body of evidence indicating that using diabetes technology systems safely tightens glycemic control in the management of both type 1 and type 2 diabetes. The first-generation diabetes technology systems will evolve as we gain more experience and collaboratively work to improve them, with the ultimate goal of keeping people with diabetes complication- and burden-free until the cure for diabetes becomes a reality.
The New Jersey Nursing Initiative: building sustainable collaboration.
Bakewell-Sachs, Susan; Mertz, Lynn M; Egreczky, Dana; Ladden, Maryjoan
2011-01-01
The New Jersey Nursing Initiative was publicly launched in 2009 as a 5-year, $22 million program of the Robert Wood Johnson Foundation based at the New Jersey Chamber of Commerce Foundation. It was reauthorized in 2011 through 2016 for an additional $8.5 million. The initiative includes a faculty preparation program and strategic tracks of work focusing on building education capacity, increasing current faculty capacity, making nurse faculty a preferred career, leading policy initiatives, creating sustainable funding in support of nursing education, and ultimately, building local, regional, and statewide collaborative networks. The tagline, "So a Nurse will be there for You," emphasizes both the reality of an aging nursing workforce needing replacement and the expected health care transformation that will result in the need for new knowledge and skills in the future nursing workforce. The purpose of this article was to describe the New Jersey Nursing Initiative, emphasizing the partnerships that have resulted from the project to date. Copyright © 2011 Elsevier Inc. All rights reserved.
Two-Photon and Second Harmonic Microscopy in Clinical and Translational Cancer Research
PERRY, SETH W.; BURKE, RYAN M.; BROWN, EDWARD B.
2012-01-01
Application of two-photon microscopy (TPM) to translational and clinical cancer research has burgeoned over the last several years, as several avenues of pre-clinical research have come to fruition. In this review, we focus on two forms of TPM—two-photon excitation fluorescence microscopy, and second harmonic generation microscopy—as they have been used for investigating cancer pathology in ex vivo and in vivo human tissue. We begin with discussion of two-photon theory and instrumentation particularly as applicable to cancer research, followed by an overview of some of the relevant cancer research literature in areas that include two-photon imaging of human tissue biopsies, human skin in vivo, and the rapidly developing technology of two-photon microendoscopy. We believe these and other evolving two-photon methodologies will continue to help translate cancer research from the bench to the bedside, and ultimately bring minimally invasive methods for cancer diagnosis and treatment to therapeutic reality. PMID:22258888
Price-Dynamics of Shares and Bohmian Mechanics: Deterministic or Stochastic Model?
NASA Astrophysics Data System (ADS)
Choustova, Olga
2007-02-01
We apply the mathematical formalism of Bohmian mechanics to describe the dynamics of share prices. The main distinguishing feature of the financial Bohmian model is the possibility of taking market psychology into account by describing the expectations of traders with the pilot wave. We also discuss some objections (coming from the conventional financial mathematics of stochastic processes) against the deterministic Bohmian model; in particular, the objection that such a model contradicts the efficient market hypothesis, which is the cornerstone of modern market ideology. Another objection is of a purely mathematical nature: it is related to the quadratic variation of price trajectories. One way to reply to this critique is to consider the stochastic Bohm-Vigier model instead of the deterministic one. We do this in the present note.
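For reference, the deterministic dynamics meant here is the standard Bohmian guidance law, sketched below in its textbook form with the price as the configuration variable q; treating m as a market "inertia" parameter is an interpretive assumption of this illustration, not a claim about the paper's exact equations.

```latex
\psi(q,t) = R(q,t)\, e^{iS(q,t)/\hbar}, \qquad
\frac{dq}{dt} = \frac{1}{m}\,\frac{\partial S(q,t)}{\partial q}
```

The pilot wave ψ, interpreted as encoding the expectations of traders, then deterministically guides the price trajectory q(t).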
Illustrated structural application of universal first-order reliability method
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide the benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, it should also be applicable to most high-performance air and surface transportation systems.
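A worked sketch of the first-order reliability calculation underlying this abstract, assuming independent, normally distributed resistive stress R and applied stress S; the numerical values are hypothetical and only illustrate the safety-index formula.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical means and standard deviations of the two distributions.
mu_R, sd_R = 100.0, 8.0   # resistive (strength) stress
mu_S, sd_S = 70.0, 10.0   # applied stress

# First-order safety (reliability) index for independent normal R and S.
beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
reliability = NormalDist().cdf(beta)  # P(R > S)
print(f"beta = {beta:.2f}, reliability = {reliability:.5f}")
```

Here beta is the probabilistic counterpart of the deterministic safety factor: specifying a target reliability fixes beta, from which a reliability design factor can be backed out.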
Observations, theoretical ideas and modeling of turbulent flows: Past, present and future
NASA Technical Reports Server (NTRS)
Chapman, G. T.; Tobak, M.
1985-01-01
Turbulence was analyzed in a historical context featuring the interactions between observations, theoretical ideas, and modeling within three successive movements. These are identified as predominantly statistical, structural and deterministic. The statistical movement is criticized for its failure to deal with the structural elements observed in turbulent flows. The structural movement is criticized for its failure to embody observed structural elements within a formal theory. The deterministic movement is described as having the potential of overcoming these deficiencies by allowing structural elements to exhibit chaotic behavior that is nevertheless embodied within a theory. Four major ideas of this movement are described: bifurcation theory, strange attractors, fractals, and the renormalization group. A framework for the future study of turbulent flows is proposed, based on the premises of the deterministic movement.
NASA Astrophysics Data System (ADS)
Gao, Yi
The development and utilization of wind energy for satisfying electrical demand has received considerable attention in recent years due to its tremendous environmental, social and economic benefits, together with public support and government incentives. Electric power generation from wind energy behaves quite differently from that of conventional sources. The fundamentally different operating characteristics of wind energy facilities therefore affect power system reliability in a different manner than those of conventional systems. The reliability impact of such a highly variable energy source is an important aspect that must be assessed when the wind power penetration is significant. The focus of the research described in this thesis is on the utilization of state sampling Monte Carlo simulation in wind integrated bulk electric system reliability analysis and the application of these concepts in system planning and decision making. Load forecast uncertainty is an important factor in long range planning and system development. This thesis describes two approximate approaches developed to reduce the number of steps in a load duration curve which includes load forecast uncertainty, and to provide reasonably accurate generating and bulk system reliability index predictions. The developed approaches are illustrated by application to two composite test systems. A method of generating correlated random numbers with uniform distributions and a specified correlation coefficient in the state sampling method is proposed and used to conduct adequacy assessment in generating systems and in bulk electric systems containing correlated wind farms in this thesis. The studies described show that it is possible to use the state sampling Monte Carlo simulation technique to quantitatively assess the reliability implications associated with adding wind power to a composite generation and transmission system including the effects of multiple correlated wind sites. This is an important development as it permits correlated wind farms to be incorporated in large practical system studies without requiring excessive increases in computer solution time. The procedures described in this thesis for creating monthly and seasonal wind farm models should prove useful in situations where time period models are required to incorporate scheduled maintenance of generation and transmission facilities. There is growing interest in combining deterministic considerations with probabilistic assessment in order to evaluate the quantitative system risk and conduct bulk power system planning. A relatively new approach that incorporates deterministic and probabilistic considerations in a single risk assessment framework has been designated as the joint deterministic-probabilistic approach. The research work described in this thesis illustrates that the joint deterministic-probabilistic approach can be effectively used to integrate wind power in bulk electric system planning. The studies described in this thesis show that the application of the joint deterministic-probabilistic method provides more stringent results for a system with wind power than the traditional deterministic N-1 method because the joint deterministic-probabilistic technique is driven by the deterministic N-1 criterion with an added probabilistic perspective which recognizes the power output characteristics of a wind turbine generator.
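One standard construction for correlated uniform random numbers of the kind described is a Gaussian copula: draw correlated normals, then map each margin to (0,1) with the normal CDF. The sketch below makes that assumption; the thesis's exact method may differ.

```python
import numpy as np
from scipy.stats import norm

def correlated_uniforms(n, rho, seed=0):
    """Draw n pairs of uniform(0,1) samples whose Gaussian-copula
    correlation is rho, e.g. for state sampling of two correlated wind farms."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    return norm.cdf(z)  # transform each normal margin to uniform(0,1)

u = correlated_uniforms(100000, rho=0.8)
print(np.corrcoef(u.T)[0, 1])  # close to, though not exactly, 0.8
```

Each row of u can then be inverted through each site's wind-speed distribution, so the sampled farm states preserve the specified inter-site correlation.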
Reality Check: Basics of Augmented, Virtual, and Mixed Reality.
Brigham, Tara J
2017-01-01
Augmented, virtual, and mixed reality applications all aim to enhance a user's current experience or reality. While variations of this technology are not new, within the last few years there has been a significant increase in the number of artificial reality devices or applications available to the general public. This column will explain the difference between augmented, virtual, and mixed reality and how each application might be useful in libraries. It will also provide an overview of the concerns surrounding these different reality applications and describe how and where they are currently being used.
Reproducibility in a multiprocessor system
Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka
2013-11-26
Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.
Stochastic Processes in Physics: Deterministic Origins and Control
NASA Astrophysics Data System (ADS)
Demers, Jeffery
Stochastic processes are ubiquitous in the physical sciences and engineering. While often used to model imperfections and experimental uncertainties in the macroscopic world, stochastic processes can attain deeper physical significance when used to model the seemingly random and chaotic nature of the underlying microscopic world. Nowhere is this notion more prevalent than in the field of stochastic thermodynamics - a modern systematic framework used to describe mesoscale systems in strongly fluctuating thermal environments, which has revolutionized our understanding of, for example, molecular motors, DNA replication, far-from-equilibrium systems, and the laws of macroscopic thermodynamics as they apply to the mesoscopic world. With progress, however, come further challenges and deeper questions, most notably in the thermodynamics of information processing and feedback control. Here it is becoming increasingly apparent that, due to divergences and subtleties of interpretation, the deterministic foundations of the stochastic processes themselves must be explored and understood. This thesis presents a survey of stochastic processes in physical systems, the deterministic origins of their emergence, and the subtleties associated with controlling them. First, we study time-dependent billiards in the quivering limit - a limit where a billiard system is indistinguishable from a stochastic system, and where the simplified stochastic system allows us to view issues associated with deterministic time-dependent billiards in a new light and address some long-standing problems. Then, we embark on an exploration of the deterministic microscopic Hamiltonian foundations of non-equilibrium thermodynamics, and we find that important results from mesoscopic stochastic thermodynamics have simple microscopic origins which would not be apparent without the benefit of both the micro and meso perspectives. Finally, we study the problem of stabilizing a stochastic Brownian particle with feedback control, and we find that, in order to avoid paradoxes involving the first law of thermodynamics, we need a model for the fine details of the thermal driving noise. The underlying theme of this thesis is the argument that the deterministic microscopic perspective and the stochastic mesoscopic perspective are both important and useful, and that when used together, we can more deeply and satisfyingly understand the physics occurring over either scale.
Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality.
Grubert, Jens; Langlotz, Tobias; Zollmann, Stefanie; Regenbrecht, Holger
2017-06-01
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, Augmented Reality has more recently moved out of the research labs and into the field. While most applications are used sporadically and for one particular task only, current and future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, aiming to provide such an experience by sensing the user's current context and adapting the AR system based on the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality, which classifies context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.
Arifler, Dogu; Arifler, Dizem
2017-04-01
For biomedical applications of nanonetworks, employing molecular communication for information transport is advantageous over nano-electromagnetic communication: molecular communication is potentially biocompatible and inherently energy-efficient. Recently, several studies have modeled receivers in diffusion-based molecular communication systems as "perfectly monitoring" or "perfectly absorbing" spheres based on idealized descriptions of chemoreception. In this paper, we focus on perfectly absorbing receivers and present methods to improve the accuracy of simulation procedures that are used to analyze these receivers. We employ schemes available from the chemical physics and biophysics literature and outline a Monte Carlo simulation algorithm that accounts for the possibility of molecule absorption during discrete time steps, leading to a more accurate analysis of absorption probabilities. Unlike most existing studies that consider a single receiver, this paper analyzes absorption probabilities for multiple receivers deterministically or randomly deployed in a region. For random deployments, the ultimate absorption probabilities as a function of transmitter-receiver distance are shown to fit well to power laws; the exponents derived become more negative as the number of receivers increases up to a limit beyond which no additional receivers can be "packed" in the deployment region. This paper is expected to impact the design of molecular nanonetworks with multiple absorbing receivers.
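A minimal sketch of the kind of refinement the paper describes: in a discrete-time Brownian simulation, a molecule that starts and ends a step outside the absorbing surface may still have crossed it mid-step. For a flat absorbing wall the crossing probability of the Brownian bridge is exp(-x1*x2/(D*dt)); the one-dimensional radial sketch below uses that flat-wall formula as an approximation near a spherical receiver of radius R, which is an assumption of this illustration rather than the paper's exact scheme.

```python
import math
import random

D, dt, R = 1.0, 1e-3, 1.0  # diffusion coefficient, time step, receiver radius (illustrative)

def step_with_absorption(r):
    """Advance the molecule's radial distance one time step.
    Returns None if the molecule is absorbed, else the new distance."""
    r_new = r + random.gauss(0.0, math.sqrt(2.0 * D * dt))
    if r_new <= R:
        return None  # absorbed by a direct hit at the end of the step
    # Unobserved mid-step crossing (flat-wall Brownian-bridge formula).
    p_cross = math.exp(-(r - R) * (r_new - R) / (D * dt))
    if random.random() < p_cross:
        return None
    return r_new
```

Ignoring the mid-step term systematically underestimates absorption probabilities, which is precisely the inaccuracy such corrected procedures address.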
NASA Astrophysics Data System (ADS)
Matveev, A. S.; Ishchenko, R.
2017-11-01
We consider a generic deterministic time-invariant fluid model of a single-server switched network, which consists of finitely many infinite-size buffers (queues) and receives constant-rate inflows of jobs from the outside. Any flow undergoes a multi-phase service, entering a specific buffer after every phase, and ultimately leaves the network; the route of the flow over the buffers is pre-specified, and flows may merge inside the network. They share a common source of service, which can serve at most one buffer at a time and has to switch among buffers from time to time; any switch consumes a nonzero switchover period. With respect to the long-run maximal scaled wip (work in progress) performance metric, near-optimality of periodic scheduling and service protocols is established: the deepest optimum (that is, over all feasible processes in the network, irrespective of the initial state) is furnished by such a protocol up to as small an error as desired. Moreover, this can be achieved with a special periodic protocol introduced in the paper. It is also shown that the exhaustive policy is optimal for any buffer whose service at the maximal rate does not cause growth of the scaled wip.
Elegant space systems: How do we get there?
NASA Astrophysics Data System (ADS)
Salado, Alejandro; Nilchiani, Roshanak
Can the space industry produce elegant systems? If so, how? Space systems development has become process-centric: process creation or modification is the default response when challenges are encountered in development and/or operations. But is that really effective? An increasing number of researchers and practitioners disagree with such an approach and suggest that elegance is as important to a system and its operation as fulfillment of technical and contractual requirements; consequently they are proposing a review and refreshment of the systems engineering practice. Elegance is generally recognizable, but hard to achieve deterministically. The research community has begun an endeavor to define what elegance is in systems engineering terms, find ways to measure or at least characterize it, and create or adapt philosophies and methodologies that promote elegance as a design objective. This paper asserts that while elegance cannot be engineered in a traditional sense, it can emerge as a natural result of design activity. This needs to be enabled and can be facilitated, but ultimately depends on the talent of the design teams as individuals and as a group. This paper summarizes existing technical definitions of elegance and discusses a) how it can be pursued and b) cultural conditions and habits that help elegance emerge during the development and operation of a space system.
NASA Astrophysics Data System (ADS)
Doležel, Jiří; Novák, Drahomír; Petrů, Jan
2017-09-01
Transportation routes for oversize and excessive loads are currently planned to ensure the transit of a vehicle through critical points on the road. Critical points are level intersections of roads, bridges, etc. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced methods of reliability analysis based on Monte Carlo-type simulation techniques in combination with nonlinear finite element method (FEM) analysis. The safety index is considered the main criterion of the reliability level of existing structures and is described in current structural design standards, e.g. ISO and the Eurocodes. As an example, the load-bearing capacity of a single-span slab bridge made of precast prestressed concrete girders, at a current age of 60 years, is determined for the ultimate limit state and the serviceability limit state. The structure's design load capacity was estimated by a fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the levels estimated by deterministic methods for a critical section of the most heavily loaded girders.
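A hedged sketch of the Latin Hypercube Sampling step in such an analysis, using SciPy's qmc module; the two input variables and their distributions are illustrative stand-ins, not the paper's structural model.

```python
import numpy as np
from scipy.stats import norm, qmc

# 100 stratified samples of two random inputs, e.g. concrete strength
# and prestress loss (hypothetical distributions for illustration).
lhs = qmc.LatinHypercube(d=2, seed=0).random(n=100)
strength = norm.ppf(lhs[:, 0], loc=45.0, scale=4.5)  # MPa
loss = norm.ppf(lhs[:, 1], loc=0.15, scale=0.03)     # fraction of prestress

# Each row is one input vector for a nonlinear FEM run; the resulting
# capacities estimate the distribution behind the safety index.
for fc, dp in zip(strength[:3], loss[:3]):
    print(f"FEM run with fc = {fc:.1f} MPa, prestress loss = {dp:.1%}")
```

Because LHS stratifies every marginal, far fewer nonlinear FEM runs are needed than with crude Monte Carlo to obtain a stable capacity distribution.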
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
2009-01-01
Many studies have been conducted on the use of virtual reality in education and training. This article lists examples of such research. Reasons to use virtual reality are discussed. Advantages and disadvantages of using virtual reality are presented, as well as suggestions on when to use and when not to use virtual reality. A model that can be…
Do rational numbers play a role in selection for stochasticity?
Sinclair, Robert
2014-01-01
When a given tissue must, to be able to perform its various functions, consist of different cell types, each fairly evenly distributed and with specific probabilities, then there are at least two quite different developmental mechanisms which might achieve the desired result. Let us begin with the case of two cell types, and first imagine that the proportion of numbers of cells of these types should be 1:3. Clearly, a regular structure composed of repeating units of four cells, three of which are of the dominant type, will easily satisfy the requirements, and a deterministic mechanism may lend itself to the task. What if, however, the proportion should be 10:33? The same simple, deterministic approach would now require a structure of repeating units of 43 cells, and this certainly seems to require a far more complex and potentially prohibitive deterministic developmental program. Stochastic development, replacing regular units with random distributions of given densities, might not be evolutionarily competitive in comparison with the deterministic program when the proportions should be 1:3, but it has the property that, whatever developmental mechanism underlies it, its complexity does not need to depend very much upon target cell densities at all. We are immediately led to speculate that proportions which correspond to fractions with large denominators (such as the 33 of 10/33) may be more easily achieved by stochastic developmental programs than by deterministic ones, and this is the core of our thesis: that stochastic development may tend to occur more often in cases involving rational numbers with large denominators. To be imprecise: that simple rationality and determinism belong together, as do irrationality and randomness.
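The arithmetic behind the argument is easy to make concrete: once the fraction is reduced, the smallest repeating unit realizing a p:q mix exactly contains p+q cells, so 10:33 forces 43-cell units, as below.

```python
from fractions import Fraction

def unit_size(p, q):
    """Smallest repeating unit realizing a p:q mix of two cell types exactly."""
    f = Fraction(p, q)  # reduces the ratio to lowest terms
    return f.numerator + f.denominator

print(unit_size(1, 3))    # 4-cell units, as in the 1:3 example
print(unit_size(10, 33))  # 43-cell units, as in the 10:33 example
print(unit_size(2, 6))    # also 4: 2:6 reduces to 1:3
```

The deterministic program's complexity thus grows with the reduced denominator, while a stochastic program's complexity is essentially independent of it.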
Comparative study on neutronics characteristics of a 1500 MWe metal fuel sodium-cooled fast reactor
Ohgama, Kazuya; Aliberti, Gerardo; Stauff, Nicolas E.; ...
2017-02-28
Under the cooperative effort of the Civil Nuclear Energy R&D Working Group within the framework of the U.S.-Japan bilateral, Argonne National Laboratory (ANL) and the Japan Atomic Energy Agency (JAEA) have been performing a benchmark study using the Japan Sodium-cooled Fast Reactor (JSFR) design with metal fuel. In this benchmark study, core characteristic parameters at the beginning of cycle were evaluated by the best-estimate deterministic and stochastic methodologies of ANL and JAEA. The results obtained by both institutions show a good agreement, with less than 200 pcm of discrepancy on the neutron multiplication factor, and less than 3% of discrepancy on the sodium void reactivity, Doppler reactivity, and control rod worth. The results of the stochastic and deterministic approaches were compared by each party to investigate the impact of the deterministic approximations and to understand potential variations in the results due to the different calculation methodologies employed. From the detailed analysis of methodologies, it was found that the good agreement in the multiplication factor from the deterministic calculations comes from the cancellation of differences in methodology (0.4%) and nuclear data (0.6%). The different treatment of reflector cross-section generation was estimated to be the major cause of the discrepancy between the multiplication factors of the JAEA and ANL deterministic methodologies. Impacts of the nuclear data libraries were also investigated using a sensitivity analysis methodology. Furthermore, the differences in the inelastic scattering cross sections of U-238, the ν values and fission cross sections of Pu-239, and the µ-average of Na-23 are the major contributors to the difference in the multiplication factors.
Hahl, Sayuri K; Kremling, Andreas
2016-01-01
In the mathematical modeling of biochemical reactions, a convenient standard approach is to use ordinary differential equations (ODEs) that follow the law of mass action. However, this deterministic ansatz is based on simplifications; in particular, it neglects noise, which is inherent to biological processes. In contrast, the stochasticity of reactions is captured in detail by the discrete chemical master equation (CME). Therefore, the CME is frequently applied to mesoscopic systems, where copy numbers of involved components are small and random fluctuations are thus significant. Here, we compare those two common modeling approaches, aiming at identifying parallels and discrepancies between deterministic variables and possible stochastic counterparts like the mean or modes of the state space probability distribution. To that end, a mathematically flexible reaction scheme of autoregulatory gene expression is translated into the corresponding ODE and CME formulations. We show that in the thermodynamic limit, deterministic stable fixed points usually correspond well to the modes in the stationary probability distribution. However, this connection might be disrupted in small systems. The discrepancies are characterized and systematically traced back to the magnitude of the stoichiometric coefficients and to the presence of nonlinear reactions. These factors are found to synergistically promote large and highly asymmetric fluctuations. As a consequence, bistable but unimodal, and monostable but bimodal systems can emerge. This clearly challenges the role of ODE modeling in the description of cellular signaling and regulation, where some of the involved components usually occur in low copy numbers. Nevertheless, systems whose bimodality originates from deterministic bistability are found to sustain a more robust separation of the two states compared to bimodal, but monostable systems. In regulatory circuits that require precise coordination, ODE modeling is thus still expected to provide relevant indications on the underlying dynamics.
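A minimal sketch contrasting the two formalisms for the simplest birth-death scheme (production at rate k, degradation at rate g per molecule): the ODE dx/dt = k - g*x gives the deterministic fixed point k/g, while a Gillespie simulation samples the CME and fluctuates around it. Parameter values are illustrative only.

```python
import random

k, g = 5.0, 0.1  # production and degradation rates (illustrative)
print("ODE (deterministic) fixed point:", k / g)

# Gillespie stochastic simulation of the corresponding CME.
x, t, t_end = 0, 0.0, 200.0
while t < t_end:
    birth, death = k, g * x
    total = birth + death
    t += random.expovariate(total)  # exponential waiting time to next event
    x += 1 if random.random() < birth / total else -1
print("stochastic copy number at t_end:", x)  # fluctuates around k/g = 50
```

For this linear scheme the stationary mean coincides with the ODE fixed point; the discrepancies analyzed in the abstract arise once nonlinear reactions and large stoichiometric coefficients enter.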
Study on the evaluation method for fault displacement based on characterized source model
NASA Astrophysics Data System (ADS)
Tonagi, M.; Takahama, T.; Matsumoto, Y.; Inoue, N.; Irikura, K.; Dalguer, L. A.
2016-12-01
IAEA Specific Safety Guide (SSG) 9 describes that probabilistic methods for evaluating fault displacement should be used if no sufficient basis is provided to decide conclusively, by the deterministic methodology, that the fault is not capable. In addition, the International Seismic Safety Centre has compiled an ANNEX to SSG-9 on realizing seismic hazard assessment for nuclear facilities, showing the utility of deterministic and probabilistic evaluation methods for fault displacement. In Japan, it is required that important nuclear facilities be established on ground where fault displacement will not arise when earthquakes occur in the future. Under these circumstances, and based on these requirements, we need to develop evaluation methods for fault displacement to enhance safety in nuclear facilities. We are studying deterministic and probabilistic methods with tentative analyses using observed records, such as surface fault displacement and near-fault strong ground motions of inland crustal earthquakes in which fault displacements arose. In this study, we introduce the concept of evaluation methods for fault displacement. We then show parts of the tentative analysis results for the deterministic method as follows: (1) For the 1999 Chi-Chi earthquake, referring to the slip distribution estimated by waveform inversion, we construct a characterized source model (Miyake et al., 2003, BSSA) which can explain observed near-fault broadband strong ground motions. (2) Referring to the characterized source model constructed in (1), we study an evaluation method for surface fault displacement using a hybrid method which combines the particle method and the distinct element method. Finally, we suggest a deterministic method to evaluate fault displacement based on a characterized source model. This research was part of the 2015 research project 'Development of evaluating method for fault displacement' by the Secretariat of the Nuclear Regulation Authority (S/NRA), Japan.
Hu, Weigang; Zhang, Qi; Tian, Tian; Li, Dingyao; Cheng, Gang; Mu, Jing; Wu, Qingbai; Niu, Fujun; Stegen, James C; An, Lizhe; Feng, Huyuan
2015-01-01
Understanding the processes that influence the structure of biotic communities is one of the major ecological topics, and both stochastic and deterministic processes are expected to be at work simultaneously in most communities. Here, we investigated the vertical distribution patterns of bacterial communities in a 10-m-long soil core taken within permafrost of the Qinghai-Tibet Plateau. To get a better understanding of the forces that govern these patterns, we examined the diversity and structure of bacterial communities, and the change in community composition along the vertical distance (spatial turnover), from both taxonomic and phylogenetic perspectives. Measures of taxonomic and phylogenetic beta diversity revealed that bacterial community composition changed continuously along the soil core, and showed a vertical distance-decay relationship. Multiple stepwise regression analysis suggested that bacterial alpha diversity and phylogenetic structure were strongly correlated with soil conductivity and pH but weakly correlated with depth. There was evidence that deterministic and stochastic processes collectively drove the vertically structured pattern of the bacterial communities. Bacterial communities in five soil horizons (two originating from the active layer and three from permafrost) of the permafrost core were phylogenetically random, an indicator of stochastic processes. However, we found a stronger effect of deterministic processes related to soil pH, conductivity, and organic carbon content that structured the bacterial communities. We therefore conclude that the vertical distribution of bacterial communities was governed primarily by deterministic ecological selection, although stochastic processes were also at work. Furthermore, the strong impact of environmental conditions (for example, soil physicochemical parameters and seasonal freeze-thaw cycles) on these communities underlines the sensitivity of permafrost microorganisms to climate change and potentially subsequent permafrost thaw.
Modulation of thermal pain-related brain activity with virtual reality: evidence from fMRI.
Hoffman, Hunter G; Richards, Todd L; Coda, Barbara; Bills, Aric R; Blough, David; Richards, Anne L; Sharar, Sam R
2004-06-07
This study investigated the neural correlates of virtual reality analgesia. Virtual reality significantly reduced subjective pain ratings (i.e. analgesia). Using fMRI, pain-related brain activity was measured for each participant during conditions of no virtual reality and during virtual reality (order randomized). As predicted, virtual reality significantly reduced pain-related brain activity in all five regions of interest: the anterior cingulate cortex, primary and secondary somatosensory cortex, insula, and thalamus (p<0.002, corrected). Results showed direct modulation of human brain pain responses by virtual reality distraction. Copyright 2004 Lippincott Williams and Wilkins
The 'mad scientists': psychoanalysis, dream and virtual reality.
Leclaire, Marie
2003-04-01
The author explores the concept of reality-testing as a means of assessing the relationship with reality that prevails in dream and in virtual reality. Based on a model developed by Jean Laplanche, she compares these activities in detail in order to determine their respective independence from the function of reality-testing. By carefully examining the concept of hallucination in the writings of Freud and Daniel Dennett, the author seeks to pinpoint the specific modalities of interaction between perceptions, ideas, wishes and actions that converge in the 'belief' and in the 'sense of reality'. The paper's main thesis consists of the distinction it draws between immediacy-testing and reality-testing, with the further argument that this distinction not only dissipates the conceptual vagueness that generally surrounds the latter concept but also promotes a more precise analysis of the function of reality in dream and in virtual reality.
Reality television predicts both positive and negative outcomes for adolescent girls.
Ferguson, Christopher J; Salmond, Kimberlee; Modi, Kamla
2013-06-01
To assess the influence of media, specifically reality television, on adolescent behavior. A total of 1141 preteen and adolescent girls (age range 11-17) answered questions related to their reality television viewing, personality, self-esteem, relational aggression, appearance focus, and desire for fame. Our results indicated that the influence of reality television on adolescent behavior is complex and potentially related to the adolescents' intended uses and gratifications for using reality television. Reality television viewing was positively related to increased self-esteem and expectations of respect in dating relationships. However, watching reality television also was related to an increased focus on appearance and willingness to compromise other values for fame. Reality television viewing did not predict relational aggression. The potential influences of reality television use on adolescent girls are both positive and negative, defying easy categorization. Copyright © 2013 Mosby, Inc. All rights reserved.
Innovative application of virtual display technique in virtual museum
NASA Astrophysics Data System (ADS)
Zhang, Jiankang
2017-09-01
Virtual museum refers to displaying and simulating the functions of a real museum on the Internet in the form of 3D virtual reality by applying interactive programs. Built on the Virtual Reality Modeling Language, a virtual museum and its effective interaction with the offline museum depend on making full use of the 3D panorama technique, the virtual reality technique and the augmented reality technique, and on innovatively applying dynamic environment modeling, real-time 3D graphics generation, system integration and other key virtual reality techniques to ensure the overall design of the virtual museum. The 3D panorama technique, also known as panoramic photography or virtual reality, is a technique based on static images of reality. The virtual reality technique is a kind of computer simulation system that can create, and let users experience, an interactive 3D dynamic visual world. Augmented reality, also known as mixed reality, is a technique that simulates and mixes information (visual, sound, taste, touch, etc.) that is difficult for humans to experience in reality. These technologies make the virtual museum come true. It will not only bring better experiences and convenience to the public, but will also be conducive to improving the influence and cultural functions of the real museum.
Fransson, Boel A; Chen, Chi-Ya; Noyes, Julie A; Ragle, Claude A
2016-11-01
To determine the construct and concurrent validity of instrument motion metrics for laparoscopic skills assessment in virtual reality and augmented reality simulators. Evaluation study. Veterinary students (novice, n = 14) and veterinarians (experienced, n = 11) with no or variable laparoscopic experience. Participants' minimally invasive surgery (MIS) experience was determined from hospital records of MIS procedures performed in the Teaching Hospital. Basic laparoscopic skills were assessed by 5 tasks using a physical box trainer. Each participant completed 2 tasks for assessment in each type of simulator (virtual reality: bowel handling and cutting; augmented reality: object positioning and a pericardial window model). Motion metrics such as instrument path length, angle or drift, and economy of motion were recorded for each simulator. None of the motion metrics in the virtual reality simulator correlated with experience or with the basic laparoscopic skills score. All metrics in augmented reality were significantly correlated with experience (time, instrument path, and economy of movement), except for the hand-dominance metric. The basic laparoscopic skills score was correlated with all performance metrics in augmented reality. The augmented reality motion metrics differed between American College of Veterinary Surgeons diplomates and residents, whereas the basic laparoscopic skills score and virtual reality metrics did not. Our results provide construct and concurrent validity for motion analysis metrics in an augmented reality system, whereas the virtual reality system was validated only for the time score. © Copyright 2016 by The American College of Veterinary Surgeons.
Enterprise resource planning for hospitals.
van Merode, Godefridus G; Groothuis, Siebren; Hasman, Arie
2004-06-30
Integrated hospitals need a central planning and control system to plan patients' processes and the required capacity. Given the changes in healthcare, one can ask what type of information system can best support these healthcare delivery organizations. In this review we focus on the potential of enterprise resource planning (ERP) systems for healthcare delivery organizations. First, ERP systems are explained. An overview is then presented of the characteristics of the planning process in hospital environments. Problems with ERP that are due to the special characteristics of healthcare are presented. The situations in which ERP can or cannot be used are discussed. It is suggested to divide hospitals into a part that is concerned only with deterministic processes and a part that is concerned with non-deterministic processes. ERP can be very useful for planning and controlling the deterministic processes.
Large conditional single-photon cross-phase modulation
NASA Astrophysics Data System (ADS)
Beck, Kristin; Hosseini, Mahdi; Duan, Yiheng; Vuletic, Vladan
2016-05-01
Deterministic optical quantum logic requires a nonlinear quantum process that alters the phase of a quantum optical state by π through interaction with only one photon. Here, we demonstrate a large conditional cross-phase modulation between a signal field, stored inside an atomic quantum memory, and a control photon that traverses a high-finesse optical cavity containing the atomic memory. This approach avoids fundamental limitations associated with multimode effects for traveling optical photons. We measure a conditional cross-phase shift of up to π/3 between the retrieved signal and control photons, and confirm deterministic entanglement between the signal and control modes by extracting a positive concurrence. With a moderate improvement in cavity finesse, our system can reach a coherent phase shift of π at low loss, enabling deterministic and universal photonic quantum logic. Preprint: arXiv:1512.02166 [quant-ph]
Full 3D visualization tool-kit for Monte Carlo and deterministic transport codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frambati, S.; Frignani, M.
2012-07-01
We propose a package of tools capable of translating the geometric inputs and outputs of many Monte Carlo and deterministic radiation transport codes into open source file formats. These tools are aimed at bridging the gap between trusted, widely-used radiation analysis codes and very powerful, more recent and commonly used visualization software, thus supporting the design process and helping with shielding optimization. Three main lines of development were followed: mesh-based analysis of Monte Carlo codes, mesh-based analysis of deterministic codes and Monte Carlo surface meshing. The developed kit is considered a powerful and cost-effective tool in computer-aided design for radiation transport code users of the nuclear world, and in particular in the fields of core design and radiation analysis. (authors)
From Weakly Chaotic Dynamics to Deterministic Subdiffusion via Copula Modeling
NASA Astrophysics Data System (ADS)
Nazé, Pierre
2018-03-01
Copula modeling consists in finding a probabilistic distribution, called copula, whereby its coupling with the marginal distributions of a set of random variables produces their joint distribution. The present work aims to use this technique to connect the statistical distributions of weakly chaotic dynamics and deterministic subdiffusion. More precisely, we decompose the jumps distribution of Geisel-Thomae map into a bivariate one and determine the marginal and copula distributions respectively by infinite ergodic theory and statistical inference techniques. We verify therefore that the characteristic tail distribution of subdiffusion is an extreme value copula coupling Mittag-Leffler distributions. We also present a method to calculate the exact copula and joint distributions in the case where weakly chaotic dynamics and deterministic subdiffusion statistical distributions are already known. Numerical simulations and consistency with the dynamical aspects of the map support our results.
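For readers unfamiliar with the technique, Sklar's theorem is the statement behind copula modeling: any bivariate joint distribution H factors through its marginal distributions via a copula C, which in the paper's setting couples the marginals obtained from infinite ergodic theory and statistical inference.

```latex
H(x, y) = C\big(F_X(x),\, F_Y(y)\big)
```

Finding C given H and the margins, or building H from a chosen C and margins, is the modeling step the abstract refers to.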
Deterministic generation of multiparticle entanglement by quantum Zeno dynamics.
Barontini, Giovanni; Hohmann, Leander; Haas, Florian; Estève, Jérôme; Reichel, Jakob
2015-09-18
Multiparticle entangled quantum states, a key resource in quantum-enhanced metrology and computing, are usually generated by coherent operations exclusively. However, unusual forms of quantum dynamics can be obtained when environment coupling is used as part of the state generation. In this work, we used quantum Zeno dynamics (QZD), based on nondestructive measurement with an optical microcavity, to deterministically generate different multiparticle entangled states in an ensemble of 36 qubit atoms in less than 5 microseconds. We characterized the resulting states by performing quantum tomography, yielding a time-resolved account of the entanglement generation. In addition, we studied the dependence of quantum states on measurement strength and quantified the depth of entanglement. Our results show that QZD is a versatile tool for fast and deterministic entanglement generation in quantum engineering applications. Copyright © 2015, American Association for the Advancement of Science.
Deterministic error correction for nonlocal spatial-polarization hyperentanglement
Li, Tao; Wang, Guan-Yu; Deng, Fu-Guo; Long, Gui-Lu
2016-01-01
Hyperentanglement is an effective quantum source for quantum communication networks due to its high capacity, low loss rate, and its unusual ability to teleport a quantum particle fully. Here we present a deterministic error-correction scheme for nonlocal spatial-polarization hyperentangled photon pairs over collective-noise channels. In our scheme, the spatial-polarization hyperentanglement is first encoded into a spatially defined time-bin entanglement with identical polarization before it is transmitted over collective-noise channels, which leads to the rejection of errors in the spatial entanglement during transmission. The polarization noise affecting the polarization entanglement can be corrected with a proper one-step decoding procedure. The two parties in quantum communication can, in principle, obtain a nonlocal maximally entangled spatial-polarization hyperentanglement in a deterministic way, which makes our protocol more convenient than others in long-distance quantum communication. PMID:26861681
The application of diffraction grating in the design of virtual reality (VR) system
NASA Astrophysics Data System (ADS)
Chen, Jiekang; Huang, Qitai; Guan, Min
2017-10-01
Virtual reality (VR) products ultimately serve the human eye, so the optical properties of VR optical systems must be consistent with the characteristics of human vision. A monocular coaxial VR optical system was simulated in ZEMAX. A diffraction grating is added to the optical surface next to the eye; the light leaving the grating is deflected, forming an asymmetric field of view (FOV). The lateral chromatic aberration introduced by the grating was then corrected by the chromatic dispersion of a prism, and an aspheric surface was added to further optimize the design. The main difficulty in the optical design is balancing the dispersion of the diffraction grating against that of the prism. This balance was achieved by iteratively adjusting the parameters of the grating and the prism, and finally by using aspheric surfaces. To keep the asymmetric FOV of the system consistent with the angle of the visual axis, and to ensure a clear stereo vision area, the smaller half-FOV of the monocular system is required to reach 30°. Eventually, a system with an asymmetric FOV of 30°+40° was designed. In addition, the aberration curves of the system were analyzed in ZEMAX, and the binocular FOV was calculated according to the principle of binocular overlap. The results show that the asymmetric FOV of the VR monocular optical system can fit the human eye and that the imaging quality matches human visual characteristics. At the same time, the diffraction grating increases the binocular FOV, which relaxes the FOV requirement on the monocular design.
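The deflection at the heart of this design follows from the first-order grating equation. The sketch below evaluates it for a few visible wavelengths; their angular spread is the lateral chromatic aberration that the prism dispersion must cancel. The grating pitch and wavelengths are assumed values, not taken from the paper.

import numpy as np

def diffraction_angle(theta_in_deg, wavelength_nm, pitch_nm, order=1):
    # Grating equation: sin(theta_out) = sin(theta_in) + m * lambda / d
    s = np.sin(np.radians(theta_in_deg)) + order * wavelength_nm / pitch_nm
    return np.degrees(np.arcsin(s))

# Spread across the F, d and C spectral lines is the lateral color to correct
for lam in (486.0, 587.0, 656.0):  # nm
    print(lam, diffraction_angle(0.0, lam, pitch_nm=2000.0))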
NASA Astrophysics Data System (ADS)
Chikozho, C.; Kujinga, K.
2017-08-01
Decision makers in developing countries are often confronted by difficult choices regarding the selection and deployment of appropriate water supply governance regimes that sufficiently take into account national socio-economic and political realities. Indeed, scholars and practitioners alike continue to grapple with the need to create the optimum water supply and allocation decision-making space applicable to specific developing countries. In this paper, we review documented case studies from various parts of the world to explore the utility of free-market economics approaches in water supply governance. This is one of the major paradigms that have emerged in the face of enduring questions regarding how best to govern water supply systems in developing countries. In the paper, we postulate that increasing pressure on available natural resources may have already rendered obsolete some of the water supply governance regimes that have served human societies very well for many decades. Our main findings show that national and municipal water supply governance paradigms tend to change in tandem with emerging national development frameworks and priorities. While many developing countries have adopted water management and governance policy prescriptions from the international arena, national and local socio-economic and political realities ultimately determine what works and what does not work on the ground. We thus conclude that the choice of what constitutes an appropriate water supply governance regime in context is never simple. Indeed, the majority of case studies reviewed in the paper tend to rely on a mix of market economics and developmental statism to make their water governance regimes more realistic and workable on the ground.
Isaranuwatchai, Wanrudee; Brydges, Ryan; Carnahan, Heather; Backstein, David; Dubrowski, Adam
2014-05-01
While the ultimate goal of simulation training is to enhance learning, cost-effectiveness is a critical factor. Research that compares simulation training in terms of educational- and cost-effectiveness will lead to better-informed curricular decisions. Using previously published data, we conducted a cost-effectiveness analysis of three simulation-based programs. Medical students (n = 15 per group) practiced in one of three 2-h intravenous catheterization skills training programs: low-fidelity (virtual reality), high-fidelity (mannequin), or progressive (consisting of virtual reality, task trainer, and mannequin simulator). One week later, all performed a transfer test on a hybrid simulation (standardized patient with a task trainer). We used a net benefit regression model to identify the most cost-effective training program via paired comparisons. We also created a cost-effectiveness acceptability curve to visually represent the probability that one program is more cost-effective than its comparator at various 'willingness-to-pay' values. We conducted separate analyses for implementation and total costs. The results showed that the progressive program had the highest total cost (p < 0.001) whereas the high-fidelity program had the highest implementation cost (p < 0.001). While the most cost-effective program depended on the decision makers' willingness-to-pay value, the progressive training program was generally most educationally- and cost-effective. Our analyses suggest that a progressive program that strategically combines simulation modalities provides a cost-effective solution. More generally, we have introduced how a cost-effectiveness analysis may be applied to simulation training; a method that medical educators may use to inform investment decisions (e.g., purchasing cost-effective and educationally sound simulators).
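The net benefit regression named above can be sketched directly: convert each trainee's outcome into a net benefit NB = WTP x effect - cost, then regress NB on a treatment indicator; a positive coefficient favours the treatment at that willingness-to-pay. All numbers below are invented for illustration.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
treat = np.repeat([0, 1], 15)              # 15 trainees per arm (invented)
effect = rng.normal(60 + 8 * treat, 10)    # transfer-test score
cost = rng.normal(200 + 120 * treat, 20)   # cost per trainee
wtp = 50.0                                 # willingness to pay per score point
nb = wtp * effect - cost                   # individual net benefit
fit = sm.OLS(nb, sm.add_constant(treat)).fit()
print(fit.params[1])                       # > 0 favours the costlier program at this WTP

Sweeping wtp over a range and recording the coefficient traces out the cost-effectiveness acceptability curve the authors describe.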
Burden, Christy; Appleyard, Tracy-Louise; Angouri, Jo; Draycott, Timothy J; McDermott, Leanne; Fox, Robert
2013-10-01
Virtual-reality (VR) training has been demonstrated to improve laparoscopic surgical skills in the operating theatre. The incorporation of laparoscopic VR simulation into surgical training in gynaecology remains a significant educational challenge. We undertook a pilot study to assess the feasibility of implementing a laparoscopic VR simulation programme in a single unit. An observational study with qualitative analysis of semi-structured group interviews. Trainees in gynaecology (n=9) were scheduled to undertake a pre-validated structured training programme on a laparoscopic VR simulator (LapSim(®)) over six months. The main outcome measure was the trainees' progress through the training modules in six months. Trainees' perceptions of the feasibility and barriers to the implementation of laparoscopic VR training were assessed in focus groups after training. Sixty-six percent of participants completed six of ten modules. Overall, feedback from the focus groups was positive; trainees felt training improved their dexterity, hand-eye co-ordination and confidence in theatre. Negative aspects included the lack of haptic feedback and of a facility for laparoscopic port-placement training. Time restriction emerged as the main barrier to training. Despite positive perceptions of training, no trainee completed more than two-thirds of the modules of a self-directed laparoscopic VR training programme. Suggested improvements to the integration of future laparoscopic VR training include an additional theoretical component giving a fuller understanding of the benefits of VR training, and scheduled supervision. Ultimately, the success of a laparoscopic VR simulation training programme might only be improved if it is a mandatory component of the curriculum, together with dedicated time for training. Future multi-centred implementation studies of validated laparoscopic VR curricula are required. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Dreaming as a 'curtain of illusion': revisiting the 'royal road' with Bion as our guide.
Grotstein, James S
2009-08-01
One of Bion's most original contributions to psychoanalysis is his conception of dreaming, in which he elaborates, modifies, and extends Freud's ideas. While Freud dealt extensively with dream-work, he showed more interest in dreams themselves and their latent meaning and theorized that dreams ultimately constituted wish-fulfillments originating from the activity of the pleasure principle. Bion, on the other hand, focuses more on the process of dreaming itself and believes that dreaming occurs throughout the day as well as the night and serves the reality principle as well as the pleasure principle. In order for wakeful consciousness to occur, dreaming must absorb (contain) the day residue, and transfer it to System Ucs. from System Cs. for it to be processed (transformed) and then returned to System Cs. through the selectively-permeable contact-barrier. Dreaming, consequently, allows the subject to remain awake by day and asleep by night by its processing of the day's residue. Bion seems to conceive of dreaming as an ever-present invisible filter that overlays much of our mental life, including perception, as well as attention itself. He further believes that dreaming is a form of thinking that normally involves the collaborative yet oppositional (not conflictual) activity of the reality and pleasure principles as well as the primary and secondary processes. He also conflates Freud's primary and secondary processes into a single 'binary-oppositional' structure (Lévi-Strauss, 1958, 1970) that he terms 'alpha-function', which constitutes a virtual model that corresponds to the in-vivo activity of dreaming. He further believes that the analyst dreams as he or she listens and interprets and that the analysand likewise dreams while he or she freely associates.
Nano transfer and nanoreplication using deterministically grown sacrificial nanotemplates
Melechko, Anatoli V [Oak Ridge, TN; McKnight, Timothy E [Greenback, TN; Guillorn, Michael A [Ithaca, NY; Ilic, Bojan [Ithaca, NY; Merkulov, Vladimir I [Knoxville, TN; Doktycz, Mitchel J [Knoxville, TN; Lowndes, Douglas H [Knoxville, TN; Simpson, Michael L [Knoxville, TN
2012-03-27
Methods, manufactures, machines and compositions are described for nanotransfer and nanoreplication using deterministically grown sacrificial nanotemplates. An apparatus, includes a substrate and a nanoconduit material coupled to a surface of the substrate. The substrate defines an aperture and the nanoconduit material defines a nanoconduit that is i) contiguous with the aperture and ii) aligned substantially non-parallel to a plane defined by the surface of the substrate.
Robust Sensitivity Analysis for Multi-Attribute Deterministic Hierarchical Value Models
2002-03-01
such as the weighted sum method, the weighted product method, and the Analytic Hierarchy Process (AHP). This research focuses on only weighted sum... different groups. They can be termed deterministic, stochastic, or fuzzy multi-objective decision methods if they are classified according to the... weighted product model (WPM), and analytic hierarchy process (AHP). His method attempts to identify the most important criteria weight and the most
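For reference, the weighted sum method the research focuses on reduces to a dot product of attribute weights with scored alternatives; the weights and scores below are invented.

import numpy as np

weights = np.array([0.5, 0.3, 0.2])   # attribute weights, summing to 1
scores = np.array([[0.8, 0.6, 0.9],   # alternative A, scores on a 0-1 value scale
                   [0.7, 0.9, 0.5]])  # alternative B
value = scores @ weights              # overall value of each alternative
print(value, value.argmax())          # the preferred alternative has the highest value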
Tularosa Basin Play Fairway Analysis Data and Models
Nash, Greg
2017-07-11
This submission includes raster datasets for each layer of evidence used for weights of evidence analysis as well as the deterministic play fairway analysis (PFA). Data representative of heat, permeability and groundwater comprises some of the raster datasets. Additionally, the final deterministic PFA model is provided along with a certainty model. All of these datasets are best used with an ArcGIS software package, specifically Spatial Data Modeler.
Evaluation of Seismicity West of the Lut Block for Deterministic Seismic Hazard Assessment of Shahdad, Iran
NASA Astrophysics Data System (ADS)
Ney, B.; Askari, M.
2009-04-01
Seismic hazard assessment has been carried out for the city of Shahdad, and four maps (Kerman, Bam, Nakhil Ab, Allah Abad) have been prepared to indicate the deterministic estimate of peak ground acceleration (PGA) in this area. Deterministic seismic hazard assessment was performed for a region in eastern Iran (Shahdad) based on the available geological, seismological, and geophysical information, and a seismic zoning map of the region was constructed. First, a seismotectonic map of the study region within a radius of 100 km was prepared using geological maps, the distribution of historical and instrumental earthquake data, and focal mechanism solutions; it served as the base map for delineating potential seismic sources. Then, the minimum distance from each seismic source to the site (Shahdad) and the maximum magnitude of each source were determined. According to the results, the peak ground acceleration at Shahdad, estimated using the Fukushima & Tanaka (1990) attenuation relationship, is 0.58 g; this is associated with movement of the Nayband fault at a distance of 2.4 km from the site with maximum magnitude Ms = 7.5.
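The reported value can be checked against the commonly cited form of the Fukushima & Tanaka (1990) attenuation relation, with mean PGA A in cm/s^2 (the coefficients are quoted here from the literature and should be treated as an assumption): log10 A = 0.41 Ms - log10(R + 0.032 * 10^(0.41 Ms)) - 0.0034 R + 1.30.

import math

def pga_g(ms, r_km):
    # Fukushima & Tanaka (1990) attenuation, converted from cm/s^2 to g
    a = 0.41 * ms
    log_a = a - math.log10(r_km + 0.032 * 10**a) - 0.0034 * r_km + 1.30
    return 10**log_a / 981.0

print(round(pga_g(7.5, 2.4), 2))  # ~0.59 g, consistent with the reported 0.58 g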
NASA Astrophysics Data System (ADS)
Ramos, José A.; Mercère, Guillaume
2016-12-01
In this paper, we present an algorithm for identifying two-dimensional (2D) causal, recursive and separable-in-denominator (CRSD) state-space models in the Roesser form with deterministic-stochastic inputs. The algorithm implements the N4SID, PO-MOESP and CCA methods, which are well known in the literature on 1D system identification, but here we do so for the 2D CRSD Roesser model. The algorithm solves the 2D system identification problem by maintaining the constraint structure imposed by the problem (i.e. Toeplitz and Hankel) and computes the horizontal and vertical system orders, system parameter matrices and covariance matrices of a 2D CRSD Roesser model. From a computational point of view, the algorithm has been presented in a unified framework, where the user can select which of the three methods to use. Furthermore, the identification task is divided into three main parts: (1) computing the deterministic horizontal model parameters, (2) computing the deterministic vertical model parameters and (3) computing the stochastic components. Specific attention has been paid to the computation of a stabilised Kalman gain matrix and a positive real solution when required. The efficiency and robustness of the unified algorithm have been demonstrated via a thorough simulation example.
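As a reference for the model class (not the identification algorithm itself), the sketch below simulates a 2D CRSD Roesser recursion, in which a horizontal state propagates along i and a vertical state along j; the partitioned matrices and state orders are invented for illustration.

import numpy as np

nh, nv = 1, 1                      # horizontal and vertical state orders
A = np.array([[0.5, 0.2],
              [0.1, 0.4]])         # acts on the stacked state [x_h; x_v]
B = np.array([[1.0], [0.5]])
C = np.array([[1.0, 1.0]])
D = np.array([[0.0]])

def simulate(u):
    # Roesser recursion: [x_h(i+1,j); x_v(i,j+1)] = A [x_h; x_v] + B u(i,j)
    I, J = u.shape
    xh = np.zeros((I + 1, J, nh))
    xv = np.zeros((I, J + 1, nv))
    y = np.zeros((I, J))
    for i in range(I):
        for j in range(J):
            x = np.concatenate([xh[i, j], xv[i, j]])
            nxt = A @ x + B @ [u[i, j]]
            xh[i + 1, j], xv[i, j + 1] = nxt[:nh], nxt[nh:]
            y[i, j] = (C @ x + D @ [u[i, j]])[0]
    return y

print(simulate(np.ones((4, 4))))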
Deterministic Migration-Based Separation of White Blood Cells.
Kim, Byeongyeon; Choi, Young Joon; Seo, Hyekyung; Shin, Eui-Cheol; Choi, Sungyoung
2016-10-01
Functional and phenotypic analyses of peripheral white blood cells provide useful clinical information. However, separation of white blood cells from peripheral blood requires a time-consuming, inconvenient process, and thus analyses of separated white blood cells are limited in clinical settings. To overcome this limitation, a microfluidic separation platform is developed to enable deterministic migration of white blood cells, directing the cells into designated positions according to a ridge pattern. The platform uses slant ridge structures on the channel top to induce the deterministic migration, which allows efficient and high-throughput separation of white blood cells from unprocessed whole blood. The extent of the deterministic migration under various rheological conditions is explored, enabling highly efficient migration of white blood cells in whole blood and achieving high-throughput separation of the cells (processing 1 mL of whole blood in less than 7 min). In the separated cell population, the composition of lymphocyte subpopulations is well preserved, and T cells secrete cytokines without any functional impairment. On the basis of the results, this microfluidic platform is a promising tool for the rapid enrichment of white blood cells, and it is useful for functional and phenotypic analyses of peripheral white blood cells. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
de Macedo, Isadora A. S.; da Silva, Carolina B.; de Figueiredo, J. J. S.; Omoboya, Bode
2017-01-01
Wavelet estimation as well as seismic-to-well tie procedures are at the core of every seismic interpretation workflow. In this paper we perform a comparative study of wavelet estimation methods for seismic-to-well tie. Two approaches to wavelet estimation are discussed: a deterministic estimation, based on both seismic and well log data, and a statistical estimation, based on predictive deconvolution and the classical assumptions of the convolutional model, which provides a minimum-phase wavelet. Our algorithms for both wavelet estimation methods introduce a semi-automatic approach to determine the optimum parameters of the deterministic and statistical estimations and, further, to estimate the optimum seismic wavelet by searching for the highest correlation coefficient between the recorded trace and the synthetic trace when the time-depth relationship is accurate. Tests with numerical data yield some qualitative conclusions, which are probably useful for seismic inversion and interpretation of field data, by comparing deterministic and statistical wavelet estimation in detail, especially for the field data example. The feasibility of this approach is verified on real seismic and well data from the Viking Graben field, North Sea, Norway. Our results also show the influence of washout zones in the well log data on the quality of the well-to-seismic tie.
Daleo, Pedro; Alberti, Juan; Jumpponen, Ari; Veach, Allison; Ialonardi, Florencia; Iribarne, Oscar; Silliman, Brian
2018-06-01
Microbial community assembly is affected by a combination of forces that act simultaneously, but the mechanisms underpinning their relative influences remain elusive. This gap strongly limits our ability to predict human impacts on microbial communities and the processes they regulate. Here, we experimentally demonstrate that increased salinity stress, food web alteration and nutrient loading interact to drive outcomes in salt marsh fungal leaf communities. Both salinity stress and food web alterations drove communities to deterministically diverge, resulting in distinct fungal communities. Increased nutrient loads, nevertheless, partially suppressed the influence of other factors as determinants of fungal assembly. Using a null model approach, we found that increased nutrient loads enhanced the relative importance of stochastic over deterministic divergent processes; without increased nutrient loads, samples from different treatments showed a relatively (deterministic) divergent community assembly whereas increased nutrient loads drove the system to more stochastic assemblies, suppressing the effect of other treatments. These results demonstrate that common anthropogenic modifications can interact to control fungal community assembly. Furthermore, our results suggest that when the environmental conditions are spatially heterogeneous (as in our case, caused by specific combinations of experimental treatments), increased stochasticity caused by greater nutrient inputs can reduce the importance of deterministic filters that otherwise caused divergence, thus driving to microbial community homogenization. © 2018 by the Ecological Society of America.
Combining Deterministic structures and stochastic heterogeneity for transport modeling
NASA Astrophysics Data System (ADS)
Zech, Alraune; Attinger, Sabine; Dietrich, Peter; Teutsch, Georg
2017-04-01
Contaminant transport in highly heterogeneous aquifers is extremely challenging and a subject of current scientific debate. Tracer plumes often show non-symmetric, highly skewed shapes. Predicting such transport behavior using the classical advection-dispersion equation (ADE) in combination with a stochastic description of aquifer properties requires a dense measurement network, which is in contrast to the information available for most aquifers. A new conceptual aquifer structure model is presented which combines large-scale deterministic information with a stochastic approach for incorporating sub-scale heterogeneity. The conceptual model is designed to allow a goal-oriented, site-specific transport analysis making use of as few data as possible. The basic idea is to reproduce highly skewed tracer plumes in heterogeneous media by incorporating deterministic contrasts and effects of connectivity instead of using unimodal heterogeneous models with high variances. The conceptual model consists of deterministic blocks of mean hydraulic conductivity, which might be measured by pumping tests and can differ by orders of magnitude. Sub-scale heterogeneity is introduced within every block and can be modeled as bimodal or log-normally distributed. The impact of input parameters, structure, and conductivity contrasts is investigated in a systematic manner. Furthermore, a first successful implementation of the model was achieved for the well-known MADE site.
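A minimal sketch of the conceptual structure: a one-dimensional conductivity field assembled from deterministic blocks whose mean values differ by orders of magnitude, each with log-normal sub-scale heterogeneity. Block means, variance, and cell counts are assumed values.

import numpy as np

rng = np.random.default_rng(1)
block_means = [1e-5, 1e-3]            # block mean K in m/s, contrasting by two orders
cells_per_block, var_lnK = 50, 1.0    # sub-scale log-conductivity variance
K = np.concatenate([
    np.exp(rng.normal(np.log(m), np.sqrt(var_lnK), cells_per_block))
    for m in block_means
])                                    # conductivity along the flow path
print(K.mean(), K.min(), K.max())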
Tyson, Reny B; Nowacek, Douglas P; Miller, Patrick J O
2007-09-01
Nonlinear phenomena or nonlinearities in animal vocalizations include features such as subharmonics, deterministic chaos, biphonation, and frequency jumps that until recently were generally ignored in acoustic analyses. Recent documentation of these phenomena in several species suggests that they may play a communicative role, though the exact function is still under investigation. Here, qualitative descriptions and quantitative analyses of nonlinearities in the vocalizations of killer whales (Orcinus orca) and North Atlantic right whales (Eubalaena glacialis) are provided. All four nonlinear features were present in both species, with at least one feature occurring in 92.4% of killer and 65.7% of right whale vocalizations analyzed. Occurrence of biphonation varied the most between species, being present in 89.0% of killer whale vocalizations and only 20.4% of right whale vocalizations. Because deterministic chaos is qualitatively and quantitatively different from random or Gaussian noise, a program (TISEAN) designed specifically to identify deterministic chaos was used to confirm the presence of this nonlinearity. All segments tested in this software indicate that both species do indeed exhibit deterministic chaos. The results of this study provide confirmation that such features are common in the vocalizations of cetacean species and lay the groundwork for future studies.
Modeling the within-host dynamics of cholera: bacterial-viral interaction.
Wang, Xueying; Wang, Jin
2017-08-01
Novel deterministic and stochastic models are proposed in this paper for the within-host dynamics of cholera, with a focus on the bacterial-viral interaction. The deterministic model is a system of differential equations describing the interaction among the two types of vibrios and the viruses. The stochastic model is a system of Markov jump processes that is derived based on the dynamics of the deterministic model. The multitype branching process approximation is applied to estimate the extinction probability of bacteria and viruses within a human host during the early stage of the bacterial-viral infection. Accordingly, a closed-form expression is derived for the disease extinction probability, and analytic estimates are validated with numerical simulations. The local and global dynamics of the bacterial-viral interaction are analysed using the deterministic model, and the result indicates that there is a sharp disease threshold characterized by the basic reproduction number R0: if R0 ≤ 1, vibrios ingested from the environment into the human body will not cause cholera infection; if R0 > 1, vibrios will grow with increased toxicity and persist within the host, leading to human cholera. In contrast, the stochastic model indicates, more realistically, that there is always a positive probability of disease extinction within the human host.
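The deterministic side lends itself to a compact sketch. The system below is an invented stand-in with the same flavour (ingested vibrios, host-adapted vibrios, and phage), not the authors' equations; every rate constant is an assumption.

import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, r=0.8, k=1e6, a=0.5, b=0.3, d=0.4, m=0.2):
    # y = (environmental vibrios, host-adapted vibrios, phage); invented dynamics
    B_env, B_host, V = y
    dBe = -a * B_env                                     # ingested vibrios convert or decay
    dBh = r * B_host * (1 - B_host / k) + a * B_env - b * B_host * V
    dV = d * B_host * V - m * V
    return [dBe, dBh, dV]

sol = solve_ivp(rhs, (0, 50), [1e3, 0.0, 10.0])
print(sol.y[:, -1])  # persistence or washout depends on the rates (an R0-like threshold)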
Demographic noise can reverse the direction of deterministic selection
Constable, George W. A.; Rogers, Tim; McKane, Alan J.; Tarnita, Corina E.
2016-01-01
Deterministic evolutionary theory robustly predicts that populations displaying altruistic behaviors will be driven to extinction by mutant cheats that absorb common benefits but do not themselves contribute. Here we show that when demographic stochasticity is accounted for, selection can in fact act in the reverse direction to that predicted deterministically, instead favoring cooperative behaviors that appreciably increase the carrying capacity of the population. Populations that exist in larger numbers experience a selective advantage by being more stochastically robust to invasions than smaller populations, and this advantage can persist even in the presence of reproductive costs. We investigate this general effect in the specific context of public goods production and find conditions for stochastic selection reversal leading to the success of public good producers. This insight, developed here analytically, is missed by the deterministic analysis as well as by standard game theoretic models that enforce a fixed population size. The effect is found to be amplified by space; in this scenario we find that selection reversal occurs within biologically reasonable parameter regimes for microbial populations. Beyond the public good problem, we formulate a general mathematical framework for models that may exhibit stochastic selection reversal. In this context, we describe a stochastic analog to r−K theory, by which small populations can evolve to higher densities in the absence of disturbance. PMID:27450085
Deterministic and stochastic models for middle east respiratory syndrome (MERS)
NASA Astrophysics Data System (ADS)
Suryani, Dessy Rizki; Zevika, Mona; Nuraini, Nuning
2018-03-01
World Health Organization (WHO) data state that since September 2012 there have been 1,733 cases of Middle East Respiratory Syndrome (MERS), with 628 deaths, occurring in 27 countries. MERS was first identified in Saudi Arabia in 2012, and the largest outbreak of MERS outside Saudi Arabia occurred in South Korea in 2015. MERS is a disease of the respiratory system caused by infection with MERS-CoV. MERS-CoV transmission occurs directly, through contact between an infected and a non-infected individual, or indirectly, through objects contaminated by the free virus. MERS is suspected to spread quickly because of free virus in the environment. Mathematical modeling is used to illustrate the transmission of MERS using a deterministic model and a stochastic model. The deterministic model is used to investigate the temporal dynamics of the system and analyze its steady states. The stochastic approach, using a Continuous-Time Markov Chain (CTMC), predicts future states via random variables. The threshold values for the deterministic and stochastic models take the same form, and the probability of disease extinction can be computed with the stochastic model. Simulations of both models using several different parameter sets are shown, and the probability of disease extinction is compared across several initial conditions.
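The stochastic CTMC approach can be illustrated with a standard Gillespie simulation. The sketch below uses a generic SIR-type chain rather than the authors' MERS compartments; beta, gamma, and the initial state are assumed values.

import numpy as np

def gillespie_sir(beta=0.3, gamma=0.1, S=500, I=5, R=0, t_end=100.0, seed=0):
    rng = np.random.default_rng(seed)
    N, t = S + I + R, 0.0
    while I > 0 and t < t_end:
        rates = np.array([beta * S * I / N, gamma * I])  # infection, recovery
        total = rates.sum()
        t += rng.exponential(1.0 / total)                # time to next event
        if rng.random() < rates[0] / total:
            S, I = S - 1, I + 1                          # infection event
        else:
            I, R = I - 1, R + 1                          # recovery event
    return R                                             # final epidemic size

print([gillespie_sir(seed=s) for s in range(5)])  # small values indicate early extinction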
Parallel Stochastic discrete event simulation of calcium dynamics in neuron.
Ishlam Patoary, Mohammad Nazrul; Tropper, Carl; McDougal, Robert A; Zhongwei, Lin; Lytton, William W
2017-09-26
The intracellular calcium signaling pathways of a neuron depend on both biochemical reactions and diffusion. Some quasi-isolated compartments (e.g. spines) are so small, and calcium concentrations so low, that one extra molecule diffusing in by chance can make a nontrivial percentage difference in concentration. These rare events can affect the dynamics discretely, in such a way that they cannot be evaluated by a deterministic simulation. Stochastic models of such a system provide a more detailed understanding than existing deterministic models because they capture behavior at the molecular level. Our research focuses on the development of a high-performance parallel discrete event simulation environment, Neuron Time Warp (NTW), intended for the parallel simulation of stochastic reaction-diffusion systems such as intracellular calcium signaling. NTW is integrated with NEURON, a simulator which is widely used within the neuroscience community. We simulate two models, a calcium buffer model and a calcium wave model. The calcium buffer model is employed to verify the correctness and performance of NTW by comparing it to a serial deterministic simulation in NEURON. We also derived a discrete event calcium wave model from a deterministic model using the stochastic IP3R structure.
NASA Astrophysics Data System (ADS)
Fischer, P.; Jardani, A.; Lecoq, N.
2018-02-01
In this paper, we present a novel inverse modeling method called Discrete Network Deterministic Inversion (DNDI) for mapping the geometry and properties of the discrete network of conduits and fractures in karstified aquifers. The DNDI algorithm is based on a coupled discrete-continuum concept to simulate water flows numerically in a model, and a deterministic optimization algorithm to invert a set of observed piezometric data recorded during multiple pumping tests. In this method, the model is partitioned into subspaces piloted by a set of parameters (matrix transmissivity, and the geometry and equivalent transmissivity of the conduits) that are considered unknown. In this way, the deterministic optimization process can iteratively correct the geometry of the network and the values of the properties until it converges to a global network geometry in a solution model able to reproduce the set of data. An uncertainty analysis of this result can be performed from the maps of posterior uncertainties on the network geometry or on the property values. This method has been successfully tested on three theoretical, simplified study cases with hydraulic response data generated from hypothetical karstic models of increasing complexity in network geometry and matrix heterogeneity.
Freeman, Daniel; Bradley, Jonathan; Antley, Angus; Bourke, Emilie; DeWeever, Natalie; Evans, Nicole; Černis, Emma; Sheaves, Bryony; Waite, Felicity; Dunn, Graham; Slater, Mel; Clark, David M.
2016-01-01
Background Persecutory delusions may be unfounded threat beliefs maintained by safety-seeking behaviours that prevent disconfirmatory evidence being successfully processed. Use of virtual reality could facilitate new learning. Aims To test the hypothesis that enabling patients to test the threat predictions of persecutory delusions in virtual reality social environments with the dropping of safety-seeking behaviours (virtual reality cognitive therapy) would lead to greater delusion reduction than exposure alone (virtual reality exposure). Method Conviction in delusions and distress in a real-world situation were assessed in 30 patients with persecutory delusions. Patients were then randomised to virtual reality cognitive therapy or virtual reality exposure, both with 30 min in graded virtual reality social environments. Delusion conviction and real-world distress were then reassessed. Results In comparison with exposure, virtual reality cognitive therapy led to large reductions in delusional conviction (reduction 22.0%, P = 0.024, Cohen's d = 1.3) and real-world distress (reduction 19.6%, P = 0.020, Cohen's d = 0.8). Conclusion Cognitive therapy using virtual reality could prove highly effective in treating delusions. PMID:27151071
Virtual reality and paranoid ideations in people with an 'at-risk mental state' for psychosis.
Valmaggia, Lucia R; Freeman, Daniel; Green, Catherine; Garety, Philippa; Swapp, David; Antley, Angus; Prescott, Corinne; Fowler, David; Kuipers, Elizabeth; Bebbington, Paul; Slater, Mel; Broome, Matthew; McGuire, Philip K
2007-12-01
Virtual reality provides a means of studying paranoid thinking in controlled laboratory conditions. However, this method has not been used with a clinical group. To establish the feasibility and safety of using virtual reality methodology in people with an at-risk mental state and to investigate the applicability of a cognitive model of paranoia to this group. Twenty-one participants with an at-risk mental state were assessed before and after entering a virtual reality environment depicting the inside of an underground train. Virtual reality did not raise levels of distress at the time of testing or cause adverse experiences over the subsequent week. Individuals attributed mental states to virtual reality characters including hostile intent. Persecutory ideation in virtual reality was predicted by higher levels of trait paranoia, anxiety, stress, immersion in virtual reality, perseveration and interpersonal sensitivity. Virtual reality is an acceptable experimental technique for use with individuals with at-risk mental states. Paranoia in virtual reality was understandable in terms of the cognitive model of persecutory delusions.
Virtual reality simulators and training in laparoscopic surgery.
Yiannakopoulou, Eugenia; Nikiteas, Nikolaos; Perrea, Despina; Tsigris, Christos
2015-01-01
Virtual reality simulators provide basic skills training without supervision in a controlled environment, free of the pressure of operating on patients. Skills obtained through virtual reality simulation training can be transferred to the operating room. However, the relevant evidence is limited, with data available only for basic surgical skills and for laparoscopic cholecystectomy. No data exist on the effect of virtual reality simulation on performance of advanced surgical procedures. Evidence suggests that performance on virtual reality simulators reliably distinguishes experienced from novice surgeons. Limited available data suggest that an independent approach to virtual reality simulation training is not different from a proctored approach. The effect of virtual reality simulator training on the acquisition of basic surgical skills does not seem to differ from that of physical simulators. Limited data exist on the effect of virtual reality simulation training on the acquisition of visual-spatial perception and stress-coping skills. Undoubtedly, virtual reality simulation training provides an alternative means of improving performance in laparoscopic surgery. However, future research efforts should focus on the effect of virtual reality simulation on performance in the context of advanced surgical procedures, on the standardization of training, on the possible synergistic effect of virtual reality simulation combined with mental training, and on personalized training. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
Virtual reality for emergency training
DOE Office of Scientific and Technical Information (OSTI.GOV)
Altinkemer, K.
1995-12-31
Virtual reality is a sequence of scenes generated by a computer in response to the five senses: sight, sound, taste, touch, and smell. Other senses that can be used in virtual reality include balance and pheromonal and immunological senses. Application areas include leisure and entertainment, medicine, architecture, engineering, manufacturing, and training. Virtual reality is especially important when it is used for emergency training and the management of natural disasters, including earthquakes, floods, tornados, and other situations that are hard to emulate. Classical training methods for these extraordinary environments lack the realistic surroundings that virtual reality can provide. For virtual reality to be a successful training tool, the design needs to address certain aspects, such as how realistic the virtual reality should be and how much fixed cost is entailed in setting up the virtual reality trainer. There are also pricing questions regarding the price per training session on the virtual reality trainer and the appropriate training session length(s).
NASA Astrophysics Data System (ADS)
Harland, S. R.; Browning, J.; Healy, D.; Meredith, P. G.; Mitchell, T. M.
2017-12-01
Ultimate failure in brittle rocks is commonly accepted to occur as a coalescence of micro-crack damage into a single failure plane. The geometry of the cracks (damage) within the medium, and its evolution with stress, will play a role in dictating the geometry of the ultimate failure plane. Currently, the majority of experimental studies investigating damage evolution and rock failure use conventional triaxial stress states (σ1 > σ2 = σ3). Results from these tests can easily be represented on a Mohr-Coulomb plot (σn - τ), conveniently allowing the user to determine the geometry of the resultant failure plane. In reality, however, stress in the subsurface is generally truly triaxial (σ1 > σ2 > σ3), and in this case the Mohr-Coulomb failure criterion is inadequate, as it incorporates no dependence on the intermediate stress (σ2), which has been shown to play an important role in controlling failure. It has recently been shown that differential stress is the key driver in initiating crack growth, regardless of the mean stress. Polyaxial failure criteria that incorporate the effect of the intermediate stress do exist and include the Modified Lade, Modified Wiebols and Cook, and Drucker-Prager criteria. However, unlike the Mohr-Coulomb failure criterion, these polyaxial criteria do not offer any prediction of, or insight into, the geometry of the resultant failure plane. An additional shortcoming of all of the common conventional and polyaxial failure criteria is that they fail to describe the geometry of the damage (i.e. pre-failure microcracking) envelope with progressive stress; it is commonly assumed that the damage envelope is parallel to the ultimate brittle failure envelope. Here we use previously published polyaxial failure data for the Shirahama sandstone and Westerly granite to illustrate that the commonly used Mohr-Coulomb and polyaxial failure criteria do not sufficiently describe or capture failure or damage envelopes under true triaxial stress states. We investigate if and how Mohr-Coulomb type constructions can provide geometrical solutions to truly triaxial problems. We incorporate both the intermediate stress and the differential stress as the controlling parameters in failure, and examine the geometry of damage envelopes using damage onset data.
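To make the contrast concrete, the sketch below evaluates a Mohr-Coulomb check, which ignores sigma_2, alongside a Drucker-Prager check, which includes it through the invariants I1 and J2; the cohesion, friction angle, A, B, and the stress state are invented values.

import numpy as np

def mohr_coulomb_exceeded(s1, s3, c=30.0, phi_deg=35.0):
    # (s1 - s3)/2 > c*cos(phi) + ((s1 + s3)/2)*sin(phi); no sigma_2 dependence
    phi = np.radians(phi_deg)
    return (s1 - s3) / 2 > c * np.cos(phi) + (s1 + s3) / 2 * np.sin(phi)

def drucker_prager_exceeded(s1, s2, s3, A=20.0, B=0.25):
    # sqrt(J2) > A + B*I1, with J2 the second deviatoric stress invariant
    i1 = s1 + s2 + s3
    mean = i1 / 3
    j2 = ((s1 - mean)**2 + (s2 - mean)**2 + (s3 - mean)**2) / 2
    return np.sqrt(j2) > A + B * i1

print(mohr_coulomb_exceeded(200.0, 50.0),
      drucker_prager_exceeded(200.0, 120.0, 50.0))  # only the second sees sigma_2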
1993-04-01
Virtual Reality (James F. Dailey, Lieutenant Colonel, US...). This paper reviews the exciting field of virtual reality. The author describes the basic concepts of virtual reality and finds that its numerous potential benefits to society could revolutionize everyday life. The various components that make up a virtual reality system are described in detail.
NASA Astrophysics Data System (ADS)
Roberts, Sean; Eykholt, R.; Thaut, Michael H.
2000-08-01
We investigate rhythmic finger tapping in both the presence and the absence of a metronome. We examine both the time intervals between taps and the time lags between the stimulus tones from the metronome and the response taps by the subject. We analyze the correlations in these data sets, and we search for evidence of deterministic chaos, as opposed to randomness, in the fluctuations.
Deterministic annealing for density estimation by multivariate normal mixtures
NASA Astrophysics Data System (ADS)
Kloppenburg, Martin; Tavan, Paul
1997-03-01
An approach to maximum-likelihood density estimation by mixtures of multivariate normal distributions for large high-dimensional data sets is presented. Conventionally that problem is tackled by notoriously unstable expectation-maximization (EM) algorithms. We remove these instabilities by the introduction of soft constraints, enabling deterministic annealing. Our developments are motivated by the proof that algorithmically stable fuzzy clustering methods that are derived from statistical physics analogs are special cases of EM procedures.
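A toy version of the annealing idea: EM-style updates for a two-component, unit-variance, equal-weight 1D Gaussian mixture (all simplifying assumptions), with responsibilities computed at an inverse temperature beta that is annealed toward 1, which suppresses the hard, unstable assignments of plain EM early on.

import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])
mu = np.array([-0.5, 0.5])                 # deliberately crude initialization

for beta in (0.1, 0.3, 0.6, 1.0):          # annealing schedule
    for _ in range(20):
        d = -0.5 * (x[:, None] - mu[None, :])**2           # log-likelihood kernel
        r = np.exp(beta * d)                               # softened responsibilities
        r /= r.sum(axis=1, keepdims=True)
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)  # weighted means

print(mu)  # approaches the true component means (-2, 3)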
2015-07-06
...coherent optical control steps in preparation for deterministic spin-photon entanglement; (3) demonstration of initialization of the 2 qubit states; (4) demonstration of nonlocal nuclear... Demonstration of a flying qubit by entanglement of the quantum dot spin polarization with the polarization of a spontaneously emitted photon. Future...
Ada programming guidelines for deterministic storage management
NASA Technical Reports Server (NTRS)
Auty, David
1988-01-01
Previous reports have established that a program can be written in the Ada language such that the program's storage management requirements are determinable prior to its execution. Specific guidelines for ensuring such deterministic usage of Ada dynamic storage requirements are described. Because requirements may vary from one application to another, guidelines are presented in a most-restrictive to least-restrictive fashion to allow the reader to match appropriate restrictions to the particular application area under investigation.
NASA Astrophysics Data System (ADS)
Lemarchand, A.; Lesne, A.; Mareschal, M.
1995-05-01
The reaction-diffusion equation associated with the Fisher chemical model A+B-->2A admits wave-front solutions in which a stable stationary state replaces an unstable one. The deterministic analysis concludes that their propagation velocity is not prescribed by the dynamics. For a large class of initial conditions the velocity which is spontaneously selected is equal to the minimum allowed velocity vmin, as predicted by the marginal stability criterion. In order to test the relevance of this deterministic description, we investigate the macroscopic consequences, on the velocity and the width of the front, of the intrinsic stochasticity due to the underlying microscopic dynamics. We numerically solve the Langevin equations, deduced analytically from the master equation within a system-size expansion procedure. We show that the mean profile associated with the stochastic solution propagates faster than the deterministic solution, at a velocity up to 25% greater than vmin.
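For reference, the marginal-stability prediction takes a compact form for fronts of the Fisher-KPP type; identifying the one-variable reduction of the A+B-->2A kinetics with this equation is an assumption here:

    \partial_t u = D\,\partial_x^2 u + k\,u(1-u), \qquad v_{\min} = 2\sqrt{Dk}.

Sufficiently steep (e.g. compactly supported) initial conditions relax onto the front propagating at v_min, which is the velocity selected in the deterministic description.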
Hybrid stochastic and deterministic simulations of calcium blips.
Rüdiger, S; Shuai, J W; Huisinga, W; Nagaiah, C; Warnecke, G; Parker, I; Falcke, M
2007-09-15
Intracellular calcium release is a prime example for the role of stochastic effects in cellular systems. Recent models consist of deterministic reaction-diffusion equations coupled to stochastic transitions of calcium channels. The resulting dynamics is of multiple time and spatial scales, which complicates far-reaching computer simulations. In this article, we introduce a novel hybrid scheme that is especially tailored to accurately trace events with essential stochastic variations, while deterministic concentration variables are efficiently and accurately traced at the same time. We use finite elements to efficiently resolve the extreme spatial gradients of concentration variables close to a channel. We describe the algorithmic approach and we demonstrate its efficiency compared to conventional methods. Our single-channel model matches experimental data and results in intriguing dynamics if calcium is used as charge carrier. Random openings of the channel accumulate in bursts of calcium blips that may be central for the understanding of cellular calcium dynamics.
A Stochastic Tick-Borne Disease Model: Exploring the Probability of Pathogen Persistence.
Maliyoni, Milliward; Chirove, Faraimunashe; Gaff, Holly D; Govinder, Keshlan S
2017-09-01
We formulate and analyse a stochastic epidemic model for the transmission dynamics of a tick-borne disease in a single population using a continuous-time Markov chain approach. The stochastic model is based on an existing deterministic metapopulation tick-borne disease model. We compare the disease dynamics of the deterministic and stochastic models in order to determine the effect of randomness in tick-borne disease dynamics. The probability of disease extinction and that of a major outbreak are computed and approximated using the multitype Galton-Watson branching process and numerical simulations, respectively. Analytical and numerical results show some significant differences in model predictions between the stochastic and deterministic models. In particular, we find that a disease outbreak is more likely if the disease is introduced by infected deer as opposed to infected ticks. These insights demonstrate the importance of host movement in the expansion of tick-borne diseases into new geographic areas.
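The branching-process step can be sketched in a few lines. The paper uses a multitype Galton-Watson process; the stand-in below is single-type with a geometric offspring law of mean R0 (an assumption), for which the extinction probability is the smallest fixed point q = f(q) of the offspring probability generating function.

def extinction_probability(R0, tol=1e-12):
    # pgf of a geometric offspring law with mean R0: f(s) = 1 / (1 + R0*(1 - s))
    f = lambda q: 1.0 / (1.0 + R0 * (1.0 - q))
    q = 0.0                   # iterating from 0 converges to the smallest fixed point
    while True:
        q_new = f(q)
        if abs(q_new - q) < tol:
            return q_new
        q = q_new

print(extinction_probability(2.0))  # 0.5; for this law the extinction probability is 1/R0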
Deterministic strain-induced arrays of quantum emitters in a two-dimensional semiconductor
Branny, Artur; Kumar, Santosh; Proux, Raphaël; Gerardot, Brian D
2017-01-01
An outstanding challenge in quantum photonics is scalability, which requires positioning of single quantum emitters in a deterministic fashion. Site positioning progress has been made in established platforms including defects in diamond and self-assembled quantum dots, albeit often with compromised coherence and optical quality. The emergence of single quantum emitters in layered transition metal dichalcogenide semiconductors offers new opportunities to construct a scalable quantum architecture. Here, using nanoscale strain engineering, we deterministically achieve a two-dimensional lattice of quantum emitters in an atomically thin semiconductor. We create point-like strain perturbations in mono- and bi-layer WSe2 which locally modify the band-gap, leading to efficient funnelling of excitons towards isolated strain-tuned quantum emitters that exhibit high-purity single photon emission. We achieve near unity emitter creation probability and a mean positioning accuracy of 120±32 nm, which may be improved with further optimization of the nanopillar dimensions. PMID:28530219
Probabilistic track coverage in cooperative sensor networks.
Ferrari, Silvia; Zhang, Guoxian; Wettergren, Thomas A
2010-12-01
The quality of service of a network performing cooperative track detection is represented by the probability of obtaining multiple elementary detections over time along a target track. Recently, two different lines of research, namely, distributed-search theory and geometric transversals, have been used in the literature for deriving the probability of track detection as a function of random and deterministic sensors' positions, respectively. In this paper, we prove that these two approaches are equivalent under the same problem formulation. Also, we present a new performance function that is derived by extending the geometric-transversal approach to the case of random sensors' positions using Poisson flats. As a result, a unified approach for addressing track detection in both deterministic and probabilistic sensor networks is obtained. The new performance function is validated through numerical simulations and is shown to bring about considerable computational savings for both deterministic and probabilistic sensor networks.
Magnified gradient function with deterministic weight modification in adaptive learning.
Ng, Sin-Chun; Cheung, Chi-Chung; Leung, Shu-Hung
2004-11-01
This paper presents two novel approaches, backpropagation (BP) with magnified gradient function (MGFPROP) and deterministic weight modification (DWM), to speed up the convergence rate and improve the global convergence capability of the standard BP learning algorithm. The purpose of MGFPROP is to increase the convergence rate by magnifying the gradient function of the activation function, while the main objective of DWM is to reduce the system error by changing the weights of a multilayered feedforward neural network in a deterministic way. Simulation results show that the performance of the above two approaches is better than BP and other modified BP algorithms for a number of learning problems. Moreover, the integration of the above two approaches, forming a new algorithm called MDPROP, can further improve the performance of MGFPROP and DWM. From our simulation results, the MDPROP algorithm always outperforms BP and other modified BP algorithms in terms of convergence rate and global convergence capability.
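The magnification idea can be sketched for a sigmoid output unit: the standard BP error signal carries the derivative factor s(1-s), which vanishes as the unit saturates; raising that factor to a power 0 < p < 1 enlarges small gradients. This exponent form is an illustrative stand-in, not necessarily the exact magnified gradient function of MGFPROP.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def delta_output(target, s, p=1.0):
    # Output-layer error signal; p = 1 recovers standard backpropagation
    return (target - s) * (s * (1.0 - s))**p

s = sigmoid(4.0)  # a near-saturated unit with a tiny standard gradient
print(delta_output(1.0, s, p=1.0), delta_output(1.0, s, p=0.5))  # the second is much larger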
Deterministic blade row interactions in a centrifugal compressor stage
NASA Technical Reports Server (NTRS)
Kirtley, K. R.; Beach, T. A.
1991-01-01
The three-dimensional viscous flow in a low speed centrifugal compressor stage is simulated using an average passage Navier-Stokes analysis. The impeller discharge flow is of the jet/wake type with low momentum fluid in the shroud-pressure side corner coincident with the tip leakage vortex. This nonuniformity introduces periodic unsteadiness in the vane frame of reference. The effect of such deterministic unsteadiness on the time-mean is included in the analysis through the average passage stress, which allows the analysis of blade row interactions. The magnitude of the divergence of the deterministic unsteady stress is of the order of the divergence of the Reynolds stress over most of the span, from the impeller trailing edge to the vane throat. Although the potential effects on the blade trailing edge from the diffuser vane are small, strong secondary flows generated by the impeller degrade the performance of the diffuser vanes.
Daniell method for power spectral density estimation in atomic force microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labuda, Aleksander
An alternative method for power spectral density (PSD) estimation—the Daniell method—is revisited and compared to the most prevalent method used in the field of atomic force microscopy for quantifying cantilever thermal motion—the Bartlett method. Both methods are shown to underestimate the Q factor of a simple harmonic oscillator (SHO) by a predictable, and therefore correctable, amount in the absence of spurious deterministic noise sources. However, the Bartlett method is much more prone to spectral leakage, which can obscure the thermal spectrum in the presence of deterministic noise. By significantly reducing spectral leakage, the Daniell method leads to a more accurate representation of the true PSD and enables clear identification and rejection of deterministic noise peaks. This benefit is especially valuable for the development of automated PSD fitting algorithms for robust and accurate estimation of SHO parameters from a thermal spectrum.
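The contrast between the two estimators can be sketched with standard SciPy tools: Bartlett averaging is Welch's method with a boxcar window and no overlap, while the Daniell method smooths a single full-length periodogram with a short moving average in frequency. The data, segment length, and smoothing width below are assumed.

import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs, n = 1e4, 2**14
x = rng.normal(size=n)                 # white-noise stand-in for thermal motion data

# Bartlett: average of non-overlapping, unwindowed segment periodograms
f_b, p_b = signal.welch(x, fs=fs, window='boxcar', nperseg=n // 16, noverlap=0)

# Daniell: moving-average smoothing of one full-length periodogram
f_d, p_raw = signal.periodogram(x, fs=fs)
m = 9                                  # odd smoothing width
p_d = np.convolve(p_raw, np.ones(m) / m, mode='same')

print(p_b.mean(), p_d.mean())          # both sit near the flat white-noise level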
Large conditional single-photon cross-phase modulation
Hosseini, Mahdi; Duan, Yiheng; Vuletić, Vladan
2016-01-01
Deterministic optical quantum logic requires a nonlinear quantum process that alters the phase of a quantum optical state by π through interaction with only one photon. Here, we demonstrate a large conditional cross-phase modulation between a signal field, stored inside an atomic quantum memory, and a control photon that traverses a high-finesse optical cavity containing the atomic memory. This approach avoids fundamental limitations associated with multimode effects for traveling optical photons. We measure a conditional cross-phase shift of π/6 (and up to π/3 by postselection on photons that remain in the system longer than average) between the retrieved signal and control photons, and confirm deterministic entanglement between the signal and control modes by extracting a positive concurrence. By upgrading to a state-of-the-art cavity, our system can reach a coherent phase shift of π at low loss, enabling deterministic and universal photonic quantum logic. PMID:27519798
Detecting and disentangling nonlinear structure from solar flux time series
NASA Technical Reports Server (NTRS)
Ashrafi, S.; Roszman, L.
1992-01-01
Interest in solar activity has grown in the past two decades for many reasons. Most importantly for flight dynamics, solar activity changes the atmospheric density, which has important implications for spacecraft trajectory and lifetime prediction. Building upon the previously developed Rayleigh-Benard nonlinear dynamic solar model, which exhibits many dynamic behaviors observed in the Sun, this work introduces new chaotic solar forecasting techniques. Our attempt to use recently developed nonlinear chaotic techniques to model and forecast solar activity has uncovered highly entangled dynamics. Numerical techniques are presented for decoupling additive and multiplicative white noise from deterministic dynamics and for examining the falloff of the power spectrum at high frequencies as a possible means of distinguishing deterministic chaos from noise, whether spectrally white or colored. The power spectral techniques presented are less cumbersome than current methods for identifying deterministic chaos, which require more computationally intensive calculations, such as those involving Lyapunov exponents and attractor dimension.
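The spectral-falloff diagnostic is easy to demonstrate on synthetic data. Below is a hedged illustration using a Lorenz system as a generic chaotic stand-in (not the paper's solar model): continuous-time chaos tends to show a roughly exponential high-frequency falloff, so the slope of log-power versus frequency in the top octave is strongly negative, while for white noise it is near zero.

```python
# Hedged sketch: distinguish deterministic chaos from noise by the
# high-frequency falloff of the power spectrum. All parameters illustrative.
import numpy as np

def lorenz_series(n=2 ** 14, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    def f(s):
        x, y, z = s
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])
    s, out = np.array([1.0, 1.0, 1.0]), np.empty(n)
    for i in range(n):  # fourth-order Runge-Kutta integration
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        out[i] = s[0]
    return out

def highfreq_slope(x, dt):
    # Least-squares slope of log-power vs frequency over the top octave;
    # strongly negative for exponential (chaotic) falloff, near zero for noise.
    p = np.abs(np.fft.rfft(x - x.mean())) ** 2
    f = np.fft.rfftfreq(len(x), dt)
    sel = f > f[-1] / 2
    return np.polyfit(f[sel], np.log(p[sel] + 1e-300), 1)[0]

chaos = lorenz_series()
noise = np.random.default_rng(2).normal(size=chaos.size)
print("chaos semilog slope:", highfreq_slope(chaos, 0.01))  # strongly negative
print("noise semilog slope:", highfreq_slope(noise, 0.01))  # near zero
```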
Berryman, Donna R
2012-01-01
Augmented reality is a technology that overlays digital information on objects or places in the real world for the purpose of enhancing the user experience. It is not virtual reality, that is, the technology that creates a totally digital or computer-created environment. Augmented reality, with its ability to combine reality and digital information, is being studied and implemented in medicine, marketing, museums, fashion, and numerous other areas. This article presents an overview of augmented reality, discussing what it is, how it works, its current implementations, and its potential impact on libraries.
The Next Wave: Humans, Computers, and Redefining Reality
NASA Technical Reports Server (NTRS)
Little, William
2018-01-01
The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface (it is) a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.
ERIC Educational Resources Information Center
Squires, David R.
2017-01-01
The structure of the literature review features the current trajectory of Augmented Reality in the field including the current literature detailing how Augmented Reality has been applied in educational environments; how Augmented Reality has been applied in training environments; how Augmented Reality has been used to measure cognition and the…
The development of children’s concepts of invisibility
Woolley, Jacqueline D.; McInnis, Melissa A.
2015-01-01
One of the most striking examples of appearance-reality discrepancy is invisibility – when something has no appearance yet still exists. The issue of invisibility sits at the juncture of two foundational ontological distinctions, that between appearance and reality and that between reality and non-reality. We probed the invisibility concepts of 47 3- to 7-year-olds using two sets of tasks: (1) an entity task, in which children were queried about the visibility and reality status of a variety of both visible and invisible entities, and (2) two standard appearance-reality tasks. Results showed that children’s concepts of visibility and reality status are intertwined, and that an understanding that some entities are impossible to see develops between the ages of 3 and 7. PMID:25937704
Summary of the 1st International Workshop on Networked Reality in Telecommunication
NASA Astrophysics Data System (ADS)
Davis, T.
1994-05-01
Abstracts of workshop papers are presented. Networked reality refers to the array of technologies and services involved in collecting a representation of reality at one location and using it to reconstruct an artificial representation of that reality at a remote location. The term encompasses transmission of the required information between the sites, and also includes the psychological, cultural, and legal implications of introducing the derived communication systems. Networked reality is clearly derived from the emerging virtual reality technology base but is intended to go beyond the latter to include its integration with the required telecommunication technologies. A noteworthy feature of the Networked Reality '94 technical program is the extent of emphasis on social (particularly medical) impacts of the technology.
Drawing Realities: The Themes of Children's Story Drawings.
ERIC Educational Resources Information Center
Wilson, Brent; Wilson, Marjorie
1979-01-01
Drawing on the Kreitlers' work with the psychology of adult artists, the authors show how children's story drawings develop the same four types of realities: origins, everyday experiences, normative realities (rules), and prophetic (anticipatory) realities. Illustrations are included. (SJL)
Virtual Reality and Its Potential Application in Education and Training.
ERIC Educational Resources Information Center
Milheim, William D.
1995-01-01
An overview is provided of current trends in virtual reality research and development, including discussion of hardware, types of virtual reality, and potential problems with virtual reality. Implications for education and training are explored. (Author/JKP)
NASA Astrophysics Data System (ADS)
Contreras, Arturo Javier
This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that the presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.
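The fluid-substitution step referred to above rests on Gassmann's relation, which predicts how the saturated bulk modulus (and hence P-velocity and density) responds to a change of pore fluid. Below is a minimal sketch with illustrative, made-up rock and fluid properties, not values from the study area:

```python
# Hedged sketch of Gassmann fluid substitution; all inputs are invented.
import math

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from dry-rock (k_dry), mineral (k_min) and
    fluid (k_fl) bulk moduli and porosity phi (Gassmann, 1951)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

def vp(k_sat, mu, rho):
    # P-wave velocity; the shear modulus mu is unaffected by the pore fluid.
    return math.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

# Brine-filled vs gas-filled sand (moduli in Pa, densities in kg/m3).
phi, k_dry, k_min, mu, rho_grain = 0.25, 12.0e9, 36.0e9, 9.0e9, 2650.0
for name, k_fl, rho_fl in [("brine", 2.8e9, 1050.0), ("gas", 0.05e9, 200.0)]:
    rho = (1 - phi) * rho_grain + phi * rho_fl
    k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)
    print(f"{name:5s}: Vp = {vp(k_sat, mu, rho):.0f} m/s, rho = {rho:.0f} kg/m3")
```

The drop in both Vp and density when brine is replaced by gas is what makes the top of such a sand a Class III AVA interface.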
A Virtual Reality-Based Simulation of Abdominal Surgery
1994-06-30
SHORT TITLE: A Virtual Reality-Based Simulation of Abdominal Surgery. REPORTING PERIOD: October 31, 1993 - June 30, 1994. TECHNICAL REPORT SUMMARY: Virtual reality is a marriage between… applications of this technology. Virtual reality systems can be used to teach surgical anatomy, diagnose surgical problems, plan operations, simulate and…
Augmented Reality: A Brand New Challenge for the Assessment and Treatment of Psychological Disorders
Chicchi Giglioli, Irene Alice; Pallavicini, Federica; Pedroli, Elisa; Serino, Silvia; Riva, Giuseppe
2015-01-01
Augmented Reality is a new technological system that allows virtual contents to be introduced into the real world so that they run within the same representation and, in real time, enhance the user's sensory perception of reality. From another point of view, Augmented Reality can be defined as a set of techniques and tools that add information to the physical reality. To date, Augmented Reality has been used in many fields, such as medicine, entertainment, maintenance, architecture, education, and cognitive and motor rehabilitation, but very few studies and applications of AR exist in clinical psychology. In the treatment of psychological disorders, Augmented Reality has shown preliminary evidence of being a useful tool, due to its interactivity and its adaptability to patient needs and therapeutic purposes. Another relevant factor is the quality of the user's experience in the Augmented Reality system, determined by emotional engagement and sense of presence. This experience could increase the ecological validity of AR in the treatment of psychological disorders. This paper reviews the recent studies on the use of Augmented Reality in the evaluation and treatment of psychological disorders, focusing on current uses of this technology and on the specific features that delineate Augmented Reality as a new technique useful for psychology. PMID:26339283
1990-08-01
PROFESSOR: Tatsuo Itoh
A deterministic formulation of the method of moments carried out in the spectral domain is extended to include the effects of two-dimensional, two-component current flow in planar transmission line discontinuities on open substrates. The method includes the effects of space…
Using Reputation Systems and Non-Deterministic Routing to Secure Wireless Sensor Networks
Moya, José M.; Vallejo, Juan Carlos; Fraga, David; Araujo, Álvaro; Villanueva, Daniel; de Goyeneche, Juan-Mariano
2009-01-01
Security in wireless sensor networks is difficult to achieve because of the resource limitations of the sensor nodes. We propose a trust-based decision framework for wireless sensor networks coupled with a non-deterministic routing protocol. Both provide a mechanism to effectively detect and confine common attacks, and, unlike previous approaches, allow bad reputation feedback to the network. This approach has been extensively simulated, obtaining good results, even for unrealistically complex attack scenarios. PMID:22412345
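The coupling of reputation with non-deterministic routing can be sketched very simply. The following is an illustrative toy, not the paper's protocol: each neighbour carries a trust score, and the next hop is drawn at random with probability proportional to trust, so traffic avoids low-reputation nodes without ever committing to one fixed, and therefore predictable, path.

```python
# Hedged sketch of reputation-weighted non-deterministic next-hop selection;
# node names, scores, and the update rule are invented for illustration.
import random

def pick_next_hop(neighbors):
    """neighbors maps node id -> reputation score in [0, 1]; a hop is drawn
    at random with probability proportional to its reputation."""
    nodes = list(neighbors)
    weights = [max(neighbors[n], 1e-6) for n in nodes]  # keep strictly positive
    return random.choices(nodes, weights=weights, k=1)[0]

def update_reputation(rep, delivered, alpha=0.1):
    # Bad-behaviour feedback: exponential moving average toward 1 on a
    # confirmed delivery and toward 0 on a suspected drop.
    return (1 - alpha) * rep + alpha * (1.0 if delivered else 0.0)

# Usage: node B has been caught dropping packets, so it is rarely chosen.
table = {"A": 0.9, "B": 0.1, "C": 0.7}
print(pick_next_hop(table))
table["B"] = update_reputation(table["B"], delivered=False)
```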
The threshold of a stochastic delayed SIR epidemic model with temporary immunity
NASA Astrophysics Data System (ADS)
Liu, Qun; Chen, Qingmei; Jiang, Daqing
2016-05-01
This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold is lowered by the white noise, so it is smaller than the basic reproduction number R0 of the deterministic system.
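To make the flavor of such a result concrete, here is a schematic, non-delayed version (our illustration, not the paper's exact model), assuming white noise of intensity σ perturbs the transmission term:
\[ dI = I\left[\beta S - (\mu+\gamma)\right]dt + \sigma S I\, dW(t), \qquad S_0 = \Lambda/\mu , \]
\[ \tilde{R}_0 \;=\; \underbrace{\frac{\beta\Lambda}{\mu(\mu+\gamma)}}_{R_0} \;-\; \frac{\sigma^2\Lambda^2}{2\mu^2(\mu+\gamma)} . \]
Extinction holds when \(\tilde{R}_0 < 1\) and persistence in the mean when \(\tilde{R}_0 > 1\); setting σ = 0 recovers the deterministic threshold R0, which is why the stochastic threshold is always the smaller of the two.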
Comment on: Supervisory Asymmetric Deterministic Secure Quantum Communication
NASA Astrophysics Data System (ADS)
Kao, Shih-Hung; Tsai, Chia-Wei; Hwang, Tzonelih
2012-12-01
In 2010, Xiu et al. (Optics Communications 284:2065-2069, 2011) proposed several applications based on a new secure four-site distribution scheme using χ-type entangled states. This paper points out that one of these applications, namely, supervisory asymmetric deterministic secure quantum communication, is subject to an information leakage problem, in which the receiver can extract two bits of a three-bit secret message without the supervisor's permission. An enhanced protocol is proposed to resolve this problem.
Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato
2014-01-01
The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been a landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of the respective alternatives and decide stochastically according to those probabilities. However, it has been unknown whether the matching law can be accounted for by a deterministic strategy. To answer this question, we propose several deterministic Bayesian decision making models that hold certain incorrect beliefs about the environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that a model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
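For reference, the strict matching law and its generalized (Baum) form, in which undermatching appears as a sensitivity exponent below one, read
\[ \frac{B_1}{B_1+B_2} = \frac{R_1}{R_1+R_2}, \qquad \log\frac{B_1}{B_2} = s\,\log\frac{R_1}{R_2} + \log b, \]
where the \(B_i\) are choice rates, the \(R_i\) the obtained reinforcement rates, \(b\) a bias term, and \(s < 1\) corresponds to the undermatching that the volatility-belief model exhibits.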
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karamooz, Saeed; Breeding, John Eric; Justice, T Alan
As MicroTCA expands into applications beyond the telecommunications industry from which it originated, it faces new challenges in the area of inter-blade communications. The ability to achieve deterministic, low-latency communications between blades is critical to realizing a scalable architecture. In the past, legacy bus architectures accomplished inter-blade communications using dedicated parallel buses across the backplane. Because of limited fabric resources on its backplane, MicroTCA uses the carrier hub (MCH) for this purpose. Unfortunately, MCH products from commercial vendors are limited to standard bus protocols such as PCI Express, Serial Rapid IO and 10/40GbE. While these protocols have exceptional throughput capability, they are neither deterministic nor necessarily low-latency. To overcome this limitation, an MCH has been developed based on the Xilinx Virtex-7 690T FPGA. This MCH provides the system architect/developer complete flexibility in both the interface protocol and routing of information between blades. In this paper, we present the application of this configurable MCH concept to the Machine Protection System under development for the Spallation Neutron Source's proton accelerator. Specifically, we demonstrate the use of the configurable MCH as a 12x4-lane crossbar switch using the Aurora protocol to achieve a deterministic, low-latency data link. In this configuration, the crossbar has an aggregate bandwidth of 48 GB/s.
The concerted calculation of the BN-600 reactor for the deterministic and stochastic codes
NASA Astrophysics Data System (ADS)
Bogdanova, E. V.; Kuznetsov, A. N.
2017-01-01
The solution of the problem of increasing the safety of nuclear power plants implies the existence of complete and reliable information about the processes occurring in the core of a working reactor. Nowadays the Monte Carlo method is the most general-purpose method used to calculate the neutron-physical characteristics of a reactor, but it requires long calculation times. It may therefore be useful to carry out coupled calculations with stochastic and deterministic codes. This article presents the results of research into the possibility of combining stochastic and deterministic algorithms in calculations of the BN-600 reactor. This is one part of a larger effort, carried out in the framework of a graduation project at the NRC “Kurchatov Institute” in cooperation with S. S. Gorodkov and M. A. Kalugin. The study considers a 2-D layer of the BN-600 reactor core from the international benchmark test published in the report IAEA-TECDOC-1623. Calculations of the reactor were performed with the MCU code and then with a standard operative diffusion algorithm using constants taken from the Monte Carlo computation. Macroscopic cross-sections, diffusion coefficients, the effective multiplication factor, and the distributions of neutron flux and power were obtained in 15 energy groups. Reasonable agreement between the stochastic and deterministic calculations of the BN-600 is observed.
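The coupling pattern described here (a Monte Carlo stage that homogenizes group constants, feeding a fast deterministic diffusion solve) can be illustrated with a toy one-group, 1-D version. All constants below are invented stand-ins; the actual study used 15 groups and the MCU code:

```python
# Hedged sketch: power iteration for k_eff in a 1-D one-group diffusion model
# whose constants stand in for Monte Carlo-homogenized group constants.
import numpy as np

# "Monte Carlo" stage stand-in: homogenized one-group constants (illustrative).
D, sig_a, nu_sig_f = 1.3, 0.032, 0.04   # cm, 1/cm, 1/cm
L, n = 170.0, 200                       # slab width (cm), interior mesh points
h = L / (n + 1)

# Deterministic stage: finite-difference operator for
# -D phi'' + sig_a phi = (1/k) nu_sig_f phi with zero-flux boundaries.
A = (np.diag(np.full(n, 2 * D / h**2 + sig_a))
     + np.diag(np.full(n - 1, -D / h**2), 1)
     + np.diag(np.full(n - 1, -D / h**2), -1))

phi, k = np.ones(n), 1.0
for _ in range(200):                    # power iteration on the fission source
    phi_new = np.linalg.solve(A, nu_sig_f * phi / k)
    k *= phi_new.sum() / phi.sum()      # update eigenvalue from source ratio
    phi = phi_new / np.linalg.norm(phi_new)

print(f"k_eff ≈ {k:.5f}")
```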
Impacts of Considering Climate Variability on Investment Decisions in Ethiopia
NASA Astrophysics Data System (ADS)
Strzepek, K.; Block, P.; Rosegrant, M.; Diao, X.
2005-12-01
In Ethiopia, climate extremes, inducing droughts or floods, are not unusual. Monitoring the effects of these extremes, and climate variability in general, is critical for economic prediction and assessment of the country's future welfare. The focus of this study involves adding climate variability to a deterministic, mean climate-driven agro-economic model, in an attempt to understand its effects and degree of influence on general economic prediction indicators for Ethiopia. Four simulations are examined, including a baseline simulation and three investment strategies: simulations of irrigation investment, roads investment, and a combination investment of both irrigation and roads. The deterministic model is transformed into a stochastic model by dynamically adding year-to-year climate variability through climate-yield factors. Nine sets of actual, historic, variable climate data are individually assembled and implemented into the 12-year stochastic model simulation, producing an ensemble of economic prediction indicators. This ensemble allows for a probabilistic approach to planning and policy making, allowing decision makers to consider risk. The economic indicators from the deterministic and stochastic approaches, including rates of return to investments, are significantly different. The predictions of the deterministic model appreciably overestimate the future welfare of Ethiopia; the predictions of the stochastic model, utilizing actual climate data, tend to give a better semblance of what may be expected. Inclusion of climate variability is vital for proper analysis of the predictor values from this agro-economic model.
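The stochastic transformation described above amounts to re-running a deterministic projection under historical climate-yield factors, turning a single point forecast into an ensemble. A minimal sketch follows; the growth rate and yield multipliers are invented (drawn below one on average by construction, so the ensemble sits under the deterministic run), not the study's data:

```python
# Hedged sketch: ensemble of 12-year projections under climate-yield factors.
import numpy as np

rng = np.random.default_rng(3)
years, base_gdp, trend = 12, 100.0, 1.04            # deterministic 4 %/yr growth

# Nine "historical" sequences of climate-yield multipliers (stand-ins for the
# nine observed climate records used in the study).
histories = rng.uniform(0.85, 1.10, size=(9, years))

def simulate(yield_factors):
    gdp = base_gdp
    for f in yield_factors:
        gdp *= trend * f                            # climate scales each year's growth
    return gdp

deterministic = simulate(np.ones(years))            # mean-climate run
ensemble = np.array([simulate(h) for h in histories])

print(f"deterministic year-12 value: {deterministic:.1f}")
print(f"ensemble mean / 5th / 95th:  {ensemble.mean():.1f} /"
      f" {np.percentile(ensemble, 5):.1f} / {np.percentile(ensemble, 95):.1f}")
```

The spread of the ensemble, rather than a single number, is what allows the probabilistic, risk-aware planning the abstract describes.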
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that not only the ensemble mean and variance of the harmonic system response can be computed, but also of the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are extensively validated against detailed Monte Carlo analyses of coupled plate systems in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
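For reference, the single-frequency form of that reciprocity relationship (in the notation of Shorter and Langley's original formulation, which the paper generalizes across frequencies) is
\[ S_{ff}(\omega) \;=\; \frac{4E}{\pi\,\omega\, n(\omega)}\;\operatorname{Im}\{D_{\mathrm{dir}}(\omega)\}, \]
where \(S_{ff}\) is the cross-spectral matrix of the blocked reverberant forces at the coupling degrees of freedom, \(E\) the vibrational energy of the diffuse component, \(n(\omega)\) its modal density, and \(D_{\mathrm{dir}}(\omega)\) the dynamic stiffness matrix of the corresponding direct field.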
A random walk on water (Henry Darcy Medal Lecture)
NASA Astrophysics Data System (ADS)
Koutsoyiannis, D.
2009-04-01
Randomness and uncertainty had been well appreciated in hydrology and water resources engineering in their initial steps as scientific disciplines. However, this changed through the years and, following other geosciences, hydrology adopted a naïve view of randomness in natural processes. Such a view separates natural phenomena into two mutually exclusive types, random or stochastic, and deterministic. When a classification of a specific process into one of these two types fails, then a separation of the process into two different, usually additive, parts is typically devised, each of which may be further subdivided into subparts (e.g., deterministic subparts such as periodic and aperiodic or trends). This dichotomous logic is typically combined with a manichean perception, in which the deterministic part supposedly represents cause-effect relationships and thus is physics and science (the "good"), whereas randomness has little relationship with science and no relationship with understanding (the "evil"). Probability theory and statistics, which traditionally provided the tools for dealing with randomness and uncertainty, have been regarded by some as the "necessary evil" but not as an essential part of hydrology and geophysics. Some took a step further to banish them from hydrology, replacing them with deterministic sensitivity analysis and fuzzy-logic representations. Others attempted to demonstrate that irregular fluctuations observed in natural processes are au fond manifestations of underlying chaotic deterministic dynamics with low dimensionality, thus attempting to render probabilistic descriptions unnecessary. Some of the above recent developments are simply flawed because they make erroneous use of probability and statistics (which, remarkably, provide the tools for such analyses), whereas the entire underlying logic is just a false dichotomy. To see this, it suffices to recall that Pierre Simon Laplace, perhaps the most famous proponent of determinism in the history of philosophy of science (cf. Laplace's demon), is, at the same time, one of the founders of probability theory, which he regarded as "nothing but common sense reduced to calculation". This harmonizes with James Clerk Maxwell's view that "the true logic for this world is the calculus of Probabilities" and was more recently and epigrammatically formulated in the title of Edwin Thompson Jaynes's book "Probability Theory: The Logic of Science" (2003). Abandoning dichotomous logic, either on ontological or epistemic grounds, we can identify randomness or stochasticity with unpredictability. Admitting that (a) uncertainty is an intrinsic property of nature; (b) causality implies dependence of natural processes in time and thus suggests predictability; but, (c) even the tiniest uncertainty (e.g., in initial conditions) may result in unpredictability after a certain time horizon, we may shape a stochastic representation of natural processes that is consistent with Karl Popper's indeterministic world view. In this representation, probability quantifies uncertainty according to the Kolmogorov system, in which probability is a normalized measure, i.e., a function that maps sets (areas where the initial conditions or the parameter values lie) to real numbers (in the interval [0, 1]). 
In such a representation, predictability (suggested by deterministic laws) and unpredictability (randomness) coexist, are not separable or additive components, and it is a matter of specifying the time horizon of prediction to decide which of the two dominates. An elementary numerical example has been devised to illustrate the above ideas and demonstrate that they offer a pragmatic and useful guide for practice, rather than just pertaining to philosophical discussions. A chaotic model, with fully and a priori known deterministic dynamics and deterministic inputs (without any random agent), is assumed to represent the hydrological balance in an area partly covered by vegetation. Experimentation with this toy model demonstrates, inter alia, that: (1) for short time horizons the deterministic dynamics is able to give good predictions; but (2) these predictions become extremely inaccurate and useless for long time horizons; (3) for such horizons a naïve statistical prediction (average of past data) which fully neglects the deterministic dynamics is more skilful; and (4) if this statistical prediction, in addition to past data, is combined with the probability theory (the principle of maximum entropy, in particular), it can provide a more informative prediction. Also, the toy model shows that the trajectories of the system state (and derivative properties thereof) do not resemble a regular (e.g., periodic) deterministic process nor a purely random process, but exhibit patterns indicating anti-persistence and persistence (where the latter statistically complies with a Hurst-Kolmogorov behaviour). If the process is averaged over long time scales, the anti-persistent behaviour improves predictability, whereas the persistent behaviour substantially deteriorates it. A stochastic representation of this deterministic system, which incorporates dynamics, is not only possible, but also powerful as it provides good predictions for both short and long horizons and helps to decide on when the deterministic dynamics should be considered or neglected. Obviously, a natural system is extremely more complex than this simple toy model and hence unpredictability is naturally even more prominent in the former. In addition, in a complex natural system, we can never know the exact dynamics and we must infer it from past data, which implies additional uncertainty and an additional role of stochastics in the process of formulating the system equations and estimating the involved parameters. Data also offer the only solid grounds to test any hypothesis about the dynamics, and failure of performing such testing against evidence from data renders the hypothesised dynamics worthless. If this perception of natural phenomena is adequately plausible, then it may help in studying interesting fundamental questions regarding the current state and the trends of hydrological and water resources research and their promising future paths. For instance: (i) Will it ever be possible to achieve a fully "physically based" modelling of hydrological systems that will not depend on data or stochastic representations? (ii) To what extent can hydrological uncertainty be reduced and what are the effective means for such reduction? (iii) Are current stochastic methods in hydrology consistent with observed natural behaviours? What paths should we explore for their advancement? (iv) Can deterministic methods provide solid scientific grounds for water resources engineering and management? 
In particular, can there be risk-free hydraulic engineering and water management? (v) Is the current (particularly important) interface between hydrology and climate satisfactory? In particular, should hydrology rely on climate models that are not properly validated (i.e., for periods and scales not used in calibration)? In effect, is the evolution of climate and its impacts on water resources deterministically predictable?
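The behavior of the toy chaotic model described above, where deterministic dynamics is skilful at short horizons and the naive statistical mean wins at long ones, can be reproduced with any chaotic map. A hedged stand-in (a logistic map, not the lecture's water-balance model) follows:

```python
# Hedged sketch: forecast skill vs horizon for a chaotic system.
# A "deterministic" forecast from a slightly perturbed initial condition is
# compared with the naive "statistical" forecast (the long-run mean).
import numpy as np

def logistic(x0, n, r=4.0):
    x = np.empty(n); x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

horizons, trials = 25, 2000
rng = np.random.default_rng(4)
err_det = np.zeros(horizons); err_clim = np.zeros(horizons)
clim_mean = logistic(0.3, 100000).mean()            # "climatological" average

for _ in range(trials):
    x0 = rng.uniform(0.05, 0.95)
    truth = logistic(x0, horizons)
    forecast = logistic(x0 + 1e-6 * rng.normal(), horizons)  # tiny IC error
    err_det += (forecast - truth) ** 2
    err_clim += (clim_mean - truth) ** 2

rmse_det = np.sqrt(err_det / trials); rmse_clim = np.sqrt(err_clim / trials)
crossover = int(np.argmax(rmse_det > rmse_clim))
print(f"deterministic forecast beats the mean up to ~{crossover} steps")
```

Past the crossover horizon the deterministic forecast is worse than the climatological mean, which is precisely the point at which a stochastic representation becomes the more informative description.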
Fogel, Joshua; Shlivko, Alexander
2016-01-02
Reality television watching and social media use are popular activities, and reality television can include mention of illegal drug use and prescription drug misuse. The objective was to determine whether reality television watching and use of the social media platform Twitter are associated with either illegal drug use or prescription drug misuse. A survey of 576 college students was conducted in 2011. Independent variables included watching reality television (social cognitive theory), parasocial interaction (parasocial interaction theory), television hours watched (cultivation theory), following a reality television character on Twitter, and demographics. Outcome variables were illegal drug use and prescription drug misuse. Watching reality television and identifying with reality TV program characters were each associated with greater odds of illegal drug use. Following a reality TV character on Twitter was associated with greater odds of illegal drug use and, in one analytical model, of prescription drug misuse. No support was seen for cultivation theory. Those born in the United States had greater odds of illegal drug use and prescription drug misuse. Women and Asians had lower odds of illegal drug use. African Americans and Asians had lower odds of prescription drug misuse. Physicians, psychologists, and other healthcare practitioners may find it useful to include questions about reality television watching and Twitter use in their clinical interviews. Physician and psychology groups, public health practitioners, and government health agencies should consider discussing with television broadcasting companies the potential negative impact of including content with illegal drugs and prescription drug misuse on reality television programs.
Learning Rationales and Virtual Reality Technology in Education.
ERIC Educational Resources Information Center
Chiou, Guey-Fa
1995-01-01
Defines and describes virtual reality technology and differentiates between virtual learning environments, learning materials, and learning tools. Links learning rationales to virtual reality technology to pave conceptual foundations for the application of virtual reality technology in education. Constructivism, case-based learning, problem-based learning,…
Virtual Reality and the Virtual Library.
ERIC Educational Resources Information Center
Oppenheim, Charles
1993-01-01
Explains virtual reality, including proper and improper uses of the term, and suggests ways that libraries might be affected by it. Highlights include elements of virtual reality systems; possible virtual reality applications, including architecture, the chemical industry, transport planning, armed forces, and entertainment; and the virtual…
Rhetoric as Reality Construction.
ERIC Educational Resources Information Center
Kneupper, Charles W.
This essay provides an analytic development of a philosophy of rhetoric which focuses its concern on social reality. According to this philosophy, the activity of the human mind invents symbolic constructions of reality. Primary socialization is interpreted as a rhetorical process which tends to maintain prevailing reality constructions.…
Virtual Reality in the Classroom.
ERIC Educational Resources Information Center
Pantelidis, Veronica S.
1993-01-01
Considers the concept of virtual reality; reviews its history; describes general uses of virtual reality, including entertainment, medicine, and design applications; discusses classroom uses of virtual reality, including a software program called Virtus WalkThrough for use with a computer monitor; and suggests future possibilities. (34 references)…