Huang, Wei Tao; Luo, Hong Qun; Li, Nian Bing
2014-05-06
The most serious, and yet unsolved, problem of constructing molecular computing devices consists in connecting all of these molecular events into a usable device. This report demonstrates the use of a Boolean logic tree for analyzing the chemical event network based on graphene, organic dye, thrombin aptamer, and the Fenton reaction, organizing and connecting these basic chemical events. This chemical event network can be utilized to implement fluorescent combinatorial logic (including basic logic gates and complex integrated logic circuits) and fuzzy logic computing. On the basis of the Boolean logic tree analysis and logic computing, these basic chemical events can be considered programmable "words" and chemical interactions "syntax" logic rules with which to construct a molecular search engine for performing intelligent molecular search queries. Our approach is helpful in developing advanced molecule-based logic programs for application in biosensing, nanotechnology, and drug delivery.
MIRAP, microcomputer reliability analysis program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jehee, J.N.T.
1989-01-01
A program for a microcomputer is outlined that can determine minimal cut sets from a specified fault tree logic. The speed and memory limitations of the microcomputers on which the program is implemented (Atari ST and IBM) are addressed by reducing the fault tree's size and by storing the cut set data on disk. Extensive, well-proven fault tree restructuring techniques, such as the identification of sibling events and of independent gate events, reduce the fault tree's size but do not alter its logic. New methods are used for the Boolean reduction of the fault tree logic. Special criteria for combining events in the 'AND' and 'OR' logic avoid the creation of many subsuming cut sets which would all cancel out due to existing cut sets. Figures and tables illustrate these methods. 4 refs., 5 tabs.
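The cut-set determination and subsumption removal the abstract describes can be sketched as a top-down (MOCUS-style) expansion with absorption; the gate encoding, tree, and event names below are hypothetical illustrations, not MIRAP's actual data structures or algorithm:

```python
from itertools import product

def minimal_cut_sets(gates, top):
    """Gates map a name to ("AND"|"OR", [children]); leaves are basic events."""
    def expand(node):
        if node not in gates:                 # basic event: singleton cut set
            return [frozenset([node])]
        kind, children = gates[node]
        child_sets = [expand(c) for c in children]
        if kind == "OR":                      # union of the children's cut sets
            result = [cs for sets in child_sets for cs in sets]
        else:                                 # AND: cross-product merge
            result = [frozenset().union(*combo) for combo in product(*child_sets)]
        # absorption: drop any cut set that subsumes a smaller existing one
        result.sort(key=len)
        minimal = []
        for cs in result:
            if not any(m <= cs for m in minimal):
                minimal.append(cs)
        return minimal
    return expand(top)

# hypothetical tree: TOP = A OR (A AND B); {A, B} is absorbed by {A}
gates = {"TOP": ("OR", ["G1", "A"]), "G1": ("AND", ["A", "B"])}
cuts = minimal_cut_sets(gates, "TOP")
```

The absorption step mirrors the abstract's point that subsuming cut sets "cancel out due to existing cut sets" and can be pruned early.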
The FTA Method And A Possibility Of Its Application In The Area Of Road Freight Transport
NASA Astrophysics Data System (ADS)
Poliaková, Adela
2015-06-01
The Fault Tree process utilizes logic diagrams to portray and analyse potentially hazardous events. Three basic symbols (logic gates) are adequate for diagramming any fault tree. However, additional recently developed symbols can be used to reduce the time and effort required for analysis. A fault tree is a graphical representation of the relationship between certain specific events and the ultimate undesired event (2). This paper gives a basic description of the Fault Tree Analysis method and provides a practical view of its possible application to quality improvement in a road freight transport company.
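A fault tree built from the basic symbols can be evaluated mechanically; the sketch below (tree and event names are hypothetical, loosely in the road-transport spirit of the paper) shows how AND/OR gates combine basic-event states into the top undesired event:

```python
def evaluate(node, states, gates):
    """True means the event occurs; gates map name -> ("AND"|"OR", children)."""
    if node in states:                        # basic event
        return states[node]
    kind, children = gates[node]
    vals = [evaluate(c, states, gates) for c in children]
    return all(vals) if kind == "AND" else any(vals)

# hypothetical: accident requires brake failure AND (inattention OR icy road)
gates = {
    "TOP": ("AND", ["brake_failure", "G1"]),
    "G1": ("OR", ["driver_inattentive", "icy_road"]),
}
states = {"brake_failure": True, "driver_inattentive": False, "icy_road": True}
top = evaluate("TOP", states, gates)
```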
ERIC Educational Resources Information Center
Barker, Bruce O.; Petersen, Paul D.
This paper explores the fault-tree analysis approach to isolating failure modes within a system. Fault tree analysis investigates potentially undesirable events and then looks for failures in sequence that would lead to their occurrence. Relationships among these events are symbolized by AND or OR logic gates, AND being used when single events must coexist to…
Dynamic Event Tree advancements and control logic improvements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego
The RAVEN code has been under development at the Idaho National Laboratory since 2012. Its main goal is to create a multi-purpose platform for deploying all the capabilities needed for Probabilistic Risk Assessment, uncertainty quantification, data mining analysis, and optimization studies. RAVEN is currently equipped with three different sampling categories: Forward samplers (Monte Carlo, Latin Hypercube, Stratified, Grid Sampler, Factorials, etc.), Adaptive samplers (Limit Surface search, Adaptive Polynomial Chaos, etc.), and Dynamic Event Tree (DET) samplers (Deterministic and Adaptive Dynamic Event Trees). The main subject of this document is to report the activities that have been carried out in order to start the migration of the RAVEN/RELAP-7 control logic system into MOOSE and to develop advanced dynamic sampling capabilities based on the Dynamic Event Tree approach. In order to provide all MOOSE-based applications with a control logic capability, an initial migration activity was initiated this fiscal year, moving the control logic system designed for RELAP-7 by the RAVEN team into the MOOSE framework. In this document, a brief explanation of what has been done is reported. The second and most important subject of this report is the development of a Dynamic Event Tree (DET) sampler named "Hybrid Dynamic Event Tree" (HDET) and its adaptive variant "Adaptive Hybrid Dynamic Event Tree" (AHDET). As other authors have already reported, among the different types of uncertainties, it is possible to discern two principal types: aleatory and epistemic uncertainties. The classical Dynamic Event Tree is in charge of treating the first class (aleatory) of uncertainties; the dependence of the probabilistic risk assessment and analysis on the epistemic uncertainties is treated by an initial Monte Carlo sampling (MCDET). From each Monte Carlo sample, a DET analysis is run (in total, N trees).
The Monte Carlo method employs a pre-sampling of the input space characterized by epistemic uncertainties. The consequent Dynamic Event Tree performs the exploration of the aleatory space. In the RAVEN code, a more general approach has been developed, not limiting the exploration of the epistemic space to a Monte Carlo method but allowing all the forward sampling strategies RAVEN currently employs. The user can combine Latin Hypercube, Grid, Stratified, and Monte Carlo sampling in order to explore the epistemic space without any limitation. From this pre-sampling, the Dynamic Event Tree sampler starts its aleatory space exploration. As reported by the authors, the Dynamic Event Tree is a good fit for developing a goal-oriented sampling strategy. The DET is used to drive a Limit Surface search. The methodology that the authors developed last year performs a Limit Surface search in the aleatory space only. This report documents how this approach has been extended in order to consider the epistemic space interacting with the Hybrid Dynamic Event Tree methodology.
Master Logic Diagram: method for hazard and initiating event identification in process plants.
Papazoglou, I A; Aneziris, O N
2003-02-28
Master Logic Diagram (MLD), a method for identifying events initiating accidents in chemical installations, is presented. MLD is a logic diagram that resembles a fault tree but without the formal mathematical properties of the latter. MLD starts with a Top Event "Loss of Containment" and decomposes it into simpler contributing events. A generic MLD has been developed which may be applied to all chemical installations storing toxic and/or flammable substances. The method is exemplified through its application to an ammonia storage facility.
Evidential Networks for Fault Tree Analysis with Imprecise Knowledge
NASA Astrophysics Data System (ADS)
Yang, Jianping; Huang, Hong-Zhong; Liu, Yu; Li, Yan-Feng
2012-06-01
Fault tree analysis (FTA), as one of the powerful tools in reliability engineering, has been widely used to enhance system quality attributes. In most fault tree analyses, precise values are adopted to represent the probabilities of occurrence of events. Due to the lack of sufficient data or the imprecision of existing data at the early stage of product design, it is often difficult to accurately estimate the failure rates of individual events or the probabilities of occurrence of the events. Therefore, such imprecision and uncertainty need to be taken into account in reliability analysis. In this paper, evidential networks (EN) are employed to quantify and propagate the aforementioned uncertainty and imprecision in fault tree analysis. The detailed processes of converting some fault tree (FT) logic gates to EN are described. The figures of the logic gates and the converted equivalent EN, together with the associated truth tables and the conditional belief mass tables, are also presented in this work. A new epistemic importance measure is proposed to describe the effect of the degree of ignorance about an event. The fault tree of an aircraft engine damaged by oil filter plugs is presented to demonstrate the proposed method.
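One simplified way to picture the idea of propagating imprecision through gates (a sketch under strong assumptions, not the paper's evidential-network construction): carry a [belief, plausibility] interval per event, and assume independent basic events so the bounds combine multiplicatively:

```python
from math import prod

def and_gate(intervals):
    """[bel, pl] of an AND gate over independent events with interval-valued
    occurrence probabilities: both bounds multiply."""
    return (prod(b for b, _ in intervals),
            prod(p for _, p in intervals))

def or_gate(intervals):
    """[bel, pl] of an OR gate: complement of the product of complements,
    applied bound-wise."""
    return (1 - prod(1 - b for b, _ in intervals),
            1 - prod(1 - p for _, p in intervals))

# two events known only up to intervals [0.1, 0.2] and [0.3, 0.4]
top_and = and_gate([(0.1, 0.2), (0.3, 0.4)])
top_or = or_gate([(0.1, 0.2), (0.3, 0.4)])
```

A precise fault tree is recovered as the special case where belief equals plausibility for every basic event.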
Try Fault Tree Analysis, a Step-by-Step Way to Improve Organization Development.
ERIC Educational Resources Information Center
Spitzer, Dean
1980-01-01
Fault Tree Analysis, a systems safety engineering technology used to analyze organizational systems, is described. Explains the use of logic gates to represent the relationship between failure events, qualitative analysis, quantitative analysis, and effective use of Fault Tree Analysis. (CT)
Lu, Jiao Yang; Zhang, Xin Xing; Huang, Wei Tao; Zhu, Qiu Yan; Ding, Xue Zhi; Xia, Li Qiu; Luo, Hong Qun; Li, Nian Bing
2017-09-19
The most serious and yet unsolved problems of molecular logic computing consist in how to connect molecular events in complex systems into a usable device with specific functions and how to selectively control branchy logic processes within cascading logic systems. This report demonstrates that a Boolean logic tree can be utilized to organize and connect "plug and play" chemical events (DNA, nanomaterials, organic dye, biomolecule, and denaturant) for developing a dual-signal electrochemical evolution aptasensor system with good resettability for amplified detection of thrombin, controllable and selectable three-state logic computation, and keypad lock security operation. The aptasensor system combines the merits of a DNA-functionalized nanoamplification architecture and the simple dual-signal electroactive dye brilliant cresyl blue for sensitive and selective detection of thrombin, with a wide linear response range of 0.02-100 nM and a detection limit of 1.92 pM. By using the aforementioned chemical events as inputs and the differential pulse voltammetry current changes at different voltages as dual outputs, a resettable three-input biomolecular keypad lock based on sequential logic is established. Moreover, the first example of controllable and selectable three-state molecular logic computation with active-high and active-low logic functions can be implemented, allowing the output ports to assume a high-impedance, or nothing (Z), state in addition to the 0 and 1 logic levels and effectively controlling subsequent branchy logic computation processes. Our approach is helpful in developing advanced controllable and selectable logic computing and sensing systems in large-scale integrated circuits for application in biomedical engineering, intelligent sensing, and control.
NASA Astrophysics Data System (ADS)
Riyadi, Eko H.
2014-09-01
An initiating event is defined as any event, either internal or external to the nuclear power plant (NPP), that perturbs the steady state operation of the plant, if operating, thereby initiating an abnormal event such as a transient or a loss of coolant accident (LOCA) within the NPP. These initiating events trigger sequences of events that challenge plant control and safety systems, whose failure could potentially lead to core damage or a large early release. Selection of initiating events consists of two steps: first, definition of possible events, for example through a comprehensive engineering evaluation and the construction of a top-level logic model; second, grouping of identified initiating events by the safety function to be performed or by combinations of system responses. The purpose of this paper is to discuss initiating event identification in the event tree development process and to review other probabilistic safety assessments (PSA). The identification of initiating events also draws on past operating experience, review of other PSAs, failure mode and effect analysis (FMEA), feedback from system modeling, and the master logic diagram (a special type of fault tree). By studying the traditional US PSA categorization in detail, the important initiating events can be obtained and categorized into LOCAs, transients, and external events.
Fuzzy branching temporal logic.
Moon, Seong-ick; Lee, Kwang H; Lee, Doheon
2004-04-01
Intelligent systems require a systematic way to represent and handle temporal information containing uncertainty. In particular, a logical framework is needed that can represent uncertain temporal information and its relationships with logical formulae. Fuzzy linear temporal logic (FLTL), a generalization of propositional linear temporal logic (PLTL) with fuzzy temporal events and fuzzy temporal states defined on a linear time model, was previously proposed for this purpose. However, many systems are best represented by branching time models in which each state can have more than one possible future path. In this paper, fuzzy branching temporal logic (FBTL) is proposed to address this problem. FBTL adopts and generalizes concurrent tree logic (CTL*), which is a classical branching temporal logic. The temporal model of FBTL is capable of representing fuzzy temporal events and fuzzy temporal states, and the order relation among them is represented as a directed graph. The utility of FBTL is demonstrated using a fuzzy job shop scheduling problem as an example.
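Under the standard min/max fuzzy semantics (a simplification for illustration, not the paper's full FBTL formalism with branching time), the linear-time path operators reduce to maxima and minima over membership degrees along a path:

```python
def eventually(phi):
    """Fuzzy 'eventually': the best membership degree at any future state."""
    return max(phi)

def globally(phi):
    """Fuzzy 'globally': the worst membership degree along the path."""
    return min(phi)

def until(phi, psi):
    """Fuzzy 'phi until psi': the best position i where psi holds, with phi
    holding (in min-semantics) at every earlier position."""
    best = 0.0
    for i in range(len(psi)):
        prefix = min(phi[:i], default=1.0)
        best = max(best, min(prefix, psi[i]))
    return best
```

In a branching model such as FBTL's, each state has several outgoing paths, so path quantifiers would additionally take maxima (for "some path") or minima (for "all paths") over these path values.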
Logic flowgraph methodology - A tool for modeling embedded systems
NASA Technical Reports Server (NTRS)
Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.
1991-01-01
The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
A Fault Tree Approach to Analysis of Behavioral Systems: An Overview.
ERIC Educational Resources Information Center
Stephens, Kent G.
Developed at Brigham Young University, Fault Tree Analysis (FTA) is a technique for enhancing the probability of success in any system by analyzing the most likely modes of failure that could occur. It provides a logical, step-by-step description of possible failure events within a system and their interaction--the combinations of potential…
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitating the input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is necessary first to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; that is, the risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are: to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyse a simple system using FTA, and to weigh FTA's advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the Top event. The steps of this analysis are: examination of the system from top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the Top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the Top event. These results can be used for risk assessment analyses.
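The "probability of the Top event" step can be sketched for a small tree, assuming independent basic events (an AND gate multiplies probabilities; an OR gate takes the complement of the product of complements); the tree and the failure probabilities below are illustrative, not taken from the paper:

```python
from math import prod

def top_event_probability(node, probs, gates):
    """probs holds basic-event failure probabilities; gates map a gate name
    to ("AND"|"OR", [children])."""
    if node in probs:                         # basic event
        return probs[node]
    kind, children = gates[node]
    p = [top_event_probability(c, probs, gates) for c in children]
    return prod(p) if kind == "AND" else 1 - prod(1 - x for x in p)

# hypothetical tree: TOP = (A AND B) OR C
gates = {"TOP": ("OR", ["G1", "C"]), "G1": ("AND", ["A", "B"])}
probs = {"A": 0.1, "B": 0.2, "C": 0.05}
p_top = top_event_probability("TOP", probs, gates)  # 1 - (1 - 0.02)(1 - 0.05)
```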
Fault trees for decision making in systems analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lambert, Howard E.
1975-10-09
The application of fault tree analysis (FTA) to system safety and reliability is presented within the framework of system safety analysis. The concepts and techniques involved in manual and automated fault tree construction are described and their differences noted. The theory of mathematical reliability pertinent to FTA is presented with emphasis on engineering applications. An outline of the quantitative reliability techniques of the Reactor Safety Study is given. Concepts of probabilistic importance are presented within the fault tree framework and applied to the areas of system design, diagnosis and simulation. The computer code IMPORTANCE ranks basic events and cut sets according to a sensitivity analysis. A useful feature of the IMPORTANCE code is that it can accept relative failure data as input. The output of the IMPORTANCE code can assist an analyst in finding weaknesses in system design and operation, suggest the optimal course of system upgrade, and determine the optimal location of sensors within a system. A general simulation model of system failure in terms of fault tree logic is described. The model is intended for efficient diagnosis of the causes of system failure in the event of a system breakdown. It can also be used to assist an operator in making decisions under a time constraint regarding the future course of operations. The model is well suited for computer implementation. New results incorporated in the simulation model include an algorithm to generate repair checklists on the basis of fault tree logic and a one-step-ahead optimization procedure that minimizes the expected time to diagnose system failure.
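The kind of basic-event ranking that IMPORTANCE performs can be illustrated with one classical sensitivity measure, the Birnbaum importance I(i) = P(top | i failed) - P(top | i working); the tree, the numbers, and the choice of measure are assumptions for illustration, not the actual algorithm of the IMPORTANCE code:

```python
from math import prod

def prob(node, probs, gates):
    """Top-event probability assuming independent basic events."""
    if node in probs:
        return probs[node]
    kind, children = gates[node]
    p = [prob(c, probs, gates) for c in children]
    return prod(p) if kind == "AND" else 1 - prod(1 - x for x in p)

def birnbaum(event, probs, gates, top="TOP"):
    """Sensitivity of the top event to one basic event: re-evaluate the tree
    with the event forced certain, then forced impossible."""
    certain = prob(top, {**probs, event: 1.0}, gates)
    never = prob(top, {**probs, event: 0.0}, gates)
    return certain - never

# hypothetical two-event series (AND) system
gates = {"TOP": ("AND", ["A", "B"])}
probs = {"A": 0.1, "B": 0.2}
ranking = sorted(probs, key=lambda e: birnbaum(e, probs, gates), reverse=True)
```

For this AND gate the importance of each event equals the other event's failure probability, so the rarer component ranks higher, matching the intuition that it is the more critical single point of failure.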
NASA Astrophysics Data System (ADS)
Kamide, Norihiro; Kaneiwa, Ken
An extended full computation-tree logic, CTLS*, is introduced as a Kripke semantics with a sequence modal operator. This logic can appropriately represent hierarchical tree structures where sequence modal operators in CTLS* are applied to tree structures. An embedding theorem of CTLS* into CTL* is proved. The validity, satisfiability and model-checking problems of CTLS* are shown to be decidable. An illustrative example of biological taxonomy is presented using CTLS* formulas.
[The Application of the Fault Tree Analysis Method in Medical Equipment Maintenance].
Liu, Hongbin
2015-11-01
In this paper, the traditional fault tree analysis method is presented, and detailed instructions are given for its application in medical instrument maintenance. Significant changes are made when the traditional fault tree analysis method is introduced into medical instrument maintenance: the logic symbols, logic analysis, and calculations are abandoned, along with the method's complicated procedures, and only the intuitive and practical fault tree diagram is kept. The fault tree diagram itself also differs: the fault tree is no longer a logical tree but a thinking tree for troubleshooting, the definition of the fault tree's nodes is different, and the composition of the fault tree's branches is also different.
Lognormal Approximations of Fault Tree Uncertainty Distributions.
El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P
2018-01-26
Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of basic events that are input to the model. Typically, the basic event probabilities are not known exactly, but are modeled as probability distributions: therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and Wilks's method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
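The Monte Carlo baseline that the article compares against can be sketched as follows. A lognormal basic-event distribution is parameterized here by its median and an error factor EF (the ratio of the 95th percentile to the median, so sigma = ln(EF)/1.645); the two-event OR tree, the medians, and the error factors are hypothetical:

```python
import math
import random

random.seed(0)

def sample_lognormal(median, error_factor):
    """Draw one basic-event probability; EF = 95th percentile / median."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

def top_sample():
    """One Monte Carlo trial: OR gate over two independent basic events
    whose probabilities are themselves uncertain."""
    a = sample_lognormal(1e-3, 3.0)
    b = sample_lognormal(2e-3, 3.0)
    return 1 - (1 - a) * (1 - b)

samples = sorted(top_sample() for _ in range(20000))
median_top = samples[len(samples) // 2]
p95_top = samples[int(0.95 * len(samples))]
```

The article's point is that a closed-form lognormal approximation can reproduce percentiles like these without drawing tens of thousands of samples.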
Evaluation of properties over phylogenetic trees using stochastic logics.
Requeno, José Ignacio; Colom, José Manuel
2016-06-14
Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logics that imposes restrictions on a Boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we desire to inspect over the phylogeny, the verifier returns true if the specification is satisfied or a counterexample that falsifies it. However, this approach has only been considered over qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software for optimizing the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, sparing biologists the extension, upgrade, debugging, and maintenance of a software tool.
A set of benchmarks justifies the feasibility of our approach.
Coloured Logic Petri Nets and analysis of their reachable trees
NASA Astrophysics Data System (ADS)
Wang, Jing; Du, YuYue; Yu, ShuXia
2015-11-01
Logic Petri nets (LPNs) can describe and analyse the batch processing function and passing-value indeterminacy in cooperative systems, and alleviate the state space explosion problem. However, the indeterminate data of logical output transitions cannot be described explicitly in LPNs. Therefore, Coloured Logic Petri nets (CLPNs) are defined in this paper. They can determine the indeterminate data of logical output transitions in LPNs, i.e., the indeterminate data can be represented explicitly in CLPNs. A vector matching method is proposed to judge the enabled transitions and analyse CLPNs. From the marking equation and the proposed reachable tree generation algorithm of CLPNs, a reachable tree can be built and the reachable markings can be calculated. The advantage of CLPNs is shown by the number of leaf nodes of the reachability tree, and CLPNs can resolve the indeterminate data of logical output transitions. Finally, an example shows that CLPNs can further reduce the dimensionality of reachable markings.
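The reachable-tree construction that CLPNs extend can be sketched, for an ordinary bounded place/transition net, as breadth-first firing of enabled transitions from the initial marking (the CLPN-specific coloured tokens and logic transitions are not modelled in this sketch):

```python
from collections import deque

def reachability_tree(initial, transitions):
    """transitions: name -> (consume, produce) token vectors over the places.
    Returns the tree edges as (marking, transition, marking) triples."""
    seen = {initial}
    edges = []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for name, (consume, produce) in transitions.items():
            # a transition is enabled when every input place holds enough tokens
            if all(m[i] >= consume[i] for i in range(len(m))):
                m2 = tuple(m[i] - consume[i] + produce[i] for i in range(len(m)))
                edges.append((m, name, m2))
                if m2 not in seen:          # expand each marking only once
                    seen.add(m2)
                    queue.append(m2)
    return edges

# toy net: one token moves from place p0 to place p1 via transition t
edges = reachability_tree((1, 0), {"t": ((1, 0), (0, 1))})
```

Note the construction terminates only for bounded nets; the paper's point is that CLPN markings keep the tree smaller than the plain LPN equivalent.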
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarrack, A.G.
The purpose of this report is to document fault tree analyses which have been completed for the Defense Waste Processing Facility (DWPF) safety analysis. Logic models for equipment failures and human error combinations that could lead to flammable gas explosions in various process tanks, or to failure of critical support systems, were developed for internal initiating events and for earthquakes. These fault trees provide frequency estimates for support system failures and accidents that could lead to radioactive and hazardous chemical releases both on-site and off-site. Top event frequency results from these fault trees will be used in further APET analyses to calculate the accident risk associated with DWPF facility operations. This report lists and explains important underlying assumptions, provides references for failure data sources, and briefly describes the fault tree method used. Specific commitments from DWPF to provide new procedural/administrative controls or system design changes are listed in the ''Facility Commitments'' section. The purpose of the ''Assumptions'' section is to clarify the basis for fault tree modeling; it is not necessarily a list of items required to be protected by Technical Safety Requirements (TSRs).
Preventing medical errors by designing benign failures.
Grout, John R
2003-07-01
One way to successfully reduce medical errors is to design health care systems that are more resistant to the tendencies of human beings to err. One interdisciplinary approach entails creating design changes, mitigating human errors, and making human error irrelevant to outcomes. This approach is intended to facilitate the creation of benign failures, which have elsewhere been called mistake-proofing devices and forcing functions. USING FAULT TREES TO DESIGN FORCING FUNCTIONS: A fault tree is a graphical tool used to understand the relationships that either directly cause or contribute to the cause of a particular failure. A careful analysis of a fault tree enables the analyst to anticipate how the process will behave after a change. EXAMPLE OF AN APPLICATION: A scenario in which a patient is scalded while bathing can serve as an example of how multiple fault trees can be used to design forcing functions. The first fault tree shows the undesirable event: patient scalded while bathing. The second fault tree has a benign event: no water. Adding a scald valve changes the outcome from the undesirable event ("patient scalded while bathing") to the benign event ("no water"). Analysis of fault trees does not ensure or guarantee that the changes necessary to eliminate error actually occur. Most mistake-proofing is used to prevent simple errors and to create well-defended processes, but complex errors can also result. The utilization of mistake-proofing or forcing functions can be thought of as changing the logic of a process. Errors that formerly caused undesirable failures can be converted into the causes of benign failures. The use of fault trees can provide a variety of insights into the design of forcing functions that will improve patient safety.
Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA
NASA Astrophysics Data System (ADS)
Lorito, S.
2013-05-01
The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability of assessing earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models than ever of the rupture process of mega-thrust earthquakes, thus paving the way for future improved hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Worldwide recent research efforts of the tsunami science community have begun to fill this gap and to define some best practices that are being progressively employed in PTHA for different regions and coasts at risk. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes and highlight some of their surprising features that likely result in bigger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions which prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort to improve PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties and the computational feasibility of PTHA when considering the full assumed source variability. Usually only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior.
The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit, although different methods such as event trees have been used for different applications. I will define a quite general PTHA framework based on the mixed use of logic and event trees. I will first discuss a particular class of epistemic uncertainties, i.e. those related to the parametric fault characterization in terms of geometry, kinematics, and assessment of activity rates. A systematic classification into six justification levels of the epistemic uncertainty related to the existence and behaviour of fault sources will be presented. Then, a particular branch of the logic tree is chosen in order to discuss just the aleatory variability of earthquake parameters, represented with an event tree. Even so, PTHA based on numerical scenarios is too demanding a computational task, particularly when probabilistic inundation maps are needed. To reduce the computational burden without under-representing the source variability, the event tree is first constructed by densely (over-)sampling the earthquake parameter space, and the earthquakes are then filtered based on their associated tsunami impact offshore, before inundation maps are calculated. I'll describe this approach by means of a case study in the Mediterranean Sea, namely the PTHA for some locations on the Eastern Sicily coast and the Southern Crete coast due to potential subduction earthquakes occurring on the Hellenic Arc.
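The two-stage screening described above (dense sampling of the source parameter space, then filtering by offshore impact before expensive inundation modelling) can be sketched as follows. The parameter values, the impact proxy, and the threshold are all invented placeholders; a real PTHA would use precomputed numerical tsunami propagation instead of a closed-form proxy.

```python
# Sketch of event-tree over-sampling followed by offshore filtering.
# All numbers and the amplitude proxy are assumed for illustration only.
import itertools
import math

magnitudes = [7.0, 7.5, 8.0, 8.5]
depths_km = [10, 20, 30]
positions = ["west", "center", "east"]

def offshore_amplitude_proxy(mag, depth_km):
    # Crude stand-in for a precomputed offshore wave height (metres).
    return math.exp(mag - 7.0) / depth_km * 5.0

threshold_m = 0.5  # only scenarios above this offshore height reach inundation modelling

scenarios = list(itertools.product(magnitudes, depths_km, positions))
selected = [(m, d, p) for (m, d, p) in scenarios
            if offshore_amplitude_proxy(m, d) >= threshold_m]

print(f"{len(selected)} of {len(scenarios)} scenarios retained for inundation modelling")
```

The point of the filter is that inundation maps are computed only for the retained subset, while the discarded scenarios still contribute (negligibly) to offshore hazard.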
Knowledge engineering in volcanology: Practical claims and general approach
NASA Astrophysics Data System (ADS)
Pshenichny, Cyril A.
2014-10-01
Knowledge engineering, being a branch of artificial intelligence, offers a variety of methods for elicitation and structuring of knowledge in a given domain. Only a few of them (ontologies and semantic nets, event/probability trees, Bayesian belief networks and event bushes) are known to volcanologists. Meanwhile, the tasks faced by volcanology and the solutions found so far favor a much wider application of knowledge engineering, especially tools for handling dynamic knowledge. This raises some fundamental logical and mathematical problems and requires an organizational effort, but may strongly improve panel discussions, enhance decision support, optimize physical modeling and support scientific collaboration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
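Characteristic (3), generation of cut sets from fault tree logic, can be illustrated with a tiny MOCUS-style expansion. The gate layout below is hypothetical, not taken from the SAPHIRE plant models.

```python
# MOCUS-style minimal cut set generation for a small, invented fault tree.
# A gate maps to ("AND"/"OR", children); children not in the dict are basic events.

def expand(gate, gates):
    """Return the minimal cut sets (frozensets of basic events) for a gate."""
    kind, children = gates[gate]
    child_sets = []
    for c in children:
        if c in gates:                     # child is another gate: recurse
            child_sets.append(expand(c, gates))
        else:                              # child is a basic event
            child_sets.append([frozenset([c])])
    if kind == "OR":                       # OR: union of the children's cut sets
        result = [cs for sets in child_sets for cs in sets]
    else:                                  # AND: cross-product, merging events
        result = [frozenset()]
        for sets in child_sets:
            result = [a | b for a in result for b in sets]
    result = list(set(result))             # drop duplicates
    # keep only minimal cut sets (no proper subset also present)
    return [cs for cs in result if not any(other < cs for other in result)]

gates = {
    "TOP": ("OR", ["G1", "B3"]),
    "G1": ("AND", ["B1", "B2"]),
}
cut_sets = expand("TOP", gates)
print(cut_sets)  # {B1, B2} and {B3}
```

Production codes add subsumption checks and truncation to keep the combinatorics tractable, but the recursion above is the core of "generation and retention of all cutsets."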
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B
2010-11-15
The hazards associated with major accident hazard (MAH) industries are fire, explosion, and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from the storage and filling facility of a chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and a proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
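The fuzzy FTA idea can be sketched with triangular fuzzy probabilities. This is a generic sketch, not the paper's TDFFTA: a triangular fuzzy probability is written (low, mode, high), and gate operations are applied component-wise, which is a common approximation that treats products of triangular numbers as triangular. The basic events and numbers are invented.

```python
# Triangular-fuzzy-number gate arithmetic (minimal sketch, not TDFFTA itself).
# A fuzzy probability is (low, mode, high); component-wise gate operations are
# a standard approximation for independent events.

def fuzzy_and(a, b):
    return tuple(x * y for x, y in zip(a, b))

def fuzzy_or(a, b):
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

# Hypothetical basic events on a chlorine-release branch (assumed numbers).
valve_leak = (0.001, 0.005, 0.010)
sensor_fails = (0.010, 0.020, 0.050)

# Release on this branch requires both a leak and an undetected leak.
release = fuzzy_and(valve_leak, sensor_fails)
print(release)
```

The spread between `release[0]` and `release[2]` carries the expert's hesitation through the tree instead of collapsing it to a single crisp value.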
MacLeod, Dave; Charlebois, Robert L; Doolittle, Ford; Bapteste, Eric
2005-01-01
Background: When organismal phylogenies based on sequences of single marker genes are poorly resolved, a logical approach is to add more markers, on the assumption that weak but congruent phylogenetic signal will be reinforced in such multigene trees. Such approaches are valid only when the several markers indeed have identical phylogenies, an issue which many multigene methods (such as the use of concatenated gene sequences or the assembly of supertrees) do not directly address. Indeed, even when the true history is a mixture of vertical descent for some genes and lateral gene transfer (LGT) for others, such methods produce unique topologies. Results: We have developed software that aims to extract evidence for vertical and lateral inheritance from a set of gene trees compared against an arbitrary reference tree. This evidence is then displayed as a synthesis showing support over the tree for vertical inheritance, overlaid with explicit lateral gene transfer (LGT) events inferred to have occurred over the history of the tree. Like splits-tree methods, one can thus identify nodes at which conflict occurs. Additionally one can make reasonable inferences about vertical and lateral signal, assigning putative donors and recipients. Conclusion: A tool such as ours can serve to explore the reticulated dimensionality of molecular evolution, by dissecting vertical and lateral inheritance at high resolution. By this, we mean that individual nodes can be examined not only for congruence, but also for coherence in light of LGT. We assert that our tools will facilitate the comparison of phylogenetic trees, and the interpretation of conflicting data. PMID:15819979
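The congruence check at the heart of such comparisons can be sketched with bipartitions (splits): two splits on the same taxon set conflict when none of the four side-intersections is empty. The five-taxon trees below are invented toy inputs, given directly as their non-trivial splits rather than parsed from Newick.

```python
# Split-based congruence check between a gene tree and a reference tree.
# Toy example: trees over taxa A-E given as sets of non-trivial bipartitions.

taxa = frozenset("ABCDE")

def splits(*groups):
    # each group names one side of a split; the other side is its complement
    return {frozenset(g) for g in groups}

reference = splits("AB", "ABC")  # reference topology
gene_tree = splits("AB", "ABD")  # one deep split disagrees

shared = reference & gene_tree  # congruent (candidate vertical) signal

# Two splits are incompatible iff all four side-intersections are non-empty.
conflict = {s for s in gene_tree
            if any(s & r and s - r and r - s and (taxa - s) - r
                   for r in reference)}  # candidate LGT signal

print(sorted("".join(sorted(s)) for s in shared))
print(sorted("".join(sorted(s)) for s in conflict))
```

Shared splits support vertical descent; incompatible splits are the nodes where a lateral event (or other conflict) must be invoked.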
NASA Astrophysics Data System (ADS)
Krechowicz, Maria
2017-10-01
Nowadays, one of the characteristic features of the construction industry is the increased complexity of a growing number of projects. Almost every construction project is unique, with its project-specific purpose, its own structural complexity, owner's expectations, ground conditions unique to a certain location, and its own dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high. Project complexity drivers pose many vulnerabilities to the successful completion of a number of projects. This paper discusses the process of effective risk management in complex construction projects in which renewable energy sources were used, taking as an example the realization phase of the ENERGIS teaching-laboratory building, from the point of view of DORBUD S.A., its general contractor. This paper suggests a new approach to risk management for complex construction projects in which renewable energy sources are applied. The risk management process was divided into six stages: gathering information, identification of the top critical project risks resulting from the project complexity, construction of a fault tree for each top critical risk, logical analysis of the fault tree, quantitative risk assessment applying fuzzy logic, and development of a risk response strategy. A new methodology for the qualitative and quantitative assessment of top critical risks in complex construction projects was developed. Risk assessment was carried out applying fuzzy fault tree analysis to the example of one top critical risk. Application of fuzzy set theory to the proposed model reduced uncertainty and eliminated the problems of obtaining crisp values for basic-event probabilities, which are common in expert risk assessment, while still giving an exact risk score for the probability of each unwanted event.
Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua
2015-01-01
The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M≥6.7 events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures.
This sensitivity, coupled with the approximate nature of the model and known limitations, means the applicability of UCERF3 should be evaluated on a case-by-case basis.
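A logic tree producing thousands of forecasts is just the weighted cross-product of its branching levels. The sketch below uses invented branch names and weights with far fewer levels than UCERF3's actual tree; it only illustrates how branch counts and weights compose.

```python
# Enumerating weighted logic-tree branches (toy levels and weights, not UCERF3's).
import itertools

levels = {
    "fault_model": [("FM_A", 0.5), ("FM_B", 0.5)],
    "slip_rate": [("geologic", 0.3), ("geodetic", 0.3), ("average", 0.4)],
    "prob_model": [("poisson", 0.3), ("renewal_low", 0.3), ("renewal_mid", 0.4)],
}

branches = []
for combo in itertools.product(*levels.values()):
    names = tuple(name for name, _ in combo)
    weight = 1.0
    for _, w in combo:
        weight *= w                 # branch weight = product of level weights
    branches.append((names, weight))

print(len(branches))                # 2 * 3 * 3 = 18 forecasts
```

Each tuple is one complete forecast; hazard curves from all branches are then combined using these weights, and sensitivity analysis asks how much each level's choice moves the weighted result.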
Canonical multi-valued input Reed-Muller trees and forms
NASA Technical Reports Server (NTRS)
Perkowski, M. A.; Johnson, P. D.
1991-01-01
There has recently been increased interest in logic synthesis using EXOR gates. The paper introduces the fundamental concept of Orthogonal Expansion, which generalizes the ring form of the Shannon expansion to logic with multiple-valued (mv) inputs. Based on this concept, we are able to define a family of canonical tree circuits. Such circuits can be considered for binary and multiple-valued input cases. They can be multi-level (trees and DAGs) or flattened to two-level AND-EXOR circuits. Input decoders similar to those used in Sum of Products (SOP) PLAs are used in realizations of multiple-valued input functions. In the case of binary logic, the family of flattened AND-EXOR circuits includes several forms discussed by Davio and Green. For the case of logic with multiple-valued inputs, the family of flattened mv AND-EXOR circuits includes three expansions known from the literature and two new expansions.
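In the binary special case, the ring (EXOR) form of the Shannon expansion is f = f0 XOR x·(f0 XOR f1), and repeatedly applying it yields the positive-polarity Reed-Muller (ANF) coefficients. A standard way to compute them is the fast Möbius butterfly over the truth table, sketched here as a generic illustration of that binary case (not the paper's mv algorithm):

```python
# Reed-Muller (ANF) coefficients via the fast Moebius (XOR butterfly) transform.

def anf_coefficients(truth_table):
    """truth_table[i] = f at input i, where bit k of i is variable x_k.
    Returns the ANF coefficient vector: coeffs[m] multiplies the AND of the
    variables selected by the bits of m, all combined with XOR."""
    coeffs = list(truth_table)
    n = len(coeffs).bit_length() - 1
    for k in range(n):
        step = 1 << k
        for i in range(len(coeffs)):
            if i & step:
                coeffs[i] ^= coeffs[i ^ step]  # XOR butterfly combining cofactors
    return coeffs

# NAND of two variables: truth table for inputs 00, 01, 10, 11.
print(anf_coefficients([1, 1, 1, 0]))  # [1, 0, 0, 1], i.e. 1 XOR x0*x1
```

The transform is an involution over GF(2), so applying it twice recovers the truth table, which is what makes these AND-EXOR forms canonical.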
A computational framework for prime implicants identification in noncoherent dynamic systems.
Di Maio, Francesco; Baronchelli, Samuele; Zio, Enrico
2015-01-01
Dynamic reliability methods aim at complementing the capability of traditional static approaches (e.g., event trees [ETs] and fault trees [FTs]) by accounting for the system dynamic behavior and its interactions with the system state transition process. For this, the system dynamics is here described by a time-dependent model that includes the dependencies with the stochastic transition events. In this article, we present a novel computational framework for dynamic reliability analysis whose objectives are i) accounting for discrete stochastic transition events and ii) identifying the prime implicants (PIs) of the dynamic system. The framework entails adopting a multiple-valued logic (MVL) to consider stochastic transitions at discretized times. Then, PIs are originally identified by a differential evolution (DE) algorithm that looks for the optimal MVL solution of a covering problem formulated for MVL accident scenarios. For testing the feasibility of the framework, a dynamic noncoherent system composed of five components that can fail at discretized times has been analyzed, showing the applicability of the framework to practical cases. © 2014 Society for Risk Analysis.
Design and implementation of the tree-based fuzzy logic controller.
Liu, B D; Huang, C Y
1997-01-01
In this paper, a tree-based approach is proposed to design the fuzzy logic controller. Based on the proposed methodology, the fuzzy logic controller has the following merits: the fuzzy control rules can be extracted automatically from the input-output data of the system, and the extraction process can be done in one pass; owing to the fuzzy tree inference structure, the search space of the fuzzy inference process is greatly reduced; the operation of the inference process can be simplified to a one-dimensional matrix operation because of the fuzzy tree approach; and the controller has regular and modular properties, so it is easy to implement in hardware. Furthermore, the proposed fuzzy tree approach has been applied to design a color reproduction system for verifying the proposed methodology. The color reproduction system is mainly used to obtain a color image through the printer that is identical to the original one. In addition to the software simulation, an FPGA is used to implement the prototype hardware system for real-time application. Experimental results show that the effect of color correction is quite good and that the prototype hardware system can operate correctly at a 30 MHz clock rate.
Trimming the UCERF2 hazard logic tree
Porter, Keith A.; Field, Edward H.; Milner, Kevin
2012-01-01
The Uniform California Earthquake Rupture Forecast 2 (UCERF2) is a fully time‐dependent earthquake rupture forecast developed with sponsorship of the California Earthquake Authority (Working Group on California Earthquake Probabilities [WGCEP], 2007; Field et al., 2009). UCERF2 contains 480 logic‐tree branches reflecting choices among nine modeling uncertainties in the earthquake rate model shown in Figure 1. For seismic hazard analysis, it is also necessary to choose a ground‐motion‐prediction equation (GMPE) and set its parameters. Choosing among four next‐generation attenuation (NGA) relationships results in a total of 1920 hazard calculations per site. The present work is motivated by a desire to reduce the computational effort involved in a hazard analysis without understating uncertainty. We set out to assess which branching points of the UCERF2 logic tree contribute most to overall uncertainty, and which might be safely ignored (set to only one branch) without significantly biasing results or affecting some useful measure of uncertainty. The trimmed logic tree will have all of the original choices from the branching points that contribute significantly to uncertainty, but only one arbitrarily selected choice from the branching points that do not.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question.
The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values. None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The risk reduction ratio (RRR) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event.
Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The risk increase ratio (RIR) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
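The RRR and RIR importance measures described above can be sketched on a toy cut set model. The cut sets and probabilities below are invented; SAPHIRE computes the same quantities from the full MSET fault tree.

```python
# Risk reduction ratio (RRR) and risk increase ratio (RIR) on a toy model.
# Cut sets and basic-event probabilities are hypothetical.

def top_probability(cutsets, probs):
    # Rare-event approximation: P(top) ~ sum over cut sets of their product.
    total = 0.0
    for cs in cutsets:
        p = 1.0
        for event in cs:
            p *= probs[event]
        total += p
    return total

cutsets = [{"A", "B"}, {"C"}]
probs = {"A": 0.1, "B": 0.2, "C": 0.05}
base = top_probability(cutsets, probs)

for event in sorted(probs):
    perfect = dict(probs, **{event: 0.0})  # event never fails
    failed = dict(probs, **{event: 1.0})   # event always fails
    rrr = base / top_probability(cutsets, perfect)  # risk reduction ratio
    rir = top_probability(cutsets, failed) / base   # risk increase ratio
    print(f"{event}: RRR = {rrr:.3f}, RIR = {rir:.3f}")
```

Event C (a single-event cut set) dominates both measures, which is exactly the kind of ranking the SAPHIRE importance reports provide.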
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review Based on Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and for this reason they often are not; as a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
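The RPN scale mentioned above comes from rating severity, occurrence, and detection each from 1 to 10, so RPN = S × O × D ranges from 1 to 1000. The failure modes and ratings below are invented examples showing how the prioritization works:

```python
# RPN scoring and ranking as used in FMEA. Failure modes and 1-10 ratings
# are invented for illustration.

failure_modes = [
    # (name, severity, occurrence, detection)
    ("connector corrosion", 7, 4, 6),
    ("relay sticks closed", 9, 2, 3),
    ("seal fatigue", 5, 6, 8),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name:22s} RPN = {s * o * d}")
```

Note that a moderate-severity mode ("seal fatigue", RPN 240) outranks the highest-severity one here, which is a known criticism of multiplying the three ratings and one reason FTA is proposed as a complement.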
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.
The Idaho National Engineering Laboratory (INEL) over the past three years has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
A seismic hazard uncertainty analysis for the New Madrid seismic zone
Cramer, C.H.
2001-01-01
A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
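Monte Carlo sampling of a logic tree to estimate a COV can be sketched as follows. The branch values and weights are invented stand-ins; a real analysis would sample source location, attenuation relation, magnitude, and recurrence branches and run a full hazard computation for each sample.

```python
# Monte Carlo sampling of a toy logic tree to estimate the COV of a hazard value.
# Branch values and weights are assumed for illustration only.
import random
import statistics

random.seed(0)

def sample_branch(branches):
    # branches: list of (value, weight) with weights summing to 1
    r, cum = random.random(), 0.0
    for value, weight in branches:
        cum += weight
        if r <= cum:
            return value
    return branches[-1][0]

attenuation = [(0.8, 0.5), (1.0, 0.3), (1.3, 0.2)]  # relative ground-motion factors
magnitude = [(1.0, 0.6), (1.2, 0.4)]                # relative magnitude factors

samples = [0.25 * sample_branch(attenuation) * sample_branch(magnitude)
           for _ in range(10_000)]

cov = statistics.pstdev(samples) / statistics.mean(samples)
print(f"COV of sampled hazard: {cov:.2f}")
```

Mapping this COV over a grid of sites is what produces the spatial pattern described above (high near the source zone, low away from it).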
Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)
NASA Astrophysics Data System (ADS)
Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.
2016-06-01
We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
The Quantification of Consistent Subjective Logic Tree Branch Weights for PSHA
NASA Astrophysics Data System (ADS)
Runge, A. K.; Scherbaum, F.
2012-04-01
The development of quantitative models for the rate of exceedance of seismically generated ground motion parameters is the target of probabilistic seismic hazard analysis (PSHA). In regions of low to moderate seismicity, the selection and evaluation of source and/or ground-motion models is often a major challenge to hazard analysts and is affected by large epistemic uncertainties. In PSHA this type of uncertainty is commonly treated within a logic tree framework in which the branch weights express the degree-of-belief values of an expert in the corresponding set of models. For the calculation of the distribution of hazard curves, these branch weights are subsequently used as subjective probabilities. However, the quality of the results depends strongly on the "quality" of the expert knowledge. A major challenge for experts in this context is to provide weight estimates which are logically consistent (in the sense of Kolmogorov's axioms) and to be aware of, and deal with, the multitude of heuristics and biases which affect human judgment under uncertainty. For example, people tend to give smaller weights to each branch of a logic tree the more branches it has, starting with equal weights for all branches and then adjusting this uniform distribution based on their beliefs about how the branches differ.
This effect is known as pruning bias [1]. A similar unwanted effect, which may even wrongly suggest robustness of the corresponding hazard estimates, will appear in cases where all models are first judged according to some numerical quality measure approach and the resulting weights are subsequently normalized to sum up to one [2]. To address these problems, we have developed interactive graphical tools for the determination of logic tree branch weights in the form of logically consistent subjective probabilities, based on the concepts suggested in Curtis and Wood (2004) [3]. Instead of determining the set of weights for all the models in a single step, the computer-driven elicitation process is performed as a sequence of evaluations of relative weights for small subsets of models which are presented to the analyst. From these, the distribution of logic tree weights for the whole model set is determined as the solution of an optimization problem. The model subset presented to the analyst in each step is designed to maximize the expected information. The result of this process is a set of logically consistent weights together with a measure of confidence determined from the amount of conflicting information provided by the expert during the relative weighting process.
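Turning pairwise relative judgments into one consistent weight set can be sketched with the log-least-squares (geometric-mean) method. This is a generic stand-in for the optimization step described above, not the authors' tool; the models and ratio judgments are invented, and here every pair is judged exactly once, for which simple log-averaging is the closed-form least-squares solution.

```python
# Consistent logic-tree weights from pairwise ratio judgments w_i / w_j.
# Models and ratios are hypothetical; solves log w_i - log w_j = log r_ij
# in the least-squares sense, then normalizes so the weights sum to 1.
import math

models = ["M1", "M2", "M3"]
ratios = {("M1", "M2"): 2.0, ("M2", "M3"): 1.5, ("M1", "M3"): 3.0}

def full_ratio(i, j):
    if i == j:
        return 1.0
    if (i, j) in ratios:
        return ratios[(i, j)]
    return 1.0 / ratios[(j, i)]   # reciprocal fills the other triangle

# Closed form when every pair is judged once: log w_i = mean_j log r_ij.
log_w = {i: sum(math.log(full_ratio(i, j)) for j in models) / len(models)
         for i in models}
total = sum(math.exp(v) for v in log_w.values())
weights = {i: math.exp(log_w[i]) / total for i in models}
print({k: round(v, 3) for k, v in weights.items()})
```

Because the example judgments are mutually consistent (2 × 1.5 = 3), the recovered weights are exact (6/11, 3/11, 2/11); with conflicting judgments the residual of the same least-squares fit provides the confidence measure the abstract mentions.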
Shao, Q; Rowe, R C; York, P
2007-06-01
Understanding of the cause-effect relationships between formulation ingredients, process conditions and product properties is essential for developing a quality product. However, the formulation knowledge is often hidden in experimental data and not easily interpretable. This study compares neurofuzzy logic and decision tree approaches in discovering hidden knowledge from an immediate release tablet formulation database relating formulation ingredients (silica aerogel, magnesium stearate, microcrystalline cellulose and sodium carboxymethylcellulose) and process variables (dwell time and compression force) to tablet properties (tensile strength, disintegration time, friability, capping and drug dissolution at various time intervals). Both approaches successfully generated useful knowledge in the form of either "if then" rules or decision trees. Although different strategies are employed by the two approaches in generating rules/trees, similar knowledge was discovered in most cases. However, as decision trees are not able to deal with continuous dependent variables, data discretisation procedures are generally required.
LEGO-MM: LEarning structured model by probabilistic loGic Ontology tree for MultiMedia.
Tang, Jinhui; Chang, Shiyu; Qi, Guo-Jun; Tian, Qi; Rui, Yong; Huang, Thomas S
2016-09-22
Recent advances in multimedia ontology have resulted in a number of concept models, e.g., LSCOM and Mediamill 101, which are accessible and public to other researchers. However, most current research effort still focuses on building new concepts from scratch, and very little work explores appropriate methods for constructing new concepts upon the existing models already in the warehouse. To address this issue, we propose a new framework in this paper, termed LEGO-MM, which can seamlessly integrate both the new target training examples and the existing primitive concept models to infer more complex concept models. LEGO-MM treats the primitive concept models as Lego toys with which to construct a potentially unlimited vocabulary of new concepts. Specifically, we first formulate the logic operations to be the Lego connectors that combine existing concept models hierarchically in probabilistic logic ontology trees. Then, we incorporate new target training information simultaneously to efficiently disambiguate the underlying logic tree and correct the error propagation. Extensive experiments are conducted on a large vehicle-domain data set from ImageNet. The results demonstrate that LEGO-MM has significantly superior performance over existing state-of-the-art methods, which build new concept models from scratch.
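A minimal sketch of the "logic operations as Lego connectors" idea, under the simplifying assumption that primitive concept scores are independent probabilities: probabilistic AND and OR combine existing detector outputs into a new composite concept node. The detector names and scores are invented for illustration and are not from the paper.

```python
def p_and(*ps):
    """Probabilistic AND: product of probabilities (independence assumed)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    """Probabilistic OR (noisy-OR): complement of the product of complements."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# hypothetical primitive detectors applied to one image
p_truck, p_bus = 0.7, 0.4
p_large_vehicle = p_or(p_truck, p_bus)   # an OR node of the ontology tree
```

Composite nodes built this way can themselves feed higher nodes, giving the hierarchical probabilistic logic tree the abstract describes.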
Flexible programmable logic module
Robertson, Perry J.; Hutchinson, Robert L.; Pierson, Lyndon G.
2001-01-01
The circuit module of this invention is a VME board containing a plurality of programmable logic devices (PLDs), a controlled impedance clock tree, and interconnecting buses. The PLDs are arranged to permit systolic processing of a problem by offering wide data buses and a plurality of processing nodes. The board contains a clock reference and clock distribution tree that can drive each of the PLDs with two critically timed clock references. External clock references can be used to drive additional circuit modules all operating from the same synchronous clock reference.
Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques
NASA Technical Reports Server (NTRS)
Hardy, Terry L.; Rapp, Douglas C.
1994-01-01
The reliability of rocket engine systems was analyzed by using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then were used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, probabilistic techniques emphasized uncertainty resulting from randomness in the system whereas fuzzy logic techniques emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
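The fuzzy side of such an analysis can be sketched with triangular fuzzy numbers: a vague failure probability is carried as a (low, most likely, high) triple, and because the AND/OR gate formulas are monotone in each argument, the endpoints propagate directly. This is a simplified illustration of fuzzy fault tree arithmetic, not the FUZZYFTA code; the component names and numbers are hypothetical.

```python
def tri_and(a, b):
    """AND gate on triangular fuzzy probabilities: componentwise product
    (valid because the product is monotone in each argument)."""
    return tuple(x * y for x, y in zip(a, b))

def tri_or(a, b):
    """OR gate: 1 - (1-a)(1-b), also monotone, so endpoints map to endpoints."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

# vague basic-event probabilities as (low, most likely, high)
pump = (0.01, 0.02, 0.04)
valve = (0.005, 0.01, 0.02)
top = tri_or(pump, valve)   # top event fed by an OR gate
```

The spread of the resulting triple expresses the vagueness-driven uncertainty that the abstract contrasts with randomness-driven probabilistic uncertainty.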
Radiation tolerant combinational logic cell
NASA Technical Reports Server (NTRS)
Maki, Gary R. (Inventor); Whitaker, Sterling (Inventor); Gambles, Jody W. (Inventor)
2009-01-01
A system has a reduced sensitivity to Single Event Upset and/or Single Event Transient(s) compared to traditional logic devices. In a particular embodiment, the system includes an input, a logic block, a bias stage, a state machine, and an output. The logic block is coupled to the input. The logic block is for implementing a logic function, receiving a data set via the input, and generating a result by applying the data set to the logic function. The bias stage is coupled to the logic block. The bias stage is for receiving the result from the logic block and presenting it to the state machine. The state machine is coupled to the bias stage. The state machine is for receiving, via the bias stage, the result generated by the logic block. The state machine is configured to retain a state value for the system. The state value is typically based on the result generated by the logic block. The output is coupled to the state machine. The output is for providing the value stored by the state machine. Some embodiments of the invention produce dual rail outputs Q and Q'. The logic block typically contains combinational logic and is similar, in size and transistor configuration, to a conventional CMOS combinational logic design. However, only a very small portion of the circuits of these embodiments is sensitive to Single Event Upsets and/or Single Event Transients.
Fresch, Barbara; Bocquel, Juanita; Hiluf, Dawit; Rogge, Sven; Levine, Raphael D; Remacle, Françoise
2017-07-05
To realize low-power, compact logic circuits, one can explore parallel operation on single nanoscale devices. An added incentive is to use multivalued (as distinct from Boolean) logic. Here, we theoretically demonstrate that the computation of all the possible outputs of a multivariate, multivalued logic function can be implemented in parallel by electrical addressing of a molecule made up of three interacting dopant atoms embedded in Si. The electronic states of the dopant molecule are addressed by pulsing a gate voltage. By simulating the time evolution of the non-stationary electronic density built by the gate voltage, we show that one can implement a molecular decision tree that provides in parallel all the outputs for all the inputs of the multivariate, multivalued logic function. The outputs are encoded in the populations and in the bond orders of the dopant molecule, which can be measured using an STM tip. We show that the implementation of the molecular logic tree is equivalent to a spectral function decomposition. The function that is evaluated can be field-programmed by changing the time profile of the pulsed gate voltage. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula
2011-01-01
Due to advancements in computational ability, enhanced technology and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets comes the inherent challenges of new methods of statistical analysis and modeling. Considering a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and a Bayesian logistic regression with stochastic search variable selection.
NASA Technical Reports Server (NTRS)
Guarro, Sergio B.
2010-01-01
This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
Considering potential seismic sources in earthquake hazard assessment for Northern Iran
NASA Astrophysics Data System (ADS)
Abdollahzadeh, Gholamreza; Sazjini, Mohammad; Shahaky, Mohsen; Tajrishi, Fatemeh Zahedi; Khanmohammadi, Leila
2014-07-01
Located on the Alpine-Himalayan earthquake belt, Iran is one of the seismically active regions of the world. Northern Iran, south of the Caspian Basin, a hazardous subduction zone, is a densely populated and developing area of the country. Historical and instrumental documented seismicity indicates the occurrence of severe earthquakes leading to many deaths and large losses in the region. With the growth of seismological and tectonic data, an updated seismic hazard assessment is a worthwhile component of emergency management programs and long-term development plans for the urban and rural areas of this region. In the present study, a probabilistic seismic hazard assessment is carried out for the region using three recent ground motion prediction equations, drawing on the up-to-date information required for such an assessment: geological data and the active tectonic setting for thorough investigation of the active and potential seismogenic sources, and historical and instrumental events for compiling the earthquake catalogue. The logic tree method is utilized to capture the epistemic uncertainty of the seismic hazard assessment in the delineation of the seismic sources and the selection of attenuation relations. The results are compared to a recent code-prescribed seismic hazard practice for the region and are discussed in detail to explore their variation in each branch of the logic tree approach. Also, seismic hazard maps of peak ground acceleration at rock sites for 475- and 2,475-year return periods are provided for the region.
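The way a logic tree captures epistemic uncertainty can be sketched numerically: each branch (a choice of source model and attenuation relation) yields its own hazard curve, and the branch-weighted average gives the mean hazard. The branch weights and exceedance probabilities below are hypothetical, chosen only to show the mechanics.

```python
# hypothetical logic-tree branches: (weight, P(exceedance) at PGA = 0.1g, 0.2g, 0.4g)
branches = [
    (0.5, [0.10, 0.04, 0.010]),
    (0.3, [0.14, 0.06, 0.020]),
    (0.2, [0.08, 0.02, 0.005]),
]
# weights over mutually exclusive branches must sum to one
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12

# mean hazard curve = weight-averaged exceedance probability at each level
mean_curve = [sum(w * curve[k] for w, curve in branches) for k in range(3)]
```

The spread of the individual branch curves around `mean_curve` is what the abstract explores branch by branch; fractile curves can be read off the same weighted set.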
CRANS - CONFIGURABLE REAL-TIME ANALYSIS SYSTEM
NASA Technical Reports Server (NTRS)
Mccluney, K.
1994-01-01
In a real-time environment, the results of changes or failures in a complex, interconnected system need evaluation quickly. Tabulations showing the effects of changes and/or failures of a given item in the system are generally only useful for a single input, and only with regard to that item. Subsequent changes become harder to evaluate as combinations of failures produce a cascade effect. When confronted by multiple indicated failures in the system, it becomes necessary to determine a single cause. In this case, failure tables are not very helpful. CRANS, the Configurable Real-time ANalysis System, can interpret a logic tree, constructed by the user, describing a complex system and determine the effects of changes and failures in it. Items in the tree are related to each other by Boolean operators. The user is then able to change the state of these items (ON/OFF, FAILED/UNFAILED). The program then evaluates the logic tree based on these changes and determines any resultant changes to other items in the tree. CRANS can also search for a common cause for multiple item failures, and allow the user to explore the logic tree from within the program. A "help" mode and a reference check provide the user with a means of exploring an item's underlying logic from within the program. A commonality check determines single point failures for an item or group of items. Output is in the form of a user-defined matrix or matrices of colored boxes, each box representing an item or set of items from the logic tree. Input is via mouse selection of the matrix boxes, using the mouse buttons to toggle the state of the item. CRANS is written in C and requires the MIT X Window System, Version 11 Revision 4 or Revision 5. It requires 78K of RAM for execution and a three-button mouse. It has been successfully implemented on Sun4 workstations running SunOS, HP9000 workstations running HP-UX, and DECstations running ULTRIX.
No executable is provided on the distribution medium; however, a sample makefile is included. Sample input files are also included. The standard distribution medium is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. This program was developed in 1992.
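The core of a CRANS-style evaluation is small enough to sketch: items are leaves with ON/OFF (failed/unfailed) states or gates combining other items with Boolean operators, and the tree is re-evaluated after each state toggle. This is an illustrative sketch, not the actual CRANS implementation (which is an interactive C/X Window program); the item names are hypothetical.

```python
# a user-constructed logic tree: leaves hold states, gates hold (op, children)
tree = {
    "bus_a":   True,
    "bus_b":   True,
    "pump":    True,
    "power":   ("OR",  ["bus_a", "bus_b"]),
    "cooling": ("AND", ["power", "pump"]),
}

def evaluate(tree, item):
    """Recursively evaluate an item from its leaves and Boolean gates."""
    node = tree[item]
    if isinstance(node, bool):
        return node
    op, children = node
    values = [evaluate(tree, c) for c in children]
    return all(values) if op == "AND" else any(values)

tree["bus_a"] = False                       # fail one item...
ok_after_one = evaluate(tree, "cooling")    # ...system survives on bus_b
tree["bus_b"] = False                       # second, cascading failure
ok_after_two = evaluate(tree, "cooling")
```

Re-evaluating after each toggle is exactly the cascade analysis described above; a common-cause search amounts to asking which single leaf flips a chosen set of outputs.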
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.; Bhattacharya, Animesh; Raul, Moumita; Basuray, Amitabha
2012-07-01
The arithmetic logic unit (ALU) is the most important unit in any computing system. Optical computing is becoming popular day by day because of its ultrahigh processing speed and huge data-handling capability. For such fast processing we need an optical TALU compatible with multivalued logic. Here we report a trinary arithmetic and logic unit (TALU) in the modified trinary number (MTN) system, which is suitable for optical computation and other applications in multivalued logic systems. Savart plate and spatial light modulator (SLM) based optoelectronic circuits have been used to exploit the optical tree architecture (OTA) in an optical interconnection network.
Trimming a hazard logic tree with a new model-order-reduction technique
Porter, Keith; Field, Edward; Milner, Kevin R
2017-01-01
The size of the logic tree within the Uniform California Earthquake Rupture Forecast Version 3, Time-Dependent (UCERF3-TD) model can challenge risk analyses of large portfolios. An insurer or catastrophe risk modeler concerned with losses to a California portfolio might have to evaluate a portfolio 57,600 times to estimate risk in light of the hazard possibility space. Which branches of the logic tree matter most, and which can one ignore? We employed two model-order-reduction techniques to simplify the model. We sought a subset of parameters that must vary, and the specific fixed values for the remaining parameters, to produce approximately the same loss distribution as the original model. The techniques are (1) a tornado-diagram approach we employed previously for UCERF2, and (2) an apparently novel probabilistic sensitivity approach that seems better suited to functions of nominal random variables. The new approach produces a reduced-order model with only 60 of the original 57,600 leaves. One can use the results to reduce computational effort in loss analyses by orders of magnitude.
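The tornado-diagram technique mentioned above can be sketched simply: vary one logic-tree parameter at a time from a baseline, record the swing in the output, and fix the low-swing parameters at their baseline values to shrink the tree. The parameter names, branch values, and the stand-in loss function below are hypothetical, not UCERF3 inputs.

```python
# candidate branch values for each logic-tree parameter (hypothetical)
params = {
    "Mmax":    [7.5, 8.0, 8.5],
    "b_value": [0.8, 1.0],
    "gmpe":    [0, 1],
}
base = {"Mmax": 8.0, "b_value": 1.0, "gmpe": 0}

def loss(p):
    """Stand-in for the expensive portfolio loss analysis."""
    return 100 * p["Mmax"] + 5 * p["b_value"] + 2 * p["gmpe"]

# swing = range of the output when one parameter varies and the rest stay fixed
swings = {}
for name, values in params.items():
    outs = [loss({**base, name: v}) for v in values]
    swings[name] = max(outs) - min(outs)

# rank parameters by swing; low-swing ones are candidates to fix at baseline
ranked = sorted(swings, key=swings.get, reverse=True)
```

In a real reduction one would keep only the top-ranked parameters free, which is how 57,600 leaves can collapse to a few dozen while approximately preserving the loss distribution.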
Pre-Modeling Ensures Accurate Solid Models
ERIC Educational Resources Information Center
Gow, George
2010-01-01
Successful solid modeling requires a well-organized design tree. The design tree is a list of all the object's features and the sequential order in which they are modeled. The solid-modeling process is faster and less prone to modeling errors when the design tree is a simple and geometrically logical definition of the modeled object. Few high…
Requeno, José Ignacio; Colom, José Manuel
2014-12-01
Model checking is a generic verification technique that allows the phylogeneticist to focus on models and specifications instead of on implementation issues. Phylogenetic trees are considered as transition systems over which we interrogate phylogenetic questions written as formulas of temporal logic. Nonetheless, standard logics become insufficient for certain practices of phylogenetic analysis since they do not allow the inclusion of explicit time and probabilities. The aim of this paper is to extend the application of model checking techniques beyond qualitative phylogenetic properties and adapt the existing logical extensions and tools to the field of phylogeny. The introduction of time and probabilities in phylogenetic specifications is motivated by the study of a real example: the analysis of the ratio of lactose intolerance in some populations and the date of appearance of this phenotype.
Development of a methodology for assessing the safety of embedded software systems
NASA Technical Reports Server (NTRS)
Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.
1993-01-01
A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.
System on chip module configured for event-driven architecture
Robbins, Kevin; Brady, Charles E.; Ashlock, Tad A.
2017-10-17
A system on chip (SoC) module is described herein, wherein the SoC module comprises a processor subsystem and a hardware logic subsystem. The processor subsystem and hardware logic subsystem are in communication with one another and transmit event messages between one another. The processor subsystem executes software actors, while the hardware logic subsystem includes hardware actors. Both the software actors and the hardware actors conform to an event-driven architecture, such that they receive and generate event messages.
Using Fault Trees to Advance Understanding of Diagnostic Errors.
Rogith, Deevakar; Iyengar, M Sriram; Singh, Hardeep
2017-11-01
Diagnostic errors annually affect at least 5% of adults in the outpatient setting in the United States. Formal analytic techniques are only infrequently used to understand them, in part because of the complexity of diagnostic processes and clinical work flows involved. In this article, diagnostic errors were modeled using fault tree analysis (FTA), a form of root cause analysis that has been successfully used in other high-complexity, high-risk contexts. How factors contributing to diagnostic errors can be systematically modeled by FTA to inform error understanding and error prevention is demonstrated. A team of three experts reviewed 10 published cases of diagnostic error and constructed fault trees. The fault trees were modeled according to currently available conceptual frameworks characterizing diagnostic error. The 10 trees were then synthesized into a single fault tree to identify common contributing factors and pathways leading to diagnostic error. FTA is a visual, structured, deductive approach that depicts the temporal sequence of events and their interactions in a formal logical hierarchy. The visual FTA enables easier understanding of causative processes and cognitive and system factors, as well as rapid identification of common pathways and interactions in a unified fashion. In addition, it enables calculation of empirical estimates for causative pathways. Thus, fault trees might provide a useful framework for both quantitative and qualitative analysis of diagnostic errors. Future directions include establishing validity and reliability by modeling a wider range of error cases, conducting quantitative evaluations, and undertaking deeper exploration of other FTA capabilities. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
The Decision Tree for Teaching Management of Uncertainty
ERIC Educational Resources Information Center
Knaggs, Sara J.; And Others
1974-01-01
A 'decision tree' consists of an outline of the patient's symptoms and a logic for decision and action. It is felt that this approach to the decisionmaking process better facilitates each learner's application of his own level of knowledge and skills. (Author)
Initiating Event Analysis of a Lithium Fluoride Thorium Reactor
NASA Astrophysics Data System (ADS)
Geraci, Nicholas Charles
The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. 
In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to arrive at a list of key initiating events that can be used to address vulnerabilities during the design phases of LFTR development.
Toward the Decision Tree for Inferring Requirements Maturation Types
NASA Astrophysics Data System (ADS)
Nakatani, Takako; Kondo, Narihito; Shirogane, Junko; Kaiya, Haruhiko; Hori, Shozo; Katamine, Keiichi
Requirements are elicited step by step during the requirements engineering (RE) process. However, some types of requirements are elicited completely only after the scheduled requirements elicitation process is finished. Such a situation is regarded as a problematic situation. In our study, the difficulty of eliciting various kinds of requirements is observed by component. We refer to these components as observation targets (OTs) and introduce the term “requirements maturation,” which means when and how requirements are elicited completely in the project. Requirements maturation is discussed for physical and logical OTs. OTs viewed from a logical viewpoint are called logical OTs, e.g., quality requirements. The requirements of physical OTs, e.g., modules, components, subsystems, etc., include functional and non-functional requirements. They are influenced by their requesters' environmental changes, as well as by developers' technical changes. In order to infer the requirements maturation period of each OT, we need to know how much these factors influence the OTs' requirements maturation. Based on the observation of actual past projects, we defined the PRINCE (Pre Requirements Intelligence Net Consideration and Evaluation) model, which aims to guide developers in their observation of the requirements maturation of OTs. We quantitatively analyzed actual cases with their requirements elicitation processes and extracted essential factors that influence requirements maturation. The results of interviews with project managers were analyzed with WEKA, a data mining system, from which a decision tree was derived. This paper introduces the PRINCE model and the category of logical OTs to be observed. The decision tree that helps developers infer the maturation type of an OT is also described. We evaluate the tree through real projects and discuss its ability to infer requirements maturation types.
Briggs, Andrew H; Ades, A E; Price, Martin J
2003-01-01
In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to provide probabilistic probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
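The Dirichlet approach described above is easy to sketch with the standard library alone: a Dirichlet draw can be generated as normalized Gamma variates, and adding a pseudo-count to each observed transition count is one simple Bayesian way to handle the zero-count problem the authors mention. The counts below are hypothetical.

```python
import random

def dirichlet(alphas, rng=random.Random(1)):
    """One Dirichlet draw via normalized Gamma variates; the result is a
    valid probability vector, so branch probabilities sum to exactly 1."""
    g = [rng.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]

# observed transition counts for one row of a 3-state Markov model;
# the third transition was never observed (the zero-count problem)
counts = [20, 5, 0]

# a uniform prior (add 1 to each count) keeps every branch possible
row = dirichlet([c + 1 for c in counts])
```

Drawing a fresh `row` on each probabilistic sensitivity analysis iteration gives branch probabilities that always sum to one, avoiding the inconsistencies that arise when branches are varied independently.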
Graded Alternating-Time Temporal Logic
NASA Astrophysics Data System (ADS)
Faella, Marco; Napoli, Margherita; Parente, Mimmo
Graded modalities enrich the universal and existential quantifiers with the capability to express the concept of at least k or all but k, for a non-negative integer k. Recently, temporal logics such as the μ-calculus and Computation Tree Logic (Ctl) augmented with graded modalities have received attention from the scientific community, both from a theoretical side and from an applicative perspective. Both the μ-calculus and Ctl naturally apply as specification languages for closed systems: in this paper, we add graded modalities to the Alternating-time Temporal Logic (Atl) introduced by Alur et al. to study how these modalities may affect specification languages for open systems.
NASA Technical Reports Server (NTRS)
Canaris, J.
1991-01-01
A new logic family, which is immune to single event upsets, is described. Members of the logic family are capable of recovery, regardless of the shape of the upsetting event. Glitch propagation from an upset node is also blocked. Logic diagrams for an Inverter, Nor, Nand, and Complex Gates are provided. The logic family can be implemented in a standard, commercial CMOS process with no additional masks. DC, transient, static power, upset recovery and layout characteristics of the new family, based on a commercial 1 micron CMOS N-Well process, are described.
Multi scenario seismic hazard assessment for Egypt
NASA Astrophysics Data System (ADS)
Mostafa, Shaimaa Ismail; Abd el-aal, Abd el-aziz Khairy; El-Eraki, Mohamed Ahmed
2018-01-01
Egypt is located in the northeastern corner of Africa within a sensitive seismotectonic location. Earthquakes are concentrated along the active tectonic boundaries of the African, Eurasian, and Arabian plates. The study area is characterized by northward-increasing sediment thickness, leading to more damage to structures in the north due to multiple reflections of seismic waves. Unfortunately, man-made constructions in Egypt were not designed to resist earthquake ground motions, so it is important to evaluate the seismic hazard to reduce social and economic losses and preserve lives. Probabilistic seismic hazard assessment is used to evaluate the hazard using alternative seismotectonic models within a logic tree framework. Alternative seismotectonic models, magnitude-frequency relations, and various indigenous attenuation relationships were combined within a logic tree formulation to compute and develop the regional exposure on a set of hazard maps. Hazard contour maps are constructed for peak ground acceleration as well as 0.1-, 0.2-, 0.5-, 1-, and 2-s spectral periods for 100- and 475-year return periods for ground motion on rock. The results illustrate that Egypt is characterized by very low to high seismic activity, grading from the western to the eastern part of the country. The uniform hazard spectra are estimated at some important cities distributed all over Egypt. The deaggregation of seismic hazard is estimated at some cities to identify the scenario events that contribute to a selected seismic hazard level. The results of this study can be used for seismic microzonation, risk mitigation, and earthquake engineering purposes.
The Case of the Similar Trees.
ERIC Educational Resources Information Center
Meyer, Rochelle Wilson
1982-01-01
A possible logical flaw based on similar triangles is discussed in connection with the Sherlock Holmes mystery "The Musgrave Ritual." The possible flaw has to do with the need for two trees to have equal growth rates over a 250-year period in order for the solution presented to work. (MP)
NASA Astrophysics Data System (ADS)
Sheehan, T.; Baker, B.; Degagne, R. S.
2015-12-01
With the abundance of data sources, analytical methods, and computer models, land managers are faced with the overwhelming task of making sense of a profusion of data of wildly different types. Luckily, fuzzy logic provides a method to work with different types of data using language-based propositions such as "the landscape is undisturbed," and a simple set of logic constructs. Just as many surveys allow different levels of agreement with a proposition, fuzzy logic allows values reflecting different levels of truth for a proposition. Truth levels fall within a continuum ranging from Fully True to Fully False. Hence a fuzzy logic model produces continuous results. The Environmental Evaluation Modeling System (EEMS) is a platform-independent, tree-based, fuzzy logic modeling framework. An EEMS model provides a transparent definition of an evaluation model and is commonly developed as a collaborative effort among managers, scientists, and GIS experts. Managers specify a set of evaluative propositions used to characterize the landscape. Scientists, working with managers, formulate functions that convert raw data values into truth values for the propositions and produce a logic tree to combine results into a single metric used to guide decisions. Managers, scientists, and GIS experts then work together to implement and iteratively tune the logic model and produce final results. We present examples of two successful EEMS projects that provided managers with map-based results suitable for guiding decisions: sensitivity and climate change exposure in Utah and the Colorado Plateau modeled for the Bureau of Land Management; and terrestrial ecological intactness in the Mojave and Sonoran region of southern California modeled for the Desert Renewable Energy Conservation Plan.
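The convert-then-combine step of a tree-based fuzzy logic model like the one described can be sketched as follows; the propositions, thresholds, and min/max operators here are illustrative assumptions, not EEMS's actual API.

```python
def to_truth(x, lo, hi):
    """Linear conversion of a raw value to a truth level in [0, 1].
    Passing lo > hi reverses the direction of the mapping."""
    if hi == lo:
        raise ValueError("lo and hi must differ")
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def fuzzy_and(*vals):  # conservative: truth of "all propositions hold"
    return min(vals)

def fuzzy_or(*vals):   # optimistic: truth of "any proposition holds"
    return max(vals)

# "the landscape is undisturbed" from road density (km/km^2, lower is truer)
undisturbed = to_truth(0.5, 2.0, 0.0)
# "habitat cover is suitable" from percent cover
suitable = to_truth(60, 20, 80)
# combine up the logic tree into a single metric
intact = fuzzy_and(undisturbed, suitable)
```

Because the inputs are continuous truth levels rather than hard classes, the combined metric varies smoothly across the landscape, which is what makes the map-based results useful for ranking.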
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented about the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
The decision tree approach to classification
NASA Technical Reports Server (NTRS)
Wu, C.; Landgrebe, D. A.; Swain, P. H.
1975-01-01
A class of multistage decision tree classifiers is proposed and studied relative to the classification of multispectral remotely sensed data. The decision tree classifiers are shown to have the potential for improving both the classification accuracy and the computation efficiency. Dimensionality in pattern recognition is discussed and two theorems on the lower bound of logic computation for multiclass classification are derived. The automatic or optimization approach is emphasized. Experimental results on real data are reported, which clearly demonstrate the usefulness of decision tree classifiers.
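A multistage decision tree of the kind described reduces a multiclass decision to a sequence of cheap per-node tests, which is the source of the computational efficiency claim. This toy two-band sketch uses invented thresholds and class names, not those of the paper.

```python
# A minimal two-stage decision tree over 2-band "pixels" (band1, band2).
# Thresholds and class labels are illustrative, not from the study.
def classify(pixel):
    b1, b2 = pixel
    if b1 < 0.3:        # stage 1: separate water from land
        return "water"
    if b2 > 0.6:        # stage 2: separate vegetation from bare soil
        return "vegetation"
    return "soil"

samples = [((0.1, 0.2), "water"),
           ((0.5, 0.8), "vegetation"),
           ((0.7, 0.4), "soil")]
accuracy = sum(classify(x) == y for x, y in samples) / len(samples)
```

Note that each pixel touches at most two thresholds here, versus evaluating a discriminant for every class in a single-stage classifier.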
NASA Astrophysics Data System (ADS)
Fulkerson, David E.
2010-02-01
This paper describes a new methodology for characterizing the electrical behavior and soft error rate (SER) of CMOS and SiGe HBT integrated circuits that are struck by ions. A typical engineering design problem is to calculate the SER of a critical path that commonly includes several circuits such as an input buffer, several logic gates, logic storage, clock tree circuitry, and an output buffer. Using multiple 3D TCAD simulations to solve this problem is too costly and time-consuming for general engineering use. The new and simple methodology handles the problem with ease by simple SPICE simulations. The methodology accurately predicts the measured threshold linear energy transfer (LET) of a bulk CMOS SRAM. It solves for circuit currents and voltage spikes that are close to those predicted by expensive 3D TCAD simulations. It accurately predicts the measured event cross-section vs. LET curve of an experimental SiGe HBT flip-flop. The experimental cross section vs. frequency behavior and other subtle effects are also accurately predicted.
NASA Astrophysics Data System (ADS)
Sanchez-Vila, X.; de Barros, F.; Bolster, D.; Nowak, W.
2010-12-01
Assessing the potential risk of hydro(geo)logical supply systems to human populations is an interdisciplinary field. It relies on the expertise in fields as distant as hydrogeology, medicine, or anthropology, and needs powerful translation concepts to provide decision support and policy making. Reliable health risk estimates need to account for the uncertainties in hydrological, physiological and human behavioral parameters. We propose the use of fault trees to address the task of probabilistic risk analysis (PRA) and to support related management decisions. Fault trees allow decomposing the assessment of health risk into individual manageable modules, thus tackling a complex system by a structural “Divide and Conquer” approach. The complexity within each module can be chosen individually according to data availability, parsimony, relative importance and stage of analysis. The separation in modules allows for a true inter- and multi-disciplinary approach. This presentation highlights the three novel features of our work: (1) we define failure in terms of risk being above a threshold value, whereas previous studies used auxiliary events such as exceedance of critical concentration levels, (2) we plot an integrated fault tree that handles uncertainty in both hydrological and health components in a unified way, and (3) we introduce a new form of stochastic fault tree that allows weakening the assumption of independent subsystems that is required by a classical fault tree approach. We illustrate our concept in a simple groundwater-related setting.
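Under the classical independence assumption that the authors' stochastic fault tree relaxes, module probabilities combine through AND/OR gates as sketched below; the module names and probabilities are illustrative, not taken from the presentation.

```python
# Classical fault-tree gate arithmetic with independent basic events.
def gate_or(*ps):
    """P(at least one child event occurs), independence assumed."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def gate_and(*ps):
    """P(all child events occur), independence assumed."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_hydro  = gate_or(0.02, 0.01)          # hydrogeological module (two pathways)
p_health = gate_and(0.5, 0.3)           # physiological and behavioral factors
p_top    = gate_and(p_hydro, p_health)  # top event: risk above threshold
```

The modular structure means each gate input can be replaced by an arbitrarily detailed sub-model without touching the rest of the tree.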
Safety Study of TCAS II for Logic Version 6.04
1992-07-01
used in the fault tree of the 198... study. The figures given for Logic and Altimetry effects represent the site averages, and were based upon TCAS RAs always being... comparison with the results of Monte Carlo simulations. Five million iterations were carried out for each of the four cases (eqs. 3, 4, 6 and 7).
Submicron Systems Architecture Project
1981-11-01
This project is concerned with the architecture, design, and testing of VLSI systems. The principal activities in this report period include: The Tree Machine; COPE, The Homogeneous Machine; Computational Arrays; Switch-Level Model for MOS Logic Design; Testing; Local Network and Designer Workstations; Self-timed Systems; Characterization of Deadlock-Free Resource Contention; Concurrency Algebra; Language Design and Logic for Program Verification.
Energy-Efficient Wide Datapath Integer Arithmetic Logic Units Using Superconductor Logic
NASA Astrophysics Data System (ADS)
Ayala, Christopher Lawrence
Complementary Metal-Oxide-Semiconductor (CMOS) technology is currently the most widely used integrated circuit technology today. As CMOS approaches the physical limitations of scaling, it is unclear whether or not it can provide long-term support for niche areas such as high-performance computing and telecommunication infrastructure, particularly with the emergence of cloud computing. Alternatively, superconductor technologies based on Josephson junction (JJ) switching elements such as Rapid Single Flux Quantum (RSFQ) logic and especially its new variant, Energy-Efficient Rapid Single Flux Quantum (ERSFQ) logic have the capability to provide an ultra-high-speed, low power platform for digital systems. The objective of this research is to design and evaluate energy-efficient, high-speed 32-bit integer Arithmetic Logic Units (ALUs) implemented using RSFQ and ERSFQ logic as the first steps towards achieving practical Very-Large-Scale-Integration (VLSI) complexity in digital superconductor electronics. First, a tunable VHDL superconductor cell library is created to provide a mechanism to conduct design exploration and evaluation of superconductor digital circuits from the perspectives of functionality, complexity, performance, and energy-efficiency. Second, hybrid wave-pipelining techniques developed earlier for wide datapath RSFQ designs have been used for efficient arithmetic and logic circuit implementations. To develop the core foundation of the ALU, the ripple-carry adder and the Kogge-Stone parallel prefix carry look-ahead adder are studied as representative candidates on opposite ends of the design spectrum. By combining the high-performance features of the Kogge-Stone structure and the low complexity of the ripple-carry adder, a 32-bit asynchronous wave-pipelined hybrid sparse-tree ALU has been designed and evaluated using the VHDL cell library tuned to HYPRES' gate-level characteristics. 
The designs and techniques from this research have been implemented using RSFQ logic and prototype chips have been fabricated. As a joint work with HYPRES, a 20 GHz 8-bit Kogge-Stone ALU consisting of 7,950 JJs total has been fabricated using a 1.5 μm 4.5 kA/cm2 process and fully demonstrated. An 8-bit sparse-tree ALU (8,832 JJs total) and a 16-bit sparse-tree adder (12,785 JJs total) have also been fabricated using a 1.0 μm 10 kA/cm2 process and demonstrated under collaboration with Yokohama National University and Nagoya University (Japan).
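The Kogge-Stone structure referenced above computes all carries in log2(n) parallel prefix stages from per-bit generate/propagate signals. This Python sketch mirrors that dataflow at the bit level; the 8-bit width and test operands are arbitrary, and the hardware's wave-pipelining is not modeled.

```python
# Software sketch of a Kogge-Stone carry look-ahead adder (modulo 2**n).
def kogge_stone_add(a, b, n=8):
    g = [(a >> i & 1) & (b >> i & 1) for i in range(n)]   # generate bits
    p = [(a >> i & 1) ^ (b >> i & 1) for i in range(n)]   # propagate bits
    G, P = g[:], p[:]
    d = 1
    while d < n:  # log2(n) prefix stages, each doubling the combined span
        G = [G[i] | (P[i] & G[i - d]) if i >= d else G[i] for i in range(n)]
        P = [P[i] & P[i - d] if i >= d else P[i] for i in range(n)]
        d *= 2
    carries = [0] + G[:n - 1]        # carry into bit i is group-generate of bits 0..i-1
    s = 0
    for i in range(n):
        s |= (p[i] ^ carries[i]) << i
    return s
```

The sparse-tree hybrid in the abstract prunes this prefix network, trading a little carry latency for far fewer junctions, which is why it sits between the ripple-carry and full Kogge-Stone designs.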
Parallel Adaptive Mesh Refinement Library
NASA Technical Reports Server (NTRS)
Mac-Neice, Peter; Olson, Kevin
2005-01-01
Parallel Adaptive Mesh Refinement Library (PARAMESH) is a package of Fortran 90 subroutines designed to provide a computer programmer with an easy route to extension of (1) a previously written serial code that uses a logically Cartesian structured mesh into (2) a parallel code with adaptive mesh refinement (AMR). Alternatively, in its simplest use, and with minimal effort, PARAMESH can operate as a domain-decomposition tool for users who want to parallelize their serial codes but who do not wish to utilize adaptivity. The package builds a hierarchy of sub-grids to cover the computational domain of a given application program, with spatial resolution varying to satisfy the demands of the application. The sub-grid blocks form the nodes of a tree data structure (a quad-tree in two or an oct-tree in three dimensions). Each grid block has a logically Cartesian mesh. The package supports one-, two- and three-dimensional models.
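The quad-tree block hierarchy that PARAMESH builds can be sketched minimally as follows; the refinement criterion, depth limit, and class layout here are invented for illustration and are not PARAMESH's Fortran 90 interface.

```python
# Minimal quad-tree refinement sketch: each block covers a square region and
# splits into four equal children when a refinement test fires.
class Block:
    def __init__(self, x, y, size, level=0):
        self.x, self.y, self.size, self.level = x, y, size, level
        self.children = []

    def refine(self, needs_refinement, max_level=3):
        if self.level < max_level and needs_refinement(self):
            h = self.size / 2
            self.children = [Block(self.x,     self.y,     h, self.level + 1),
                             Block(self.x + h, self.y,     h, self.level + 1),
                             Block(self.x,     self.y + h, h, self.level + 1),
                             Block(self.x + h, self.y + h, h, self.level + 1)]
            for c in self.children:
                c.refine(needs_refinement, max_level)

    def leaves(self):
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]

root = Block(0.0, 0.0, 1.0)
# illustrative criterion: refine only blocks touching the origin corner
root.refine(lambda b: b.x == 0.0 and b.y == 0.0, max_level=2)
```

The leaf blocks play the role of PARAMESH's sub-grid blocks: each carries its own logically Cartesian mesh, and resolution is high only where the criterion demanded it.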
Deciding Full Branching Time Logic by Program Transformation
NASA Astrophysics Data System (ADS)
Pettorossi, Alberto; Proietti, Maurizio; Senni, Valerio
We present a method, based on logic program transformation, for verifying Computation Tree Logic (CTL*) properties of finite state reactive systems. The finite state systems and the CTL* properties we want to verify are encoded as logic programs on infinite lists. Our verification method consists of two steps. In the first step we transform the logic program that encodes the given system and the given property into a monadic ω-program, that is, a stratified program defining nullary or unary predicates on infinite lists. This transformation is performed by applying unfold/fold rules that preserve the perfect model of the initial program. In the second step we verify the property of interest by using a proof method for monadic ω-programs.
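One of the simplest CTL* fragments, EF p ("some reachable state satisfies p"), is a least-fixpoint computation. The sketch below evaluates it directly on an explicit state graph rather than via the paper's program-transformation route; the Kripke structure is invented.

```python
# Least-fixpoint computation of EF(p) on an explicit transition system.
def ef(states, trans, sat_p):
    """Return the states satisfying EF p: the p-states plus, repeatedly,
    any state with a successor already in the set, until nothing changes."""
    result = set(sat_p)
    changed = True
    while changed:
        changed = False
        for s in states:
            if s not in result and any(t in result for t in trans.get(s, [])):
                result.add(s)
                changed = True
    return result

states = {"s0", "s1", "s2", "s3"}
trans = {"s0": ["s1"], "s1": ["s2"], "s2": ["s2"], "s3": ["s3"]}
reach_p = ef(states, trans, {"s2"})   # s3 cannot reach s2
```

Other CTL operators (AF, EG, ...) follow the same pattern with different fixpoints, which is what any model-checking back end, transformed logic program or otherwise, ultimately computes.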
A logic-based method for integer programming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hooker, J.; Natraj, N.R.
1994-12-31
We propose a logic-based approach to integer programming that replaces traditional branch-and-cut techniques with logical analogs. Integer variables are regarded as atomic propositions. The constraints give rise to logical formulas that are analogous to separating cuts. No continuous relaxation is used. Rather, the cuts are selected so that they can be easily solved as a discrete relaxation. (In fact, defining a relaxation and generating cuts are best seen as the same problem.) We experiment with relaxations that have a k-tree structure and can be solved by nonserial dynamic programming. We also present logic-based analogs of facet-defining cuts, Chvátal rank, etc. We conclude with some preliminary computational results.
The role of West Virginia's division of forestry
Asher W. Kelly
1980-01-01
Trees are best suited for reclaiming stripped areas with a valuable product. Wildlife and forestry considerations should be the concern of operators, landowners, foresters, and wildlife biologists with an objective of returning the disturbed land to its most capable productivity. In West Virginia, trees are the most logical and realistic product of the land.
Planning effectiveness may grow on fault trees.
Chow, C W; Haddad, K; Mannino, B
1991-10-01
The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
Risk-Based Prioritization of Research for Aviation Security Using Logic-Evolved Decision Analysis
NASA Technical Reports Server (NTRS)
Eisenhawer, S. W.; Bott, T. F.; Sorokach, M. R.; Jones, F. P.; Foggia, J. R.
2004-01-01
The National Aeronautics and Space Administration is developing advanced technologies to reduce terrorist risk for the air transportation system. Decision support tools are needed to help allocate assets to the most promising research. An approach to rank ordering technologies (using logic-evolved decision analysis), with risk reduction as the metric, is presented. The development of a spanning set of scenarios using a logic-gate tree is described. Baseline risk for these scenarios is evaluated with an approximate reasoning model. Illustrative risk and risk reduction results are presented.
Parodi, Stefano; Manneschi, Chiara; Verda, Damiano; Ferrari, Enrico; Muselli, Marco
2018-03-01
This study evaluates the performance of a set of machine learning techniques in predicting the prognosis of Hodgkin's lymphoma using clinical factors and gene expression data. Analysed samples from 130 Hodgkin's lymphoma patients included a small set of clinical variables and more than 54,000 gene features. Machine learning classifiers included three black-box algorithms (k-nearest neighbour, Artificial Neural Network, and Support Vector Machine) and two methods based on intelligible rules (Decision Tree and the innovative Logic Learning Machine method). Support Vector Machine clearly outperformed any of the other methods. Of the two rule-based algorithms, Logic Learning Machine performed better and identified a set of simple intelligible rules based on a combination of clinical variables and gene expressions. Decision Tree identified a non-coding gene (XIST) involved in the early phases of X chromosome inactivation that was overexpressed in females and in non-relapsed patients. XIST expression might be responsible for the better prognosis of female Hodgkin's lymphoma patients.
A three-sided rearrangeable switching network for a binary fat tree
NASA Astrophysics Data System (ADS)
Yen, Mao-Hsu; Yu, Chu; Shin, Haw-Yun; Chen, Sao-Jie
2011-06-01
A binary fat tree needs an internal node to interconnect the left-children, right-children and parent terminals to each other. In this article, we first propose a three-stage, 3-sided rearrangeable switching network for the implementation of a binary fat tree. The main component of this 3-sided switching network (3SSN) consists of a polygonal switch block (PSB) interconnected by crossbars. With the same size and the same number of switches as our 3SSN, a three-stage, 3-sided clique-based switching network is shown to be not rearrangeable. Also, the effects of the rearrangeable structure and the number of terminals on the network switch-efficiency are explored and a proper set of parameters has been determined to minimise the number of switches. We derive that a rearrangeable 3-sided switching network with switches proportional to N^(3/2) is most suitable to interconnect N terminals. Moreover, we propose a new Polygonal Field Programmable Gate Array (PFPGA) that consists of logic blocks interconnected by our 3SSN, such that the logic blocks in this PFPGA can be grouped into clusters to implement different logic functions. Since the programmable switches usually have high resistance and capacitance and occupy a large area, we have to consider the effect of the 3SSN structure and the granularity of its cluster logic blocks on the switch efficiency of PFPGA. Experiments on benchmark circuits show that the switch and speed performances are significantly improved. Based on the experimental results, we can determine the parameters of PFPGA for the VLSI implementation.
Dokas, Ioannis M; Panagiotakopoulos, Demetrios C
2006-08-01
The available expertise on managing and operating solid waste management (SWM) facilities varies among countries and among types of facilities. Few experts are willing to record their experience, while few researchers systematically investigate the chains of events that could trigger operational failures in a facility; expertise acquisition and dissemination, in SWM, is neither popular nor easy, despite the great need for it. This paper presents a knowledge acquisition process aimed at capturing, codifying and expanding reliable expertise and propagating it to non-experts. The knowledge engineer (KE), the person performing the acquisition, must identify the events (or causes) that could trigger a failure, determine whether a specific event could trigger more than one failure, and establish how various events are related among themselves and how they are linked to specific operational problems. The proposed process, which utilizes logic diagrams (fault trees) widely used in system safety and reliability analyses, was used for the analysis of 24 common landfill operational problems. The acquired knowledge led to the development of a web-based expert system (Landfill Operation Management Advisor, http://loma.civil.duth.gr), which estimates the occurrence possibility of operational problems, provides advice and suggests solutions.
ERIC Educational Resources Information Center
MacNaughton, Glenda
2004-01-01
This paper engages with questions of logic and its politics to explore how those of us in early childhood education can become critical consumers of "brain research". The research truths we use to construct classroom practices decide the meanings of our actions, thoughts and feelings and our interactions with children. Following Foucault (1980), I…
The ''Coconut Tree'' Model of Careers: The Case of French Academia
ERIC Educational Resources Information Center
Altman, Yochanan; Bournois, Frank
2004-01-01
This research note sets out to explain the main features of the French university academic career--the ''coconut tree,'' as it is colloquially known, setting it firmly within a social and cultural context; outlining the logic and functions of career stages, explaining its rituals and conventions, its rewards and pitfalls. These are narrated by two…
An Overview of the Runtime Verification Tool Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2002-01-01
We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
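A monitor for a simple response property, in the spirit of JPAX's temporal checking of an emitted event stream, can be sketched as a small automaton over the trace. The event names and the property ("every request is eventually acknowledged") are illustrative; this is not JPAX's Maude-based machinery or API.

```python
# Finite-trace monitor for the response property request -> eventually ack.
def monitor(trace):
    pending = 0
    for event in trace:
        if event == "request":
            pending += 1
        elif event == "ack" and pending > 0:
            pending -= 1
    # True iff no request is left unanswered when the trace ends
    return pending == 0

ok = monitor(["request", "work", "ack", "request", "ack"])
bad = monitor(["request", "work"])
```

In a runtime-verification setting, the instrumented program emits the trace as it executes and the observer advances such monitors incrementally, flagging violations as they occur.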
NASA Astrophysics Data System (ADS)
Rodak, C. M.; McHugh, R.; Wei, X.
2016-12-01
The development and combination of horizontal drilling and hydraulic fracturing has unlocked unconventional hydrocarbon reserves around the globe. These advances have triggered a number of concerns regarding aquifer contamination and over-exploitation, leading to scientific studies investigating potential risks posed by directional hydraulic fracturing activities. These studies, balanced with potential economic benefits of energy production, are a crucial source of information for communities considering the development of unconventional reservoirs. However, probabilistic quantification of the overall risk posed by hydraulic fracturing at the system level is rare. Here we present the concept of fault tree analysis to determine the overall probability of groundwater contamination or over-exploitation, broadly referred to as the probability of failure. The potential utility of fault tree analysis for the quantification and communication of risks is approached with a general application. However, the fault tree design is robust and can handle various combinations of region-specific data pertaining to relevant spatial scales, geological conditions, and industry practices where available. All available data are grouped into quantity and quality-based impacts and sub-divided based on the stage of the hydraulic fracturing process in which the data is relevant as described by the USEPA. Each stage is broken down into the unique basic events required for failure; for example, to quantify the risk of an on-site spill we must consider the likelihood, magnitude, composition, and subsurface transport of the spill. The structure of the fault tree described above can be used to render a highly complex system of variables into a straightforward equation for risk calculation based on Boolean logic. This project shows the utility of fault tree analysis for the visual communication of the potential risks of hydraulic fracturing activities on groundwater resources.
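The Boolean-logic reduction of such a fault tree yields minimal cut sets, the smallest combinations of basic events that trigger the top event. The tiny tree below (contamination via a surface spill that defeats a barrier, or via a well leak) is an invented example, not the study's actual tree.

```python
# Top-down minimal cut set expansion for a small AND/OR fault tree.
# Nodes are tuples: ("basic", name, []) or (gate, label, [children]).
def cut_sets(node):
    kind = node[0]
    if kind == "basic":
        return [frozenset([node[1]])]
    children = [cut_sets(c) for c in node[2]]
    if kind == "or":                       # union of the children's cut sets
        out = [cs for group in children for cs in group]
    else:                                  # "and": cross-product of cut sets
        out = [frozenset()]
        for group in children:
            out = [a | b for a in out for b in group]
    # discard subsumed (non-minimal) sets
    return [c for c in out if not any(o < c for o in out)]

spill   = ("basic", "spill", [])
barrier = ("basic", "barrier_fails", [])
well    = ("basic", "well_leak", [])
tree = ("or", "contamination",
        [("and", "surface_path", [spill, barrier]), well])
mcs = cut_sets(tree)
```

With independent basic-event probabilities attached, each cut set's probability follows from the AND rule, and the top-event probability from combining the cut sets, which is the "straightforward equation" the abstract refers to.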
Reliability analysis of the solar array based on Fault Tree Analysis
NASA Astrophysics Data System (ADS)
Jianing, Wu; Shaoze, Yan
2011-07-01
The solar array is an important device used in the spacecraft, which influences the quality of in-orbit operation of the spacecraft and even the launches. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system based on the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structure importance (SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of the fault, so limiting damage is significant to prevent faults. Furthermore, recommendations for improving reliability associated with damage limitation are discussed, which can be used for the redesigning of the solar array and the reliability growth planning.
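Structure importance of the kind analyzed for the hinge can be computed by counting the state vectors of the other basic events in which a given event is critical to the top event. The three-event top logic below is an invented stand-in for the DFH-3 fault tree.

```python
from itertools import product

# Illustrative top-event logic: the system fails if the hinge fails,
# or if both the locking spring and the seal fail.
def top(hinge, spring, seal):
    return hinge or (spring and seal)

def structural_importance(f, index, n=3):
    """Fraction of the other events' state vectors where event `index`
    is critical, i.e., flipping it flips the top event (Birnbaum SI)."""
    critical = 0
    others = list(product([0, 1], repeat=n - 1))
    for vec in others:
        hi = list(vec); hi.insert(index, 1)
        lo = list(vec); lo.insert(index, 0)
        if f(*hi) != f(*lo):
            critical += 1
    return critical / len(others)

si_hinge = structural_importance(top, 0)   # hinge dominates, as in the paper's finding
```

A higher SI flags the component whose state most often decides the top event, which is where redesign effort pays off first.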
NASA Technical Reports Server (NTRS)
English, Thomas
2005-01-01
A standard tool of reliability analysis used at NASA-JSC is the event tree. An event tree is simply a probability tree, with the probabilities determining the next step through the tree specified at each node. The nodal probabilities are determined by a reliability study of the physical system at work for a particular node. The reliability study performed at a node is typically referred to as a fault tree analysis, with the potential of a fault tree existing for each node on the event tree. When examining an event tree it is obvious why the event tree/fault tree approach has been adopted. Typical event trees are quite complex in nature, and the event tree/fault tree approach provides a systematic and organized approach to reliability analysis. The purpose of this study was twofold. Firstly, we wanted to explore the possibility that a semi-Markov process can create dependencies between sojourn times (the times it takes to transition from one state to the next) that can decrease the uncertainty when estimating times to failure. Using a generalized semi-Markov model, we studied a four element reliability model and were able to demonstrate such sojourn time dependencies. Secondly, we wanted to study the use of semi-Markov processes to introduce a time variable into the event tree diagrams that are commonly developed in PRA (Probabilistic Risk Assessment) analyses. Event tree end states which change with time are more representative of failure scenarios than are the usual static probability-derived end states.
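A small semi-Markov reliability model of the sort described can be sketched by sampling state-dependent sojourn times along a path to failure. The two-state chain and exponential sojourns below are invented, chosen so the mean time to failure is analytically 2.0.

```python
import random

# Illustrative semi-Markov chain: ok -> degraded -> failed, with an
# exponential sojourn (rate per transition) spent in each visited state.
TRANS = {"ok": [("degraded", 1.0)],
         "degraded": [("failed", 1.0)]}

def time_to_failure(rng, trans=TRANS):
    state, t = "ok", 0.0
    while state != "failed":
        nxt, rate = rng.choice(trans[state])
        t += rng.expovariate(rate)   # sojourn time in the current state
        state = nxt
    return t

rng = random.Random(42)
times = [time_to_failure(rng) for _ in range(5000)]
mean_ttf = sum(times) / len(times)   # analytically 1/1.0 + 1/1.0 = 2.0
```

Non-exponential sojourn distributions, or sojourn times that depend on the path taken, turn this into the generalized semi-Markov setting the study explores.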
High performance static latches with complete single event upset immunity
Corbett, Wayne T.; Weaver, Harry T.
1994-01-01
An asymmetric response latch providing immunity to single event upset without loss of speed. The latch has cross-coupled inverters having a hardened logic state and a soft state, wherein the logic state of the first inverter can only be changed when the voltage on the coupling node of that inverter is low and the logic state of the second inverter can only be changed when the voltage on the coupling node of that inverter is high. One or more of the asymmetric response latches may be configured into a memory cell having complete immunity, which protects information rather than logic states.
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.; Basuray, Amitabha
2008-11-01
The memory devices in multi-valued logic are of most significance in modern research. This paper deals with the implementation of basic memory devices in multi-valued logic using Savart plate and spatial light modulator (SLM) based optoelectronic circuits. Photons are used here as the carrier to speed up the operations. Optical tree architecture (OTA) has been also utilized in the optical interconnection network. We have exploited the advantages of Savart plates, SLMs and OTA and proposed the SLM based high speed JK, D-type and T-type flip-flops in a trinary system.
Conversion of Questionnaire Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
During the survey, respondents are asked to provide qualitative answers (well, adequate, needs improvement) on how well material control and accountability (MC&A) functions are being performed. These responses can be used to develop failure probabilities for basic events performed during routine operation of the MC&A systems. The failure frequencies for individual events may be used to estimate total system effectiveness using a fault tree in a probabilistic risk analysis (PRA). Numeric risk values are required for the PRA fault tree calculations that are performed to evaluate system effectiveness. So, the performance ratings in the questionnaire must be converted to relative risk values for all of the basic MC&A tasks performed in the facility. If a specific material protection, control, and accountability (MPC&A) task is being performed at the 'perfect' level, the task is considered to have a near zero risk of failure. If the task is performed at a less than perfect level, the deficiency in performance represents some risk of failure for the event. As the degree of deficiency in performance increases, the risk of failure increases. If a task that should be performed is not being performed, that task is in a state of failure. The failure probabilities of all basic events contribute to the total system risk. Conversion of questionnaire MPC&A system performance data to numeric values is a separate function from the process of completing the questionnaire. When specific questions in the questionnaire are answered, the focus is on correctly assessing and reporting, in an adjectival manner, the actual performance of the related MC&A function. Prior to conversion, consideration should not be given to the numeric value that will be assigned during the conversion process. In the conversion process, adjectival responses to questions on system performance are quantified based on a log-normal scale typically used in human error analysis (see A.D. Swain and H.E.
Guttmann, 'Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications,' NUREG/CR-1278). This conversion produces the basic event risk of failure values required for the fault tree calculations. The fault tree is a deductive logic structure that corresponds to the operational nuclear MC&A system at a nuclear facility. The conventional Delphi process is a time-honored approach commonly used in the risk assessment field to extract numerical values for the failure rates of actions or activities when statistically significant data is absent.
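The conversion step can be sketched as a lookup from adjectival ratings to failure probabilities spaced on a logarithmic scale; the specific values below are invented placeholders, not the NUREG/CR-1278 figures.

```python
# Hypothetical adjectival-rating -> basic-event failure probability table,
# log-spaced in the spirit of human-reliability scales (values illustrative).
RATING_TO_PFAIL = {
    "perfect": 1e-4,
    "well": 1e-3,
    "adequate": 1e-2,
    "needs improvement": 1e-1,
    "not performed": 1.0,
}

def convert(responses):
    """Map each surveyed MC&A task's adjectival rating to a failure probability
    suitable for use as a basic-event value in the PRA fault tree."""
    return {task: RATING_TO_PFAIL[rating] for task, rating in responses.items()}

pfail = convert({"inventory check": "well",
                 "seal verification": "needs improvement"})
```

The resulting per-task probabilities feed directly into the fault tree's basic events, keeping the subjective assessment step cleanly separated from the numeric risk calculation.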
Black, Dolores Archuleta; Robinson, William H.; Wilcox, Ian Zachary; ...
2015-08-07
Single event effects (SEE) are a reliability concern for modern microelectronics. Bit corruptions can be caused by single event upsets (SEUs) in the storage cells or by sampling single event transients (SETs) from a logic path. Likewise, an accurate prediction of soft error susceptibility from SETs requires good models to convert collected charge into compact descriptions of the current injection process. This paper describes a simple, yet effective, method to model the current waveform resulting from a charge collection event for SET circuit simulations. The model uses two double-exponential current sources in parallel, and the results illustrate why a conventional model based on one double-exponential source can be incomplete. Furthermore, a small set of logic cells with varying input conditions, drive strength, and output loading are simulated to extract the parameters for the dual double-exponential current sources. As a result, the parameters are based upon both the node capacitance and the restoring current (i.e., drive strength) of the logic cell.
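The dual double-exponential source can be written as the sum of two double-exponential terms with different time constants, a fast prompt-collection term plus a slower diffusion-like term. The amplitudes and time constants below are invented for illustration, not extracted parameters from the paper.

```python
import math

def double_exp(t, i0, tau_rise, tau_fall):
    """Single double-exponential current pulse (amperes), zero for t < 0."""
    if t < 0:
        return 0.0
    return i0 * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

def set_current(t):
    """Dual double-exponential SET model: two sources in parallel
    (illustrative amplitudes and time constants)."""
    return (double_exp(t, 1.2e-3, 5e-12, 50e-12)     # fast prompt term
            + double_exp(t, 2.0e-4, 50e-12, 500e-12))  # slower diffusion term

# crude peak search over 0..2 ns in 1 ps steps
peak = max(set_current(t * 1e-12) for t in range(0, 2000))
```

In a SPICE deck the same waveform would drive the struck node as a piecewise or exponential current source; the second term is what lets the model reproduce the long tail a single double-exponential misses.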
Tiger in the fault tree jungle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, P.
1976-01-01
There is yet little evidence of serious efforts to apply formal reliability analysis methods to evaluate, or even to identify, potential common-mode failures (CMF) of reactor safeguard systems. The prospects for event logic modeling in this regard are examined by the primitive device of reviewing actual CMF experience in terms of what the analyst might have perceived a priori. Further insights into the probability and risk aspects of CMFs are sought through consideration of three key likelihood factors: (1) the prior probability of the cause ever existing, (2) the opportunities for removing the cause, and (3) the probability that a CMF cause will be activated by conditions associated with a real system challenge. It was concluded that the principal needs for formal logical discipline in the endeavor to decrease CMF-related risks are to discover and to account for strong "energetic" dependency couplings that could arise in the major accidents usually classed as "hypothetical." This application would help focus research, design, and quality assurance efforts to cope with major CMF causes. But without extraordinary challenges to the reactor safeguard systems, there must continue to be virtually no statistical evidence pertinent to that class of failure dependencies.
Groundwater Circulating Well Assessment and Guidance
1998-04-03
3.1 Decision Tree and Process Description ... two GCW systems placed close enough to affect each other significantly (Herding et al., 1994). This type of well spacing may be required to ... The process for screening the GCW technology is a logical sequence of steps during which site-specific
Data mining for multiagent rules, strategies, and fuzzy decision tree structure
NASA Astrophysics Data System (ADS)
Smith, James F., III; Rhyne, Robert D., II; Fisher, Kristin
2002-03-01
A fuzzy logic based resource manager (RM) has been developed that automatically allocates electronic attack resources in real-time over many dissimilar platforms. Two different data mining algorithms have been developed to determine rules, strategies, and fuzzy decision tree structure. The first data mining algorithm uses a genetic algorithm as a data mining function and is called from an electronic game. The game allows a human expert to play against the resource manager in a simulated battlespace with each of the defending platforms being exclusively directed by the fuzzy resource manager and the attacking platforms being controlled by the human expert or operating autonomously under their own logic. This approach automates the data mining problem. The game automatically creates a database reflecting the domain expert's knowledge. It calls a data mining function, a genetic algorithm, for data mining of the database as required and allows easy evaluation of the information mined in the second step. The criterion for re-optimization is discussed as well as experimental results. Then a second data mining algorithm that uses a genetic program as a data mining function is introduced to automatically discover fuzzy decision tree structures. Finally, a fuzzy decision tree generated through this process is discussed.
Jeagle: a JAVA Runtime Verification Tool
NASA Technical Reports Server (NTRS)
DAmorim, Marcelo; Havelund, Klaus
2005-01-01
We introduce the temporal logic Jeagle and its supporting tool for runtime verification of Java programs. A monitor for a Jeagle formula checks whether a finite trace of program events satisfies the formula. Jeagle is a programming-oriented extension of the powerful rule-based Eagle logic, which has been shown to be capable of defining and implementing a range of finite-trace monitoring logics, including future and past time temporal logic, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace. Jeagle extends Eagle with constructs for capturing parameterized program events such as method calls and method returns. Parameters can be the objects that methods are called upon, arguments to methods, and return values. Jeagle allows one to refer to these in formulas. The tool performs automated program instrumentation using AspectJ. We show the transformational semantics of Jeagle.
Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil
NASA Astrophysics Data System (ADS)
de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.
2018-05-01
A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.
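The weighted-branch arithmetic behind such a logic tree can be sketched in a few lines: each branch pairs a hazard result with a decimal weight, the weights sum to one, and the mean hazard is the weighted average across branches. The rates and weights below are invented for illustration and do not come from the Angra dos Reis study.

```python
# Minimal sketch of logic-tree weighting in PSHA. Each branch carries an
# annual exceedance rate for one ground-motion intensity level, together
# with its epistemic weight.

def weighted_mean_hazard(branches):
    """branches: list of (annual_exceedance_rate, weight) tuples."""
    total_w = sum(w for _, w in branches)
    assert abs(total_w - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(rate * w for rate, w in branches)

# Three ground-motion-model branches for one intensity level (illustrative).
branches = [(1.2e-4, 0.5), (8.0e-5, 0.3), (2.0e-4, 0.2)]
mean_rate = weighted_mean_hazard(branches)
```

Repeating this over all intensity levels yields the weighted-mean hazard curve; keeping the per-branch curves instead gives the fractile curves that express the epistemic spread.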
High performance static latches with complete single event upset immunity
Corbett, W.T.; Weaver, H.T.
1994-04-26
An asymmetric response latch providing immunity to single event upset without loss of speed is described. The latch has cross-coupled inverters having a hardened logic state and a soft state, wherein the logic state of the first inverter can only be changed when the voltage on the coupling node of that inverter is low, and the logic state of the second inverter can only be changed when the voltage on the coupling node of that inverter is high. One or more of the asymmetric response latches may be configured into a memory cell having complete immunity, which protects information rather than logic states. 5 figures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Kesheng
2007-08-02
An index in a database system is a data structure that utilizes redundant information about the base data to speed up common searching and retrieval operations. Most commonly used indexes are variants of B-trees, such as B+-tree and B*-tree. FastBit implements a set of alternative indexes called compressed bitmap indexes. Compared with B-tree variants, these indexes provide very efficient searching and retrieval operations by sacrificing the efficiency of updating the indexes after the modification of an individual record. In addition to the well-known strengths of bitmap indexes, FastBit has a special strength stemming from the bitmap compression scheme used. The compression method is called the Word-Aligned Hybrid (WAH) code. It reduces the bitmap indexes to reasonable sizes and at the same time allows very efficient bitwise logical operations directly on the compressed bitmaps. Compared with the well-known compression methods such as LZ77 and Byte-aligned Bitmap code (BBC), WAH sacrifices some space efficiency for a significant improvement in operational efficiency. Since the bitwise logical operations are the most important operations needed to answer queries, using WAH compression has been shown to answer queries significantly faster than using other compression schemes. Theoretical analyses showed that WAH compressed bitmap indexes are optimal for one-dimensional range queries. Only the most efficient indexing schemes such as B+-tree and B*-tree have this optimality property. However, bitmap indexes are superior because they can efficiently answer multi-dimensional range queries by combining the answers to one-dimensional queries.
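The query-answering idea, one bitmap per distinct value combined with bitwise operations, can be sketched with an uncompressed toy index; FastBit's actual implementation runs the same logical operations directly on WAH-compressed words.

```python
# Toy bitmap index sketch (uncompressed). Python ints serve as bit vectors:
# bit r of the bitmap for value v is set iff row r holds value v.
class BitmapIndex:
    def __init__(self, values):
        self.n = len(values)
        self.bitmaps = {}
        for row, v in enumerate(values):
            self.bitmaps[v] = self.bitmaps.get(v, 0) | (1 << row)

    def range_query(self, lo, hi):
        """Rows with lo <= value <= hi, via bitwise OR of value bitmaps."""
        result = 0
        for v, bm in self.bitmaps.items():
            if lo <= v <= hi:
                result |= bm
        return [r for r in range(self.n) if result >> r & 1]

idx = BitmapIndex([3, 1, 4, 1, 5, 9, 2, 6])
rows = idx.range_query(2, 5)   # rows whose value lies in [2, 5]
```

A multi-dimensional range query is then just a bitwise AND of the per-dimension result bitmaps, which is why bitmap indexes handle such queries so efficiently.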
Estimating Single-Event Logic Cross Sections in Advanced Technologies
NASA Astrophysics Data System (ADS)
Harrington, R. C.; Kauppila, J. S.; Warren, K. M.; Chen, Y. P.; Maharrey, J. A.; Haeffner, T. D.; Loveless, T. D.; Bhuva, B. L.; Bounasser, M.; Lilja, K.; Massengill, L. W.
2017-08-01
Reliable estimation of logic single-event upset (SEU) cross section is becoming increasingly important for predicting the overall soft error rate. As technology scales and single-event transient (SET) pulse widths shrink to widths on the order of the setup-and-hold time of flip-flops, the probability of latching an SET as an SEU must be reevaluated. In this paper, previous assumptions about the relationship of SET pulse width to the probability of latching an SET are reconsidered and a model for transient latching probability has been developed for advanced technologies. A method using the improved transient latching probability and SET data is used to predict logic SEU cross section. The presented model has been used to estimate combinational logic SEU cross sections in 32-nm partially depleted silicon-on-insulator (SOI) technology given experimental heavy-ion SET data. Experimental SEU data show good agreement with the model presented in this paper.
Defect-sensitivity analysis of an SEU immune CMOS logic family
NASA Technical Reports Server (NTRS)
Ingermann, Erik H.; Frenzel, James F.
1992-01-01
Fault testing of resistive manufacturing defects is done on a recently developed single event upset immune logic family. Resistive ranges and delay times are compared with those of traditional CMOS logic. Reaction of the logic to these defects is observed for a NOR gate, and an evaluation of its ability to cope with them is determined.
Health Management Applications for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Duncavage, Dan
2005-01-01
Traditional mission and vehicle management involves teams of highly trained specialists monitoring vehicle status and crew activities, responding rapidly to any anomalies encountered during operations. These teams work from the Mission Control Center and have access to engineering support teams with specialized expertise in International Space Station (ISS) subsystems. Integrated System Health Management (ISHM) applications can significantly augment these capabilities by providing enhanced monitoring, prognostic and diagnostic tools for critical decision support and mission management. The Intelligent Systems Division of NASA Ames Research Center is developing many prototype applications using model-based reasoning, data mining and simulation, working with Mission Control through the ISHM Testbed and Prototypes Project. This paper will briefly describe information technology that supports current mission management practice, and will extend this to a vision for future mission control workflow incorporating new ISHM applications. It will describe ISHM applications currently under development at NASA and will define technical approaches for implementing our vision of future human exploration mission management incorporating artificial intelligence and distributed web service architectures using specific examples. Several prototypes are under development, each highlighting a different computational approach. The ISStrider application allows in-depth analysis of Caution and Warning (C&W) events by correlating real-time telemetry with the logical fault trees used to define off-nominal events. The application uses live telemetry data and the Livingstone diagnostic inference engine to display the specific parameters and fault trees that generated the C&W event, allowing a flight controller to identify the root cause of the event from thousands of possibilities by simply navigating animated fault tree models on their workstation. 
SimStation models the functional power flow for the ISS Electrical Power System and can predict power balance for nominal and off-nominal conditions. SimStation uses real-time telemetry data to keep detailed computational physics models synchronized with the actual ISS power system state. In the event of failure, the application can then rapidly diagnose root cause, predict future resource levels, and even correlate technical documents relevant to the specific failure. These advanced computational models will allow better insight and more precise control of ISS subsystems, increasing safety margins by speeding up anomaly resolution and reducing engineering team effort and cost. This technology will make operating ISS more efficient and is directly applicable to next-generation exploration missions and Crew Exploration Vehicles.
Risk-informed Maintenance for Non-coherent Systems
NASA Astrophysics Data System (ADS)
Tao, Ye
Probabilistic Safety Assessment (PSA) is a systematic and comprehensive methodology to evaluate risks associated with a complex engineered technological entity. The information provided by PSA has been increasingly implemented for regulatory purposes but rarely used in providing information for operation and maintenance activities. As one of the key parts in PSA, Fault Tree Analysis (FTA) attempts to model and analyze failure processes of engineering and biological systems. The fault trees are composed of logic diagrams that display the state of the system and are constructed using graphical design techniques. Risk Importance Measures (RIMs) are information that can be obtained from both qualitative and quantitative aspects of FTA. Components within a system can be ranked with respect to each specific criterion defined by each RIM. Through a RIM, a ranking of the components or basic events can be obtained and provide valuable information for risk-informed decision making. Various RIMs have been applied in various applications. In order to provide a thorough understanding of RIMs and interpret the results, they are categorized with respect to risk significance (RS) and safety significance (SS) in this thesis. This has also tied them into different maintenance activities. When RIMs are used for maintenance purposes, it is called risk-informed maintenance. On the other hand, the majority of work produced on the FTA method has been concentrated on failure logic diagrams restricted to the direct or implied use of AND and OR operators. Such systems are considered as coherent systems. However, the NOT logic can also contribute to the information produced by PSA. The importance analysis of non-coherent systems is rather limited, even though the field has received more and more attention over the years. The non-coherent systems introduce difficulties in both qualitative and quantitative assessment of the fault tree compared with the coherent systems. 
In this thesis, a set of RIMs is analyzed and investigated. The eight commonly used RIMs (Birnbaum's Measure, Criticality Importance Factor, Fussell-Vesely Measure, Improvement Potential, Conditional Probability, Risk Achievement, Risk Achievement Worth, and Risk Reduction Worth) are extended to non-coherent forms. Both coherent and non-coherent forms are classified into different categories in order to assist different types of maintenance activities. Real systems, such as the Steam Generator Level Control System in a CANDU Nuclear Power Plant (NPP), a Gas Detection System, and the Automatic Power Control System of an experimental nuclear reactor, are presented as case studies to demonstrate the application of the results.
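Two of the listed measures, Birnbaum's Measure and the Criticality Importance Factor, reduce to simple probability arithmetic for a coherent system, as the sketch below shows. The tiny example system (top event = A AND (B OR C)) and its component probabilities are illustrative only, not taken from the thesis case studies.

```python
# Coherent-system RIM sketch on top event = A AND (B OR C),
# with independent component failure probabilities.

def system_prob(pA, pB, pC):
    return pA * (pB + pC - pB * pC)   # P[A AND (B OR C)]

p = {"A": 0.01, "B": 0.05, "C": 0.02}

def birnbaum(component):
    """I_B = P(top | component failed) - P(top | component working)."""
    hi = {**p, component: 1.0}
    lo = {**p, component: 0.0}
    return (system_prob(hi["A"], hi["B"], hi["C"])
            - system_prob(lo["A"], lo["B"], lo["C"]))

def criticality(component):
    """Criticality Importance: Birnbaum scaled by p_i / P(top)."""
    return birnbaum(component) * p[component] / system_prob(p["A"], p["B"], p["C"])
```

Note that component A, which appears in every cut set, has a Criticality Importance of exactly 1, which is the kind of ranking signal used to prioritize maintenance.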
NASA Astrophysics Data System (ADS)
Molina, S.; Lang, D. H.; Lindholm, C. D.
2010-03-01
The era of earthquake risk and loss estimation basically began with the seminal paper on hazard by Allin Cornell in 1968. Following the 1971 San Fernando earthquake, the first studies placed strong emphasis on the prediction of human losses (the number of casualties and injured, used to estimate the needs in terms of health care and shelters in the immediate aftermath of a strong event). In contrast to these early risk modeling efforts, later studies have focused on the disruption of the serviceability of roads, telecommunications and other important lifeline systems. In the 1990s, the National Institute of Building Sciences (NIBS) developed a tool (HAZUS®99) for the Federal Emergency Management Agency (FEMA), where the goal was to incorporate the best quantitative methodology in earthquake loss estimates. Herein, the current version of the open-source risk and loss estimation software SELENA v4.1 is presented. While using the spectral displacement-based approach (capacity spectrum method), this fully self-contained tool analytically computes the degree of damage on specific building typologies as well as the associated economic losses and number of casualties. The earthquake ground shaking estimates for SELENA v4.1 can be calculated or provided in three different ways: deterministic, probabilistic or based on near-real-time data. The main distinguishing feature of SELENA compared to other risk estimation software tools is that it is implemented in a 'logic tree' computation scheme which accounts for uncertainties of any input (e.g., scenario earthquake parameters, ground-motion prediction equations, soil models) or inventory data (e.g., building typology, capacity curves and fragility functions). The data used in the analysis are assigned a decimal weighting factor defining the weight of the respective branch of the logic tree.
The weighting of the input parameters accounts for the epistemic and aleatoric uncertainties that will always follow the necessary parameterization of the different types of input data. Like previous SELENA versions, SELENA v4.1 is coded in MATLAB, which allows for easy dissemination among the scientific-technical community. Furthermore, any user has access to the source code in order to adapt, improve or refine the tool according to his or her particular needs. The handling of SELENA's current version and the provision of input data is customized for an academic environment, but it can also support decision-makers of local, state and regional governmental agencies in estimating possible losses from future earthquakes.
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
The decision tree classifier - Design and potential. [for Landsat-1 data
NASA Technical Reports Server (NTRS)
Hauska, H.; Swain, P. H.
1975-01-01
A new classifier has been developed for the computerized analysis of remote sensor data. The decision tree classifier is essentially a maximum likelihood classifier using multistage decision logic. It is characterized by the fact that an unknown sample can be classified into a class using one or several decision functions in a successive manner. The classifier is applied to the analysis of data sensed by Landsat-1 over Kenosha Pass, Colorado. The classifier is illustrated by a tree diagram which for processing purposes is encoded as a string of symbols such that there is a unique one-to-one relationship between string and decision tree.
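The multistage decision logic described above can be sketched as a small recursive classifier: each inner node applies a decision function, and an unknown sample descends until it reaches a leaf class. The spectral-band thresholds and class names below are hypothetical illustrations, not values from the Landsat-1 Kenosha Pass analysis.

```python
# Sketch of a multistage decision tree classifier. Inner nodes hold a
# decision function "fn"; leaves hold a class label "cls".

def classify(sample, node):
    if "cls" in node:                          # leaf: final class assignment
        return node["cls"]
    branch = "left" if node["fn"](sample) else "right"
    return classify(sample, node[branch])      # descend one decision stage

tree = {
    "fn": lambda s: s["band4"] > 40,           # stage 1 (threshold is made up)
    "left": {
        "fn": lambda s: s["band5"] > 25,       # stage 2
        "left": {"cls": "forest"},
        "right": {"cls": "grassland"},
    },
    "right": {"cls": "water"},
}

label = classify({"band4": 55, "band5": 30}, tree)   # -> "forest"
```

Because only the decision functions along one root-to-leaf path are evaluated, such a tree can be cheaper per sample than a single-stage maximum likelihood classifier over all classes.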
Engineering risk assessment for emergency disposal projects of sudden water pollution incidents.
Shi, Bin; Jiang, Jiping; Liu, Rentao; Khan, Afed Ullah; Wang, Peng
2017-06-01
Without an engineering risk assessment for emergency disposal in response to sudden water pollution incidents, responders are prone to be challenged during emergency decision making. To address this gap, the concept and framework of emergency disposal engineering risks are reported in this paper. The proposed risk index system covers three stages consistent with the progress of an emergency disposal project. Fuzzy fault tree analysis (FFTA), a logical and diagrammatic method, was developed to evaluate the potential failure during the process of emergency disposal. The probability of basic events and their combination, which caused the failure of an emergency disposal project, were calculated based on the case of an emergency disposal project of an aniline pollution incident in the Zhuozhang River, Changzhi, China, in 2014. The critical events that can cause the occurrence of a top event (TE) were identified according to their contribution. Finally, advice on how to take measures with limited resources to prevent the occurrence of a TE is given according to the quantified results of risk magnitude. The proposed approach could be a potentially useful safeguard for the implementation of an emergency disposal project during the process of emergency response.
An effective XML based name mapping mechanism within StoRM
NASA Astrophysics Data System (ADS)
Corso, E.; Forti, A.; Ghiselli, A.; Magnoni, L.; Zappi, R.
2008-07-01
In a Grid environment the naming capability allows users to refer to specific data resources in a physical storage system using a high level logical identifier. This logical identifier is typically organized in a file system like structure, a hierarchical tree of names. Storage Resource Manager (SRM) services map the logical identifier to the physical location of data, evaluating a set of parameters such as the desired quality of service and the VOMS attributes specified in the requests. StoRM is an SRM service developed by INFN and ICTP-EGRID to manage file and space on standard POSIX and high performing parallel and cluster file systems. An upcoming requirement in the Grid data scenario is the orthogonality of the logical name and the physical location of data, in order to refer, with the same identifier, to different copies of data archived in various storage areas with different quality of service. The mapping mechanism proposed in StoRM is based on an XML document that represents the different storage components managed by the service, the storage areas defined by the site administrator, the quality of service they provide, and the Virtual Organization that wants to use the storage area. An appropriate directory tree is realized in each storage component reflecting the XML schema. In this scenario StoRM is able to identify the physical location of requested data by evaluating the logical identifier and the specified attributes following the XML schema, without querying any database service. This paper presents the namespace schema defined, the different entities represented and the technical details of the StoRM implementation.
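The database-free resolution idea can be sketched as a lookup over an XML namespace document. The element and attribute names below are hypothetical and the real StoRM namespace schema differs; the sketch only shows how a logical identifier plus an attribute (here the VO) selects a physical root.

```python
import xml.etree.ElementTree as ET

# Hypothetical namespace document: two storage areas for one VO with
# different quality of service. Names are invented for illustration.
NAMESPACE_XML = """
<namespace>
  <storage-area name="atlas-disk" vo="atlas" quality="replica"
                logical-prefix="/atlas/disk" physical-root="/gpfs/atlas/disk"/>
  <storage-area name="atlas-tape" vo="atlas" quality="custodial"
                logical-prefix="/atlas/tape" physical-root="/castor/atlas"/>
</namespace>
"""

def resolve(logical_name, vo, xml_doc=NAMESPACE_XML):
    """Map a logical identifier to a physical path, no database needed."""
    root = ET.fromstring(xml_doc)
    for sa in root.iter("storage-area"):
        prefix = sa.get("logical-prefix")
        if sa.get("vo") == vo and logical_name.startswith(prefix):
            return sa.get("physical-root") + logical_name[len(prefix):]
    raise KeyError(logical_name)

path = resolve("/atlas/disk/data/file1.root", "atlas")
# -> "/gpfs/atlas/disk/data/file1.root"
```

Since the whole mapping lives in one document parsed at startup, every resolution is a pure in-memory prefix match, which is the point of avoiding a database service on the request path.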
ERIC Educational Resources Information Center
Hestad, Marsha; Avellone, Kathy
This 9-week curriculum unit on trees is designed for gifted students in grades 1-5. The lessons are designed for 40-minute classes meeting two or three times a week and stress the development of creative thinking skills, creative problem solving and decision making skills, and critical and logical thinking skills. Each of the 12 lesson plans…
Global tree network for computing structures enabling global processing operations
Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Hoenicke, Dirk; Steinmacher-Burow, Burkhard D.; Takken, Todd E.; Vranas, Pavlos M.
2010-01-19
A system and method for enabling high-speed, low-latency global tree network communications among processing nodes interconnected according to a tree network structure. The global tree network enables collective reduction operations to be performed during parallel algorithm operations executing in a computer structure having a plurality of the interconnected processing nodes. Router devices are included that interconnect the nodes of the tree via links to facilitate performance of low-latency global processing operations at nodes of the virtual tree and sub-tree structures. The global operations performed include one or more of: broadcast operations downstream from a root node to leaf nodes of a virtual tree, reduction operations upstream from leaf nodes to the root node in the virtual tree, and point-to-point message passing from any node to the root node. The global tree network is configurable to provide global barrier and interrupt functionality in an asynchronous or synchronized manner, and is physically and logically partitionable.
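The core collective operation, an upstream reduction followed by a downstream broadcast, can be sketched abstractly as two tree traversals. This is only a functional model of the data flow, not the router hardware described in the patent.

```python
# Sketch of a collective reduction over a tree of processing nodes: values
# flow upstream (leaves to root), and the combined result is broadcast
# back downstream so every node ends up holding it.

def reduce_up(node, op):
    """Combine this node's value with the reduced values of its children."""
    acc = node["value"]
    for child in node.get("children", []):
        acc = op(acc, reduce_up(child, op))
    return acc

def broadcast_down(node, result):
    """Push the global result from the root to every node."""
    node["result"] = result
    for child in node.get("children", []):
        broadcast_down(child, result)

tree = {"value": 1, "children": [
    {"value": 2, "children": [{"value": 4}]},
    {"value": 3},
]}
total = reduce_up(tree, lambda x, y: x + y)   # global sum = 10
broadcast_down(tree, total)
```

Swapping the operator (e.g., `max` instead of addition) gives other global reductions over the same traversal, which is why a single tree network can serve many collective operations.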
Temperature Dependence Of Single-Event Effects
NASA Technical Reports Server (NTRS)
Coss, James R.; Nichols, Donald K.; Smith, Lawrence S.; Huebner, Mark A.; Soli, George A.
1990-01-01
Report describes experimental study of effects of temperature on vulnerability of integrated-circuit memories and other electronic logic devices to single-event effects - spurious bit flips or latch-up in logic state caused by impacts of energetic ions. Involved analysis of data on 14 different device types. In most cases examined, vulnerability to these effects increased or remained constant with temperature.
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN), and extends these techniques with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle the uncertainty with probability functions and linguistic terms. The linguistic terms add to the risk analysis the knowledge of experts on the investigated system or environment.
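The fuzzy-set handling of expert uncertainty can be sketched with linguistic terms defined as triangular membership functions over a normalized failure-probability axis. The terms and breakpoints below are illustrative assumptions, not the sets used in the cited method.

```python
# Sketch of fuzzification: map a crisp value to degrees of membership in
# linguistic terms. Breakpoints (a, b, c) define triangular fuzzy sets and
# are invented for illustration.

def triangular(x, a, b, c):
    """Membership of x in the triangular fuzzy set (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

TERMS = {
    "low":    (0.0, 0.15, 0.3),
    "medium": (0.2, 0.45, 0.7),
    "high":   (0.6, 0.8, 1.0),
}

def fuzzify(x):
    """Degree of membership of a crisp value in each linguistic term."""
    return {term: triangular(x, *abc) for term, abc in TERMS.items()}
```

A crisp estimate near a set boundary then belongs partially to two terms at once (e.g., somewhat "low" and somewhat "medium"), which is exactly how the approach carries expert uncertainty through the analysis instead of forcing a single number.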
Introduction to Concurrent Engineering: Electronic Circuit Design and Production Applications
1992-09-01
STD-1629. Failure mode distribution data for many different types of parts may be found in RAC publication FMD-91. FMEA utilizes inductive logic in a ... contrasts with a Fault Tree Analysis (FTA), which utilizes deductive logic in a "top down" approach. In FTA, a system failure is assumed and traced down ... Analysis (FTA) is a graphical method of risk analysis used to identify critical failure modes within a system or equipment, utilizing a pictorial approach
Direct evaluation of fault trees using object-oriented programming techniques
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1989-01-01
Object-oriented programming techniques are used in an algorithm for the direct evaluation of fault trees. The algorithm combines a simple bottom-up procedure for trees without repeated events with a top-down recursive procedure for trees with repeated events. The object-oriented approach results in a dynamic modularization of the tree at each step in the reduction process. The algorithm reduces the number of recursive calls required to solve trees with repeated events and calculates intermediate results as well as the solution of the top event. The intermediate results can be reused if part of the tree is modified. An example is presented in which the results of the algorithm implemented with conventional techniques are compared to those of the object-oriented approach.
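The simple bottom-up case, a fault tree with no repeated basic events evaluated in a single recursive pass, can be sketched as follows. This illustrates only that case; the paper's object-oriented algorithm additionally handles repeated events with a top-down recursive procedure, which is not shown here.

```python
# Bottom-up evaluation of a fault tree without repeated events: subtree
# probabilities are independent, so each gate combines its children's
# probabilities directly.

def evaluate(node):
    if "p" in node:                       # basic event: failure probability
        return node["p"]
    child_probs = [evaluate(c) for c in node["inputs"]]
    if node["gate"] == "AND":
        prob = 1.0
        for q in child_probs:
            prob *= q
        return prob
    # OR gate: 1 minus the product of survival probabilities
    prob = 1.0
    for q in child_probs:
        prob *= (1.0 - q)
    return 1.0 - prob

top = {"gate": "OR", "inputs": [
    {"gate": "AND", "inputs": [{"p": 0.1}, {"p": 0.2}]},
    {"p": 0.05},
]}
p_top = evaluate(top)   # 1 - (1 - 0.02)(1 - 0.05) = 0.069
```

With a repeated event, the independence assumption at each gate breaks down, which is precisely why the paper's dynamic modularization and top-down recursion are needed there.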
NASA Astrophysics Data System (ADS)
Breshears, D. D.; Allen, C. D.; McDowell, N. G.; Adams, H. D.; Barnes, M.; Barron-Gafford, G.; Bradford, J. B.; Cobb, N.; Field, J. P.; Froend, R.; Fontaine, J. B.; Garcia, E.; Hardy, G. E. S. J.; Huxman, T. E.; Kala, J.; Lague, M. M.; Martinez-Yrizar, A.; Matusick, G.; Minor, D. M.; Moore, D. J.; Ng, M.; Ruthrof, K. X.; Saleska, S. R.; Stark, S. C.; Swann, A. L. S.; Villegas, J. C.; Williams, A. P.; Zou, C.
2017-12-01
Evidence that tree mortality is increasingly likely to occur in extensive die-off events across the terrestrial biosphere continues to mount. The consequences of such extensive mortality events are potentially profound, not only for the locations where die-off events occur, but also for other locations that could be impacted via ecoclimate teleconnections, whereby the land surface changes associated with die-off in one location could alter atmospheric circulation patterns and affect vegetation elsewhere. Here, we (1) recap the background of tree mortality as an emerging environmental issue, (2) highlight recent advances that could help us improve predictions of the vulnerability to tree mortality, including the underlying importance of hydraulic failure, the potential to develop climatic envelopes specific to tree mortality events, and consideration of the role of heat waves; and (3) present initial bounding simulations that indicate the potential for tree die-off events in different locations to alter ecoclimate teleconnections. As we move toward globally coordinated carbon accounting and management, the high vulnerability to tree die-off events and the potential for such events to affect vegetation elsewhere will both need to be accounted for.
1983-03-01
Decision Tree; PACKAGE unitrep Action/Area Selection Flow Chart; PACKAGE unitrep Control Flow Chart ... the originator would manually draft simple, readable, formatted messages using predefined forms and decision logic trees. This alternative was ... Study Analysis: data content errors (percent of errors): Character Type, 2.1; Calculations/Associations, 14.3; Message Identification, 4.?; Value Mismatch, 22.E
A Survey of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, Chris; Holloway, C. M.
2003-01-01
Mishap investigations provide important information about adverse events and near miss incidents. They are intended to help avoid any recurrence of previous failures. Over time, they can also yield statistical information about incident frequencies that helps to detect patterns of failure and can validate risk assessments. However, the increasing complexity of many safety critical systems is posing new challenges for mishap analysis. Similarly, the recognition that many failures have complex, systemic causes has helped to widen the scope of many mishap investigations. These two factors have combined to pose new challenges for the analysis of adverse events. A new generation of formal and semi-formal techniques has been proposed to help investigators address these problems. We introduce the term mishap logics to collectively describe these notations that might be applied to support the analysis of mishaps. The proponents of these notations have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. These proofs can be used to reduce the bias that is often perceived to affect the interpretation of adverse events. Others have argued that one cannot use logic formalisms to prove causes in the same way that one might prove propositions or theorems. Such mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators must use in their analysis of adverse events. This paper provides an overview of these mishap logics. It also identifies several additional classes of logic that might also be used to support mishap analysis.
Efficient Multiplexer FPGA Block Structures Based on G4FETs
NASA Technical Reports Server (NTRS)
Vatan, Farrokh; Fijany, Amir
2009-01-01
Generic structures have been conceived for multiplexer blocks to be implemented in field-programmable gate arrays (FPGAs) based on four-gate field-effect transistors (G(sup 4)FETs). This concept is a contribution to the continuing development of digital logic circuits based on G4FETs and serves as a further demonstration that logic circuits based on G(sup 4)FETs could be more efficient (in the sense that they could contain fewer transistors) relative to functionally equivalent logic circuits based on conventional transistors. Results in this line of development at earlier stages were summarized in two previous NASA Tech Briefs articles: "G(sup 4)FETs as Universal and Programmable Logic Gates" (NPO-41698), Vol. 31, No. 7 (July 2007), page 44, and "Efficient G4FET-Based Logic Circuits" (NPO-44407), Vol. 32, No. 1 (January 2008), page 38. As described in the first-mentioned previous article, a G4FET can be made to function as a three-input NOT-majority gate, which has been shown to be a universal and programmable logic gate. The universality and programmability could be exploited to design logic circuits containing fewer components than are required for conventional transistor-based circuits performing the same logic functions. The second-mentioned previous article reported results of a comparative study of NOT-majority-gate (G(sup 4)FET)-based logic-circuit designs and equivalent NOR- and NAND-gate-based designs utilizing conventional transistors. [NOT gates (inverters) were also included, as needed, in both the G(sup 4)FET- and the NOR- and NAND-based designs.] In most of the cases studied, fewer logic gates (and, hence, fewer transistors) were required in the G(sup 4)FET-based designs. There are two popular categories of FPGA block structures or architectures: one based on multiplexers, the other based on lookup tables. 
In standard multiplexer-based architectures, the basic building block is a tree-like configuration of multiplexers, with possibly a few additional logic gates such as ANDs or ORs. Interconnections are realized by means of programmable switches that may connect the input terminals of a block to output terminals of other blocks, may bridge together some of the inputs, or may connect some of the input terminals to signal sources representing constant logical levels 0 or 1. The left part of the figure depicts a four-to-one G(sup 4)FET-based multiplexer tree; the right part of the figure depicts a functionally equivalent four-to-one multiplexer based on conventional transistors. The G(sup 4)FET version would contain 54 transistors; the conventional version contains 70 transistors.
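The article does not reproduce a netlist, but the universality claim behind these designs is easy to check in software. The sketch below (an illustration of the NOT-majority principle only, not the NPO designs themselves) builds an inverter, a NAND, and a 2:1 multiplexer from a single three-input NOT-majority primitive:

```python
# Sketch: a 2:1 multiplexer from three-input NOT-majority gates,
# the primitive logic function a G4FET can realize.

def not_majority(a: int, b: int, c: int) -> int:
    """Output 0 iff at least two of the three inputs are 1."""
    return 0 if (a + b + c) >= 2 else 1

def inv(a: int) -> int:
    # Tying all three inputs together yields an inverter.
    return not_majority(a, a, a)

def nand(a: int, b: int) -> int:
    # Fixing one input at logical 0 turns NOT-majority into NAND.
    return not_majority(a, b, 0)

def mux2(sel: int, d0: int, d1: int) -> int:
    """2:1 multiplexer: returns d0 when sel == 0, d1 when sel == 1."""
    return nand(nand(d0, inv(sel)), nand(d1, sel))

# Exhaustive check against the multiplexer truth table.
for sel in (0, 1):
    for d0 in (0, 1):
        for d1 in (0, 1):
            assert mux2(sel, d0, d1) == (d1 if sel else d0)
```

Composing three such 2:1 stages in a tree gives a four-to-one multiplexer of the kind the figure describes.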
2014 Update of the Pacific Northwest portion of the U.S. National Seismic Hazard Maps
Frankel, Arthur; Chen, Rui; Petersen, Mark; Moschetti, Morgan P.; Sherrod, Brian
2015-01-01
Several aspects of the earthquake characterization were changed for the Pacific Northwest portion of the 2014 update of the national seismic hazard maps, reflecting recent scientific findings. New logic trees were developed for the recurrence parameters of M8-9 earthquakes on the Cascadia subduction zone (CSZ) and for the eastern edge of their rupture zones. These logic trees reflect recent findings of additional M8 CSZ earthquakes using offshore deposits of turbidity flows and onshore tsunami deposits and subsidence. These M8 earthquakes each rupture a portion of the CSZ and occur in the time periods between M9 earthquakes that have an average recurrence interval of about 500 years. The maximum magnitude was increased for deep intraslab earthquakes. An areal source zone to account for the possibility of deep earthquakes under western Oregon was expanded. The western portion of the Tacoma fault was added to the hazard maps.
NASA Astrophysics Data System (ADS)
Bréda, Nathalie; Badeau, Vincent
2008-09-01
The aim of this paper is to illustrate how some extreme events could affect forest ecosystems. Forest tree response can be analysed using dendroecological methods, as tree-ring widths are strongly controlled by climatic or biotic events. Years with such events induce similar tree responses and are called pointer years. They can result from extreme climatic events like frost, a heat wave, spring waterlogging, drought or insect damage… Forest tree species showed contrasting responses to climatic hazards, depending on their sensitivity to water shortage or temperature hardening, as illustrated from our dendrochronological database. For foresters, a drought or a pest disease is an extreme event if visible and durable symptoms are induced (leaf discolouration, leaf loss, perennial organ mortality, tree dieback and mortality). These symptoms are shown here, lagging one or several years behind a climatic or biotic event, from forest decline cases in progress since the 2003 drought or attributed to previous severe droughts or defoliations in France. Tree growth or vitality recovery is illustrated, and the functional interpretation of the long lasting memory of trees is discussed. A coupled approach linking dendrochronology and ecophysiology helps in discussing vulnerability of forest stands, and suggests management advice to mitigate extreme drought and cope with selective mortality.
Testing for Independence between Evolutionary Processes.
Behdenna, Abdelkader; Pothier, Joël; Abby, Sophie S; Lambert, Amaury; Achaz, Guillaume
2016-09-01
Evolutionary events co-occurring along phylogenetic trees usually point to complex adaptive phenomena, sometimes implicating epistasis. While a number of methods have been developed to account for co-occurrence of events on the same internal or external branch of an evolutionary tree, there is a need to account for the larger diversity of possible relative positions of events in a tree. Here we propose a method to quantify to what extent two or more evolutionary events are associated on a phylogenetic tree. The method is applicable to any discrete character, like substitutions within a coding sequence or gains/losses of a biological function. Our method uses a general approach to statistically test for significant associations between events along the tree, which encompasses both events inseparable on the same branch, and events genealogically ordered on different branches. It assumes that the phylogeny and the mapping of branches is known without errors. We address this problem from the statistical viewpoint by a linear algebra representation of the localization of the evolutionary events on the tree. We compute the full probability distribution of the number of paired events occurring in the same branch or in different branches of the tree, under a null model of independence where each type of event occurs at a constant rate uniformly in the phylogenetic tree. The strengths and weaknesses of the method are assessed via simulations; we then apply the method to explore the loss of cell motility in intracellular pathogens. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
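The null model lends itself to a back-of-the-envelope check. In the sketch below (an illustrative simplification of the same-branch case, not the authors' full distribution), an event falls on a branch with probability proportional to its length, so two independent events co-occur on the same branch with probability Σ l_i² / (Σ l_i)²:

```python
# Toy null model: events land on branches with probability proportional
# to branch length; independence gives the same-branch probability below.

def same_branch_prob(branch_lengths):
    """P(two independent events hit the same branch) under the uniform-rate null."""
    total = sum(branch_lengths)
    return sum(l * l for l in branch_lengths) / (total * total)

# Four equal branches: two independent events share a branch 1/4 of the time.
print(same_branch_prob([1.0, 1.0, 1.0, 1.0]))  # -> 0.25
```

An observed same-branch count far above this baseline is the kind of signal the paper's test formalizes.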
Understanding the dynamic effects of returning patients toward emergency department density
NASA Astrophysics Data System (ADS)
Ahmad, Norazura; Zulkepli, Jafri; Ramli, Razamin; Ghani, Noraida Abdul; Teo, Aik Howe
2017-11-01
This paper presents the development of a dynamic hypothesis for the effect of returning patients on the emergency department (ED). A logical tree from the Theory of Constraints known as the Current Reality Tree was used to identify the key variables. Then, a hypothetical framework portraying the interrelated variables and their influencing relationships was developed using causal loop diagrams (CLD). The conceptual framework was designed as the basis for the development of a system dynamics model.
ERIC Educational Resources Information Center
Steinhauer, Karsten; Drury, John E.; Portner, Paul; Walenski, Matthew; Ullman, Michael T.
2010-01-01
Logic has been intertwined with the study of language and meaning since antiquity, and such connections persist in present day research in linguistic theory (formal semantics) and cognitive psychology (e.g., studies of human reasoning). However, few studies in cognitive neuroscience have addressed logical dimensions of sentence-level language…
75 FR 39798 - Airworthiness Directives; Bombardier, Inc. Model DHC-8-400, -401, and -402 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-13
.... 1 hydraulic system. In one case, the hydraulic system control logic did not shut down the PTU and... unit (PTU) control logic, including the provision of automatic PTU shutdown in the event of loss of... one case, the hydraulic system control logic did not shut down the PTU and the overspeed condition...
Model Checking Temporal Logic Formulas Using Sticker Automata
Feng, Changwei; Wu, Huanmei
2017-01-01
As an important complex problem, the temporal logic model checking problem is still far from being fully resolved under the circumstance of DNA computing, especially Computation Tree Logic (CTL), Interval Temporal Logic (ITL), and Projection Temporal Logic (PTL), because there is still a lack of approaches for DNA model checking. To address this challenge, a model checking method is proposed for checking the basic formulas in the above three temporal logic types with DNA molecules. First, one-type single-stranded DNA molecules are employed to encode the Finite State Automaton (FSA) model of the given basic formula so that a sticker automaton is obtained. On the other hand, other single-stranded DNA molecules are employed to encode the given system model so that the input strings of the sticker automaton are obtained. Next, a series of biochemical reactions are conducted between the above two types of single-stranded DNA molecules. It can then be decided whether the system satisfies the formula or not. As a result, we have developed a DNA-based approach for checking all the basic formulas of CTL, ITL, and PTL. The simulated results demonstrate the effectiveness of the new method. PMID:29119114
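The biochemical encoding cannot be reproduced in a few lines, but the automaton it implements can. The sketch below (a hypothetical in-silico analogue, not the DNA protocol) runs system traces through the finite state automaton for a basic formula, here a finite-trace reading of "p holds in every state":

```python
# In-silico analogue of the sticker-automaton check: run an input string
# (a system trace) through the FSA encoding a basic temporal formula.

def fsa_accepts(transitions, start, accepting, word):
    """Standard FSA run: follow transitions; reject on a missing edge."""
    state = start
    for symbol in word:
        state = transitions.get((state, symbol))
        if state is None:
            return False          # dead transition: reject
    return state in accepting

# States: 'ok' (p has held so far) and 'bad' (p violated, absorbing).
trans = {('ok', 'p'): 'ok', ('ok', 'q'): 'bad',
         ('bad', 'p'): 'bad', ('bad', 'q'): 'bad'}

print(fsa_accepts(trans, 'ok', {'ok'}, ['p', 'p', 'p']))  # True
print(fsa_accepts(trans, 'ok', {'ok'}, ['p', 'q', 'p']))  # False
```

In the paper's scheme, the transition table is what the first type of single-stranded DNA encodes, and the trace is what the second type encodes.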
On the Integration of Logic Programming and Functional Programming.
1985-06-01
be performed with simple hand tools and devices. However, if the problem is more complex, say involving the cylinders, camshaft, or drive train, then...f(x,x) with f(y, g(y)), and would bind x to g(x) [Ref. 7]. The problem, of course, is that the attempt to prune the search tree allows circularity...combinatorial explosion, since the search trees generated can grow very unpredictably [Ref. 19: p. 229]. Somewhat akin to the halting problem, it means that a
2015-09-17
network intrusion detection systems; NIST, National Institute of Standards and Technology; p-tree, protocol tree; PI, protocol informatics; PLC, programmable logic...electrical, water, oil, natural gas, manufacturing, and pharmaceutical industries, to name a few. The differences between SCADA and DCS systems are often... Oil Company, also known as Saudi Aramco, suffered huge data loss that resulted in the disruption of daily operations for nearly two weeks [BTR13]. As it
STRIDE: Species Tree Root Inference from Gene Duplication Events.
Emms, David M; Kelly, Steven
2017-12-01
The correct interpretation of any phylogenetic tree is dependent on that tree being correctly rooted. We present STRIDE, a fast, effective, and outgroup-free method for identification of gene duplication events and species tree root inference in large-scale molecular phylogenetic analyses. STRIDE identifies sets of well-supported in-group gene duplication events from a set of unrooted gene trees, and analyses these events to infer a probability distribution over an unrooted species tree for the location of its root. We show that STRIDE correctly identifies the root of the species tree in multiple large-scale molecular phylogenetic data sets spanning a wide range of timescales and taxonomic groups. We demonstrate that the novel probability model implemented in STRIDE can accurately represent the ambiguity in species tree root assignment for data sets where information is limited. Furthermore, application of STRIDE to outgroup-free inference of the origin of the eukaryotic tree resulted in a root probability distribution that provides additional support for leading hypotheses for the origin of the eukaryotes. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
77 FR 61024 - Notice of Public Meeting and Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-05
... public meeting and public comments--The National Christmas Tree Lighting and the subsequent 26-day event... National Christmas Tree Lighting and the subsequent 26-day event. The general plan and theme for the event... comments and suggestions on the planning of the 2012 National Christmas Tree Lighting and the subsequent 26...
75 FR 20787 - Airworthiness Directives; Bombardier, Inc. Model DHC-8-400, -401, and -402 Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-21
... increased fluid flow within the No. 1 hydraulic system. In one case, the hydraulic system control logic did... (PTU) control logic, including the provision of automatic PTU shutdown in the event of loss of fluid in... one case, the hydraulic system control logic did not shut down the PTU and the overspeed condition...
Romme, William H.; Allen, Craig D.; Bailey, John D.; Baker, William L.; Bestelmeyer, Brandon T.; Brown, Peter M.; Eisenhart, Karen S.; Floyd, M. Lisa; Huffman, David W.; Jacobs, Brian F.; Miller, Richard F.; Muldavin, Esteban H.; Swetnam, Thomas W.; Tausch, Robin J.; Weisberg, Peter J.
2009-01-01
Piñon–juniper is a major vegetation type in western North America. Effective management of these ecosystems has been hindered by inadequate understanding of 1) the variability in ecosystem structure and ecological processes that exists among the diverse combinations of piñons, junipers, and associated shrubs, herbs, and soil organisms; 2) the prehistoric and historic disturbance regimes; and 3) the mechanisms driving changes in vegetation structure and composition during the past 150 yr. This article summarizes what we know (and don't know) about three fundamentally different kinds of piñon–juniper vegetation. Persistent woodlands are found where local soils, climate, and disturbance regimes are favorable for piñon, juniper, or a mix of both; fires have always been infrequent in these woodlands. Piñon–juniper savannas are found where local soils and climate are suitable for both trees and grasses; it is logical that low-severity fires may have maintained low tree densities before disruption of fire regimes following Euro-American settlement, but information is insufficient to support any confident statements about historical disturbance regimes in these savannas. Wooded shrublands are found where local soils and climate support a shrub community, but trees can increase during moist climatic conditions and periods without disturbance and decrease during droughts and following disturbance. Dramatic increases in tree density have occurred in portions of all three types of piñon–juniper vegetation, although equally dramatic mortality events have also occurred in some areas. The potential mechanisms driving increases in tree density—such as recovery from past disturbance, natural range expansion, livestock grazing, fire exclusion, climatic variability, and CO2 fertilization—generally have not received enough empirical or experimental investigation to predict which is most important in any given location. 
The intent of this synthesis is 1) to provide a source of information for managers and policy makers; and 2) to stimulate researchers to address the most important unanswered questions.
A resource-saving collective approach to biomedical semantic role labeling
2014-01-01
Background Biomedical semantic role labeling (BioSRL) is a natural language processing technique that identifies the semantic roles of the words or phrases in sentences describing biological processes and expresses them as predicate-argument structures (PAS’s). Currently, a major problem of BioSRL is that most systems label every node in a full parse tree independently; however, some nodes always exhibit dependency. In general SRL, collective approaches based on the Markov logic network (MLN) model have been successful in dealing with this problem. However, in BioSRL such an approach has not been attempted because it would require more training data to recognize the more specialized and diverse terms found in biomedical literature, increasing training time and computational complexity. Results We first constructed a collective BioSRL system based on MLN. This system, called collective BIOSMILE (CBIOSMILE), is trained on the BioProp corpus. To reduce the resources used in BioSRL training, we employ a tree-pruning filter to remove unlikely nodes from the parse tree and four argument candidate identifiers to retain candidate nodes in the tree. Nodes not recognized by any candidate identifier are discarded. The pruned annotated parse trees are used to train a resource-saving MLN-based system, which is referred to as resource-saving collective BIOSMILE (RCBIOSMILE). Our experimental results show that our proposed CBIOSMILE system outperforms BIOSMILE, which is the top BioSRL system. Furthermore, our proposed RCBIOSMILE maintains the same level of accuracy as CBIOSMILE using 92% less memory and 57% less training time. Conclusions This greatly improved efficiency makes RCBIOSMILE potentially suitable for training on much larger BioSRL corpora over more biomedical domains. Compared to real-world biomedical corpora, BioProp is relatively small, containing only 445 MEDLINE abstracts and 30 event triggers. 
It is not large enough for practical applications, such as pathway construction. We consider it of primary importance to pursue SRL training on large corpora in the future. PMID:24884358
NASA Astrophysics Data System (ADS)
Chen, Chunfeng; Liu, Hua; Fan, Ge
2005-02-01
In this paper we consider the problem of designing a network of optical cross-connects (OXCs) to provide end-to-end lightpath services to label switched routers (LSRs). Like some previous work, we select the number of OXCs as our objective. Compared with the previous studies, we take into account the fault-tolerant characteristic of the logical topology. First of all, using a randomly generated Prufer number, we generate a tree. By adding some edges to the tree, we obtain a physical topology which consists of a certain number of OXCs and fiber links connecting them. It is notable that we limit, for the first time, the number of layers of the tree produced according to this method. Then we design the logical topologies based on the physical topologies mentioned above. In principle, we select the shortest path, with some consideration of the load balancing of links and the limitations owing to the SRLG. Notably, we implement the routing algorithm for the nodes in increasing order of node degree. With regard to the problem of wavelength assignment, we adopt the commonly used heuristic graph-coloring algorithm. It is clear that our problem is computationally intractable, especially when the scale of the network is large. We adopt the tabu search algorithm to find a near-optimal solution to our objective. We present numerical results for up to 1000 LSRs and for a wide range of system parameters such as the number of wavelengths supported by each fiber link and the traffic. The results indicate that it is possible to build large-scale optical networks with rich connectivity in a cost-effective manner, using relatively few but properly dimensioned OXCs.
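The tree-generation step relies on the standard bijection between Prufer sequences and labelled trees; a textbook decoding (not the authors' code) looks like this:

```python
import heapq

def prufer_to_tree(seq):
    """Decode a Prufer sequence over labels 0..n-1 into the n-1 tree edges."""
    n = len(seq) + 2
    degree = [1] * n
    for node in seq:
        degree[node] += 1
    leaves = [i for i in range(n) if degree[i] == 1]
    heapq.heapify(leaves)
    edges = []
    for node in seq:
        leaf = heapq.heappop(leaves)      # smallest current leaf
        edges.append((leaf, node))
        degree[node] -= 1
        if degree[node] == 1:
            heapq.heappush(leaves, node)
    edges.append((heapq.heappop(leaves), heapq.heappop(leaves)))
    return edges

# A sequence of length 4 yields a 6-node tree with 5 edges;
# node 3 appears three times, so its degree is 3 + 1 = 4.
print(prufer_to_tree([3, 3, 3, 4]))
```

Sampling the Prufer sequence uniformly at random, as the paper does, samples uniformly over all labelled trees on n nodes.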
Document page structure learning for fixed-layout e-books using conditional random fields
NASA Astrophysics Data System (ADS)
Tao, Xin; Tang, Zhi; Xu, Canhui
2013-12-01
In this paper, a model is proposed to learn logical structure of fixed-layout document pages by combining support vector machine (SVM) and conditional random fields (CRF). Features related to each logical label and their dependencies are extracted from various original Portable Document Format (PDF) attributes. Both local evidence and contextual dependencies are integrated in the proposed model so as to achieve better logical labeling performance. With the merits of SVM as local discriminative classifier and CRF modeling contextual correlations of adjacent fragments, it is capable of resolving the ambiguities of semantic labels. The experimental results show that CRF based models with both tree and chain graph structures outperform the SVM model with an increase of macro-averaged F1 by about 10%.
Apparatus for and method of eliminating single event upsets in combinational logic
NASA Technical Reports Server (NTRS)
Gambles, Jody W. (Inventor); Hass, Kenneth J. (Inventor); Cameron, Kelly B. (Inventor)
2001-01-01
An apparatus for and method of eliminating single event upsets (or SEU) in combinational logic are used to prevent error propagation as a result of cosmic particle strikes to the combinational logic. The apparatus preferably includes a combinational logic block electrically coupled to a delay element, a latch and an output buffer. In operation, a signal from the combinational logic is electrically coupled to a first input of the latch. In addition, the signal is routed through the delay element to produce a delayed signal. The delayed signal is routed to a second input of the latch. The latch used in the apparatus for preventing SEU preferably includes latch outputs and a feature that the latch outputs will not change state unless both latch inputs are correct. For example, the latch outputs may not change state unless both latch inputs have the same logical state. When a cosmic particle strikes the combinational logic, a transient disturbance with a predetermined length may appear in the signal. However, a function of the delay element is to preferably provide a time delay greater than the length of the transient disturbance. Therefore, the transient disturbance will not reach both latch inputs simultaneously. As a result, the latch outputs will not permanently change state in error due to the transient disturbance. In addition, the output buffer preferably combines the latch outputs in such a way that the correct state is preserved at all times. Thus, combinational logic with protection from SEU is provided.
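The filtering principle can be modelled in discrete time (an illustrative software model, not the patented circuit): the latch state updates only while the direct and delayed signals agree, so a transient shorter than the delay never reaches both latch inputs at once and is dropped:

```python
def seu_filtered_output(signal, delay):
    """Discrete-time model of the delay-plus-latch filter: the output state
    updates only while the direct and delayed copies of the signal agree,
    so glitches shorter than `delay` time steps are suppressed."""
    assert delay >= 1
    delayed = [signal[0]] * delay + list(signal)[:len(signal) - delay]
    state, outputs = signal[0], []
    for direct, late in zip(signal, delayed):
        if direct == late:      # both latch inputs agree: latch may change
            state = direct
        outputs.append(state)   # otherwise the latch holds its old state
    return outputs

# A 2-step transient (shorter than the 3-step delay) never propagates...
glitch = [0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0]
print(seu_filtered_output(glitch, 3))    # stays all zeros

# ...while a genuine transition appears at the output after the delay.
step = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1]
print(seu_filtered_output(step, 3))      # switches to 1 from index 7 onward
```

The trade-off visible in the model matches the patent's description: immunity to transients up to the delay length, at the cost of that same delay on legitimate transitions.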
Ordering Traces Logically to Identify Lateness in Message Passing Programs
Isaacs, Katherine E.; Gamblin, Todd; Bhatele, Abhinav; ...
2015-03-30
Event traces are valuable for understanding the behavior of parallel programs. However, automatically analyzing a large parallel trace is difficult, especially without a specific objective. We aid this endeavor by extracting a trace's logical structure, an ordering of trace events derived from happened-before relationships, while taking into account developer intent. Using this structure, we can calculate an operation's delay relative to its peers on other processes. The logical structure also serves as a platform for comparing and clustering processes as well as highlighting communication patterns in a trace visualization. We present an algorithm for determining this idealized logical structure from traces of message passing programs, and we develop metrics to quantify delays and differences among processes. We implement our techniques in Ravel, a parallel trace visualization tool that displays both logical and physical timelines. Rather than showing the duration of each operation, we display where delays begin and end, and how they propagate. Finally, we apply our approach to the traces of several message passing applications, demonstrating the accuracy of our extracted structure and its utility in analyzing these codes.
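The happened-before relation underlying such logical structures is classically captured by Lamport clocks; a minimal sketch (Ravel's structure extraction is considerably more involved) is:

```python
def lamport_timestamps(events):
    """Assign Lamport clocks to a list of (process, kind, partner) events,
    where kind is 'local', 'send', or 'recv' and partner is the index of the
    matching send for a 'recv'. Events are listed in per-process program order."""
    clocks = {}                 # current clock per process
    stamps = [0] * len(events)
    for i, (proc, kind, partner) in enumerate(events):
        t = clocks.get(proc, 0) + 1
        if kind == 'recv':
            t = max(t, stamps[partner] + 1)   # happened-before: after the send
        clocks[proc] = t
        stamps[i] = t
    return stamps

# P0 sends after a local event; P1 receives that message, then computes.
trace = [(0, 'local', None), (0, 'send', None),
         (1, 'recv', 1), (1, 'local', None)]
print(lamport_timestamps(trace))  # [1, 2, 3, 4]
```

Sorting events by such timestamps gives one logical ordering consistent with causality, which is the starting point for the per-operation delay metrics the paper develops.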
2015-01-01
Implementing parallel and multivalued logic operations at the molecular scale has the potential to improve the miniaturization and efficiency of a new generation of nanoscale computing devices. Two-dimensional photon-echo spectroscopy is capable of resolving dynamical pathways on electronic and vibrational molecular states. We experimentally demonstrate the implementation of molecular decision trees, logic operations where all possible values of inputs are processed in parallel and the outputs are read simultaneously, by probing the laser-induced dynamics of populations and coherences in a rhodamine dye mounted on a short DNA duplex. The inputs are provided by the bilinear interactions between the molecule and the laser pulses, and the output values are read from the two-dimensional molecular response at specific frequencies. Our results highlight how ultrafast dynamics between multiple molecular states induced by light–matter interactions can be used as an advantage for performing complex logic operations in parallel, operations that are faster than electrical switching. PMID:25984269
Secure Multicast Tree Structure Generation Method for Directed Diffusion Using A* Algorithms
NASA Astrophysics Data System (ADS)
Kim, Jin Myoung; Lee, Hae Young; Cho, Tae Ho
The application of wireless sensor networks to areas such as combat field surveillance, terrorist tracking, and highway traffic monitoring requires secure communication among the sensor nodes within the networks. Logical key hierarchy (LKH) is a tree based key management model which provides secure group communication. When a sensor node is added or evicted from the communication group, LKH updates the group key in order to ensure the security of the communications. In order to efficiently update the group key in directed diffusion, we propose a method for secure multicast tree structure generation, an extension to LKH that reduces the number of re-keying messages by considering the addition and eviction ratios of the history data. For the generation of the proposed key tree structure the A* algorithm is applied, in which the branching factor at each level can take on different values. The experimental results demonstrate the efficiency of the proposed key tree structure against the existing key tree structures of fixed branching factors.
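The logarithmic re-keying cost that motivates LKH can be checked with a small model of a complete key tree with a fixed branching factor (a simplified sketch; the proposed A* construction instead varies the branching factor per level):

```python
def key_tree_depth(n_members: int, b: int) -> int:
    """Depth of the smallest complete key tree with branching factor b."""
    d, capacity = 0, 1
    while capacity < n_members:
        capacity *= b
        d += 1
    return d

def eviction_rekey_messages(n_members: int, b: int) -> int:
    """Simplified count: evicting one member replaces every key on its
    leaf-to-root path. The lowest replaced key is sent to the b-1 remaining
    siblings; each higher replaced key is sent to all b child subtrees."""
    d = key_tree_depth(n_members, b)
    return (b - 1) + (d - 1) * b

# 1024 members, binary key tree: depth 10, so 2*10 - 1 = 19 messages.
print(eviction_rekey_messages(1024, 2))  # -> 19
```

Costs grow with b times log_b(n) rather than with n, which is why tuning branching factors per level, as the paper does with A*, can pay off.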
Uncertain decision tree inductive inference
NASA Astrophysics Data System (ADS)
Zarban, L.; Jafari, S.; Fakhrahmad, S. M.
2011-10-01
Induction is the process of reasoning in which general rules are formulated based on limited observations of recurring phenomenal patterns. Decision tree learning is one of the most widely used and practical inductive methods, which represents the results in a tree scheme. Various decision tree algorithms have already been proposed, such as CLS, ID3, Assistant, C4.5, REPTree, and Random Tree. These algorithms suffer from some major shortcomings. In this article, after discussing the main limitations of the existing methods, we introduce a new decision tree induction algorithm, which overcomes all the problems existing in its counterparts. The new method uses bit strings and maintains important information on them. This use of bit strings, and of logical operations on them, makes the induction process fast. The method has several important features: it deals with inconsistencies in data, avoids overfitting, and handles uncertainty. We also illustrate more advantages and the new features of the proposed method. The experimental results show the effectiveness of the method in comparison with other methods existing in the literature.
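The bit-string idea can be illustrated with arbitrary-precision integers as row masks: each attribute value keeps a bitmask of the training rows where it occurs, and evaluating a candidate split reduces to a bitwise AND plus a popcount (a sketch of the general technique with made-up data, not the authors' algorithm):

```python
# Hypothetical toy data: six training rows, one attribute, a 0/1 label.
rows = ['sunny', 'rain', 'sunny', 'overcast', 'rain', 'sunny']
labels = [0, 1, 0, 1, 1, 0]

def mask_where(values, wanted):
    """Bitmask with bit i set iff values[i] == wanted."""
    m = 0
    for i, v in enumerate(values):
        if v == wanted:
            m |= 1 << i
    return m

node_mask = (1 << len(rows)) - 1              # the root node covers all rows
sunny = mask_where(rows, 'sunny')
positive = mask_where(labels, 1)

covered = node_mask & sunny                   # rows reaching the 'sunny' branch
n_pos = bin(covered & positive).count("1")    # class count via popcount
print(bin(covered).count("1"), n_pos)         # 3 rows, 0 of class 1
```

Because the per-row loop runs once to build the masks, every subsequent split evaluation is a handful of word-parallel bit operations, which is the speed advantage the abstract alludes to.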
1986-10-01
Three-year reproducibility of elevated (above 120 mmHg) systolic BP (BPs) in twelve- and thirteen-year-olds was studied on the basis of the data of an international collaborative study in juvenile arterial hypertension. Rules for the classification and regulation of the adolescents of both sexes by their tendency to reproduce elevated BPs were derived, using logical functions. The rules are presented as logical solution trees that make it possible to assess the probability of elevated BPs being maintained and identify the adolescents prone to persistent BPs rise.
NASA Technical Reports Server (NTRS)
Dailey, C. L.; Abotteen, K. M. (Principal Investigator)
1980-01-01
The development and testing of an analysis procedure which was developed to improve the consistency and objectivity of crop identification using Landsat data is described. The procedure was developed to identify corn and soybean crops in the U.S. corn belt region. The procedure consists of a series of decision points arranged in a tree-like structure, the branches of which lead an analyst to crop labels. The specific decision logic is designed to maximize the objectivity of the identification process and to promote the possibility of future automation. Significant results are summarized.
Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis
NASA Astrophysics Data System (ADS)
Wright, Heather; Pallister, John; Newhall, Chris
2015-04-01
Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. 
However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public. VDAP trees evaluate probabilities of: magmatic intrusion, likelihood of eruption, magnitude of eruption, and types of associated hazardous events and their extents. In a few cases, trees have been extended to also assess and communicate vulnerability and relative risk.
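The arithmetic behind such an event tree is a chain of conditional probabilities multiplied along each branch; the sketch below uses made-up numbers purely for illustration, not VDAP estimates:

```python
# Hypothetical conditional probabilities for one branch of an event tree.
p_unrest_is_magmatic = 0.6      # P(magmatic intrusion | unrest)
p_eruption_given_magma = 0.5    # P(eruption | magmatic intrusion)
p_large_given_eruption = 0.2    # P(large eruption | eruption)

# Each node's probability is the product along its branch.
p_eruption = p_unrest_is_magmatic * p_eruption_given_magma
p_large_eruption = p_eruption * p_large_given_eruption
print(p_eruption, p_large_eruption)   # about 0.3 and 0.06
```

In practice each conditional probability is itself a consensus estimate with uncertainty bounds, which is where the global databases and expert elicitation described above come in.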
1990-04-01
focus of attention). The inherent local control in the FA/C model allows it to achieve just that, since it only requires a global goal to become...Computing Terms. Agent Modelling: is concerned with modelling actors' intentions and plans, and their modification in the light of information... model or program that is based on a mathematical system of logic. B-tree: or "binary tree", is a self-organising storage mechanism that works by taking
Master Logic Diagram: An Approach to Identify Initiating Events of HTGRs
NASA Astrophysics Data System (ADS)
Purba, J. H.
2018-02-01
The initiating events of a nuclear power plant must first be identified before probabilistic safety assessment can be applied to that plant. Various types of master logic diagrams (MLDs) have been proposed for searching for initiating events of the next generation of nuclear power plants, which have limited data and operating experience. Those MLDs differ in the number of steps or levels and in the basis on which they are developed. This study proposed another type of MLD approach to find high temperature gas cooled reactor (HTGR) initiating events. It consists of five functional steps, starting from the top event, which represents the final objective of the safety functions, down to the basic event, which represents the goal of the MLD development: an initiating event. The application of the proposed approach to search for two HTGR initiating events, i.e. power turbine generator trip and loss of offsite power, is provided. The results confirmed that the proposed MLD is feasible for finding HTGR initiating events.
The Tunguska event in 1908: evidence from tree-ring anatomy.
Vaganov, Evgenii A; Hughes, Malcolm K; Silkin, Pavel P; Nesvetailo, Valery D
2004-01-01
We analyzed tree rings in wood samples collected from some of the few surviving trees found close to the epicenter (within 4-5 km) of the Tunguska event that occurred on the last day of June 1908. Tree-ring growth shows a depression starting in the year after the event and continuing during a 4-5-year period. The most remarkable traces of the event were found in the rings' anatomical structure: (1) formation of "light" rings and a reduction of maximum density in 1908; (2) non-thickened tracheids (the cells that make up most of the wood volume) in the transition and latewood zones (the middle and last-formed parts of the ring, respectively); and (3) deformed tracheids, which are located on the 1908 annual ring outer boundary. In the majority of samples, normal earlywood and latewood tracheids were formed in all annual rings after 1908. The observed anomalies in wood anatomy suggest two main impacts of the Tunguska event on surviving trees--(1) defoliation and (2) direct mechanical stress on active xylem tissue. The mechanical stress needed to fell trees is less than the stress needed to cause the deformation of differentiating tracheids observed in trees close to the epicenter. In order to resolve this apparent contradiction, work is suggested on possible topographic modification of the overpressure experienced by these trees, as is an experimental test of the effects of such stresses on precisely analogous growing trees.
Visualizing a Procedure with Nassi-Schneiderman Charts.
ERIC Educational Resources Information Center
Weiss, Edmond H.
1990-01-01
Argues that Nassi-Schneiderman (NS) charts, when used to diagram human procedures, can eliminate prose ambiguities. Asserts that these devices provide most of the advantages of decision tables and trees. Suggests using NS charts in testing the logic and completeness of traditional procedures, or even in place of many traditional publications. (SG)
Assessing Institutional Ineffectiveness: A Strategy for Improvement.
ERIC Educational Resources Information Center
Cameron, Kim S.
1984-01-01
Based on the theory that institutional change and improvement are motivated more by knowledge of problems than by knowledge of successes, a fault tree analysis technique using Boolean logic for assessing institutional ineffectiveness by determining weaknesses in the system is presented. Advantages and disadvantages of focusing on weakness rather…
Inferring phylogenetic trees from the knowledge of rare evolutionary events.
Hellmuth, Marc; Hernandez-Rosales, Maribel; Long, Yangjing; Stadler, Peter F
2018-06-01
Rare events have played an increasing role in molecular phylogenetics as potentially homoplasy-poor characters. In this contribution we analyze the phylogenetic information content from a combinatorial point of view by considering the binary relation on the set of taxa defined by the existence of a single event separating two taxa. We show that the graph-representation of this relation must be a tree. Moreover, we characterize completely the relationship between the tree of such relations and the underlying phylogenetic tree. With directed operations such as tandem-duplication-random-loss events in mind we demonstrate how non-symmetric information constrains the position of the root in the partially reconstructed phylogeny.
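The combinatorial claim above — that the graph whose edges connect taxa separated by a single rare event must be a tree — can be checked mechanically for any candidate relation: a graph on n vertices is a tree exactly when it has n - 1 edges and is connected. A minimal sketch, with illustrative taxa and pairs (not data from the paper):

```python
def is_tree(vertices, edges):
    """True iff the graph is a tree: |E| = |V| - 1 and connected."""
    if len(edges) != len(vertices) - 1:
        return False
    # connectivity check via union-find with path halving
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for a, b in edges:
        parent[find(a)] = find(b)
    return len({find(v) for v in vertices}) == 1

taxa = ["A", "B", "C", "D"]
# hypothetical pairs of taxa separated by exactly one rare event
single_event_pairs = [("A", "B"), ("B", "C"), ("B", "D")]
print(is_tree(taxa, single_event_pairs))  # True
```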
Relating phylogenetic trees to transmission trees of infectious disease outbreaks.
Ypma, Rolf J F; van Ballegooijen, W Marijn; Wallinga, Jacco
2013-11-01
Transmission events are the fundamental building blocks of the dynamics of any infectious disease. Much about the epidemiology of a disease can be learned when these individual transmission events are known or can be estimated. Such estimations are difficult and generally feasible only when detailed epidemiological data are available. The genealogy estimated from genetic sequences of sampled pathogens is another rich source of information on transmission history. Optimal inference of transmission events calls for the combination of genetic data and epidemiological data into one joint analysis. A key difficulty is that the transmission tree, which describes the transmission events between infected hosts, differs from the phylogenetic tree, which describes the ancestral relationships between pathogens sampled from these hosts. The trees differ both in timing of the internal nodes and in topology. These differences become more pronounced when a higher fraction of infected hosts is sampled. We show how the phylogenetic tree of sampled pathogens is related to the transmission tree of an outbreak of an infectious disease, by the within-host dynamics of pathogens. We provide a statistical framework to infer key epidemiological and mutational parameters by simultaneously estimating the phylogenetic tree and the transmission tree. We test the approach using simulations and illustrate its use on an outbreak of foot-and-mouth disease. The approach unifies existing methods in the emerging field of phylodynamics with transmission tree reconstruction methods that are used in infectious disease epidemiology.
Paleo-event data standards for dendrochronology
Elaine Kennedy Sutherland; P. Brewer; W. Gross
2017-01-01
Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...
Specialized Binary Analysis for Vetting Android APPS Using GUI Logic
2016-04-01
the use of high-level reasoning based on the GUI design logic of an app to enable a security analyst to diagnose and triage the potentially sensitive...execution paths of an app. Levels of Inconsistency We have identified three levels of logical inconsistencies: Event-level inconsistency A sensitive...operation (e.g., taking a picture) is not triggered by user action on a GUI component. Layout-level inconsistency A sensitive operation is triggered by
DG TO FT - AUTOMATIC TRANSLATION OF DIGRAPH TO FAULT TREE MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Each model has its advantages. While digraphs can be derived in a fairly straightforward manner from system schematics and knowledge about component failure modes and system design, fault tree structure allows for fast processing using efficient techniques developed for tree data structures. The similarity between digraphs and fault trees permits the information encoded in the digraph to be translated into a logically equivalent fault tree. The DG TO FT translation tool will automatically translate digraph models, including those with loops or cycles, into fault tree models that have the same minimum cut set solutions as the input digraph. This tool could be useful, for example, if some parts of a system have been modeled using digraphs and others using fault trees. The digraphs could be translated and incorporated into the fault trees, allowing them to be analyzed using a number of powerful fault tree processing codes, such as cut set and quantitative solution codes. A cut set for a given node is a group of failure events that will cause the failure of the node. A minimum cut set for a node is a cut set from which no failure event can be removed without the remaining events ceasing to cause the failure of the event represented by the node. Cut set calculations can be used to find dependencies, weak links, and vital system components whose failures would cause serious system failure. The DG TO FT translation system reads in a digraph with each node listed as a separate object in the input file. The user specifies a terminal node for the digraph that will be used as the top node of the resulting fault tree.
A fault tree basic event node representing the failure of that digraph node is created and becomes a child of the terminal root node. A subtree is created for each of the inputs to the digraph terminal node and the roots of those subtrees are added as children of the top node of the fault tree. Every node in the digraph upstream of the terminal node will be visited and converted. During the conversion process, the algorithm keeps track of the path from the digraph terminal node to the current digraph node. If a node is visited twice, then the program has found a cycle in the digraph. This cycle is broken by finding the minimal cut sets of the twice visited digraph node and forming those cut sets into subtrees. Another implementation of the algorithm resolves loops by building a subtree based on the digraph minimal cut sets calculation. It does not reduce the subtree to minimal cut set form. This second implementation produces larger fault trees, but runs much faster than the version using minimal cut sets since it does not spend time reducing the subtrees to minimal cut sets. The fault trees produced by DG TO FT will contain OR gates, AND gates, Basic Event nodes, and NOP gates. The results of a translation can be output as a text object description of the fault tree similar to the text digraph input format. The translator can also output a LISP language formatted file and an augmented LISP file which can be used by the FTDS (ARC-13019) diagnosis system, available from COSMIC, which performs diagnostic reasoning using the fault tree as a knowledge base. DG TO FT is written in C-language to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. DG TO FT is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette.
It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is provided on the distribution medium. DG TO FT was developed in 1992. Sun, and SunOS are trademarks of Sun Microsystems, Inc. DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc. System 7 is a trademark of Apple Computers Inc. Microsoft Word is a trademark of Microsoft Corporation.
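The minimal cut set computation that both DG TO FT and MIRAP rely on can be sketched by top-down expansion of an AND/OR fault tree with absorption of subsumed sets. This is not the DG TO FT code (which is in C and unavailable here); the tree structure and event names are invented for illustration.

```python
def cut_sets(node, tree):
    """Return the minimal cut sets (frozensets of basic events) for `node`."""
    if node not in tree:                       # leaf = basic failure event
        return {frozenset([node])}
    op, children = tree[node]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == "OR":                             # OR: union of children's cut sets
        result = set().union(*child_sets)
    else:                                      # AND: cross-product of cut sets
        result = child_sets[0]
        for cs in child_sets[1:]:
            result = {a | b for a in result for b in cs}
    # absorption: a cut set containing another cut set is not minimal
    return {c for c in result
            if not any(other < c for other in result)}

# Hypothetical tree: TOP fails if the pump fails, or valve AND backup both fail.
tree = {"TOP": ("OR", ["PUMP", "G1"]),
        "G1": ("AND", ["VALVE", "BACKUP"])}
print(sorted(sorted(c) for c in cut_sets("TOP", tree)))
# [['BACKUP', 'VALVE'], ['PUMP']]
```

The absorption step is where the "subsuming cut sets" mentioned in the MIRAP abstract are discarded.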
36 CFR 292.46 - Timber harvesting activities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... hazard trees; or to respond to natural events such as wildfire, flood, earthquake, volcanic eruption, high winds, and disease or insect infestation. (2) Where authorized, trees may be harvested by... trees, or to respond to natural events provided that the activity is consistent with the Wild and Scenic...
Distributed Events in Sentinel: Design and Implementation of a Global Event Detector
1999-01-01
local event detector and a global event detector to detect events. Global event detector in this case plays the role of a message sending/receiving than...significant in this case. The system performance will decrease with increase in the number of applications involved in global event detection. Yet from a...Figure 8: A Global event tree (2) 1. Global composite event is detected at the GED In this case, the whole global composite event tree is sent to the
Constructing event trees for volcanic crises
Newhall, C.; Hoblitt, R.
2002-01-01
Event trees are useful frameworks for discussing probabilities of possible outcomes of volcanic unrest. Each branch of the tree leads from a necessary prior event to a more specific outcome, e.g., from an eruption to a pyroclastic flow. Where volcanic processes are poorly understood, probability estimates might be purely empirical - utilizing observations of past and current activity and an assumption that the future will mimic the past or follow a present trend. If processes are better understood, probabilities might be estimated from a theoretical model, either subjectively or by numerical simulations. Use of Bayes' theorem aids in the estimation of how fresh unrest raises (or lowers) the probabilities of eruptions. Use of event trees during volcanic crises can help volcanologists to critically review their analysis of hazard, and help officials and individuals to compare volcanic risks with more familiar risks. Trees also emphasize the inherently probabilistic nature of volcano forecasts, with multiple possible outcomes.
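The Bayes' theorem update mentioned above — fresh unrest raising or lowering the eruption probability — can be written out in a few lines. The prior and the likelihoods below are illustrative numbers chosen for the example, not values from the paper.

```python
def posterior_eruption(prior, p_unrest_given_eruption, p_unrest_given_quiet):
    """P(eruption | unrest observed), via Bayes' theorem."""
    evidence = (p_unrest_given_eruption * prior
                + p_unrest_given_quiet * (1.0 - prior))
    return p_unrest_given_eruption * prior / evidence

# Suppose (hypothetically) 1 in 10 restless periods ends in eruption (prior),
# seismic unrest almost always precedes an eruption (0.9), but a third of
# non-eruptive periods also show such unrest (0.3):
p = posterior_eruption(0.1, 0.9, 0.3)
print(round(p, 3))  # 0.25
```

Observing the unrest here raises the eruption probability from 0.10 to 0.25; a signal that is common during quiet periods would move the estimate much less.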
Stolzer, Maureen; Lai, Han; Xu, Minli; Sathaye, Deepa; Vernot, Benjamin; Durand, Dannie
2012-09-15
Gene duplication (D), transfer (T), loss (L) and incomplete lineage sorting (I) are crucial to the evolution of gene families and the emergence of novel functions. The history of these events can be inferred via comparison of gene and species trees, a process called reconciliation, yet current reconciliation algorithms model only a subset of these evolutionary processes. We present an algorithm to reconcile a binary gene tree with a nonbinary species tree under a DTLI parsimony criterion. This is the first reconciliation algorithm to capture all four evolutionary processes driving tree incongruence and the first to reconcile non-binary species trees with a transfer model. Our algorithm infers all optimal solutions and reports complete, temporally feasible event histories, giving the gene and species lineages in which each event occurred. It is fixed-parameter tractable, with polytime complexity when the maximum species outdegree is fixed. Application of our algorithms to prokaryotic and eukaryotic data shows that use of an incomplete event model has substantial impact on the events inferred and resulting biological conclusions. Our algorithms have been implemented in Notung, a freely available phylogenetic reconciliation software package, available at http://www.cs.cmu.edu/~durand/Notung. mstolzer@andrew.cmu.edu.
A vector matching method for analysing logic Petri nets
NASA Astrophysics Data System (ADS)
Du, YuYue; Qi, Liang; Zhou, MengChu
2011-11-01
Batch processing function and passing value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method used to judge the enabling transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation describing the marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent by directly analysing LPNs. Finally, the validity and reliability of the proposed method are illustrated by an example in electronic commerce.
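The marking-change equation referred to above is the standard Petri net state equation: firing transition t changes the marking by the t-th column of the incidence matrix, M' = M + C[:, t]. A minimal sketch for an ordinary two-place net (the net itself is illustrative, not the LPN example from the article):

```python
# Incidence matrix: rows = places, columns = transitions.
# C[p][t] = net tokens added to place p when transition t fires.
C = [[-1,  1],    # place p0: t0 consumes a token, t1 produces one
     [ 1, -1]]    # place p1: t0 produces a token, t1 consumes one

def fire(marking, t):
    """Apply M' = M + C[:, t]; a negative entry means t was not enabled."""
    new = [m + C[p][t] for p, m in enumerate(marking)]
    if any(m < 0 for m in new):
        raise ValueError("transition not enabled at this marking")
    return new

M = [1, 0]
M = fire(M, 0)   # t0 moves the token from p0 to p1
print(M)         # [0, 1]
```

Repeatedly applying `fire` from the initial marking is exactly how a reachability tree like the one in the article is grown, one enabled transition per branch.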
Sequence-invariant state machines
NASA Technical Reports Server (NTRS)
Whitaker, Sterling R.; Manjunath, Shamanna K.; Maki, Gary K.
1991-01-01
A synthesis method and an MOS VLSI architecture are presented to realize sequential circuits that have the ability to implement any state machine having N states and m inputs, regardless of the actual sequence specified in the flow table. The design method utilizes binary tree structured (BTS) logic to implement regular and dense circuits. The desired state sequence can be hardwired with power supply connections or can be dynamically reallocated if stored in a register. This allows programmable VLSI controllers to be designed with a compact size and performance approaching that of dedicated logic. Results of ICV implementations are reported and an example sequence-invariant state machine is contrasted with implementations based on traditional methods.
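The sequence-invariant idea can be sketched in software: one fixed machine structure realizes any N-state flow table because the state sequence lives in a rewritable next-state table (the hardware's register or power-supply hardwiring) rather than in dedicated logic. The class and tables below are an illustrative analogy, not the paper's BTS circuit design.

```python
class SequenceInvariantSM:
    """One structure, any flow table: behavior is set by the stored table."""
    def __init__(self, next_state_table, initial):
        self.table = next_state_table    # (state, input) -> next state
        self.state = initial
    def step(self, inp):
        self.state = self.table[(self.state, inp)]
        return self.state

# One hypothetical flow table: a two-state toggler on input 1.
toggler = SequenceInvariantSM({("S0", 1): "S1", ("S1", 1): "S0",
                               ("S0", 0): "S0", ("S1", 0): "S1"}, "S0")
print([toggler.step(b) for b in [1, 0, 1]])  # ['S1', 'S1', 'S0']
# Loading a different table into the same structure yields a different machine,
# without changing the "circuit" at all.
```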
Early changes in physical tree characteristics during an oak decline event in the Ozark highlands
Martin A. Spetich
2006-01-01
An oak decline event is severely affecting up to 120 000 ha in the Ozark National Forest of Arkansas. Results of early changes in physical tree characteristics during that event are presented. In the fall and winter of 1999 and 2000, we established research plots on a site that would become a center of severe oak decline. In August 2000, standing trees > 14 cm in...
Logic regression and its extensions.
Schwender, Holger; Ruczinski, Ingo
2010-01-01
Logic regression is an adaptive classification and regression procedure, initially developed to reveal interacting single nucleotide polymorphisms (SNPs) in genetic association studies. In general, this approach can be used in any setting with binary predictors, when the interaction of these covariates is of primary interest. Logic regression searches for Boolean (logic) combinations of binary variables that best explain the variability in the outcome variable, and thus, reveals variables and interactions that are associated with the response and/or have predictive capabilities. The logic expressions are embedded in a generalized linear regression framework, and thus, logic regression can handle a variety of outcome types, such as binary responses in case-control studies, numeric responses, and time-to-event data. In this chapter, we provide an introduction to the logic regression methodology, list some applications in public health and medicine, and summarize some of the direct extensions and modifications of logic regression that have been proposed in the literature. Copyright © 2010 Elsevier Inc. All rights reserved.
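The core search that logic regression performs can be illustrated on a toy scale. Real logic regression anneals over trees of Boolean expressions inside a generalized linear model; the sketch below merely scores every AND/OR pair of binary predictors against a binary outcome. The data and the restriction to pairs are illustrative simplifications.

```python
from itertools import combinations

def best_pair(X, y):
    """Return (accuracy, expression) for the AND/OR pair best matching y."""
    n = len(y)
    best = (-1.0, "")
    for i, j in combinations(range(len(X[0])), 2):
        for name, op in (("AND", lambda a, b: a & b),
                         ("OR",  lambda a, b: a | b)):
            pred = [op(row[i], row[j]) for row in X]
            acc = sum(p == t for p, t in zip(pred, y)) / n
            best = max(best, (acc, f"x{i} {name} x{j}"))
    return best

# Toy data in which the outcome is exactly x0 AND x1:
X = [(1, 1, 0), (1, 0, 1), (0, 1, 0), (0, 0, 1), (1, 1, 1)]
y = [1, 0, 0, 0, 1]
print(best_pair(X, y))  # (1.0, 'x0 AND x1')
```

Scaling this idea to deeper logic trees and non-binary outcomes is what the stochastic search and the regression embedding in logic regression provide.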
NASA Astrophysics Data System (ADS)
Lasher, Mark E.; Henderson, Thomas B.; Drake, Barry L.; Bocker, Richard P.
1986-09-01
The modified signed-digit (MSD) number representation offers full parallel, carry-free addition. A MSD adder has been described by the authors. This paper describes how the adder can be used in a tree structure to implement an optical multiply algorithm. Three different optical schemes, involving position, polarization, and intensity encoding, are proposed for realizing the trinary logic system. When configured in the generic multiplier architecture, these schemes yield the combinatorial logic necessary to carry out the multiplication algorithm. The optical systems are essentially three dimensional arrangements composed of modular units. Of course, this modularity is important for design considerations, while the parallelism and noninterfering communication channels of optical systems are important from the standpoint of reduced complexity. The authors have also designed electronic hardware to demonstrate and model the combinatorial logic required to carry out the algorithm. The electronic and proposed optical systems will be compared in terms of complexity and speed.
Efficient FPT Algorithms for (Strict) Compatibility of Unrooted Phylogenetic Trees.
Baste, Julien; Paul, Christophe; Sau, Ignasi; Scornavacca, Celine
2017-04-01
In phylogenetics, a central problem is to infer the evolutionary relationships between a set of species X; these relationships are often depicted via a phylogenetic tree-a tree having its leaves labeled bijectively by elements of X and without degree-2 nodes-called the "species tree." One common approach for reconstructing a species tree consists in first constructing several phylogenetic trees from primary data (e.g., DNA sequences originating from some species in X), and then constructing a single phylogenetic tree maximizing the "concordance" with the input trees. The obtained tree is our estimation of the species tree and, when the input trees are defined on overlapping-but not identical-sets of labels, is called "supertree." In this paper, we focus on two problems that are central when combining phylogenetic trees into a supertree: the compatibility and the strict compatibility problems for unrooted phylogenetic trees. These problems are strongly related, respectively, to the notions of "containing as a minor" and "containing as a topological minor" in the graph community. Both problems are known to be fixed parameter tractable in the number of input trees k, by using their expressibility in monadic second-order logic and a reduction to graphs of bounded treewidth. Motivated by the fact that the dependency on k of these algorithms is prohibitively large, we give the first explicit dynamic programming algorithms for solving these problems, both running in time [Formula: see text], where n is the total size of the input.
Oknina, L B; Kuptsova, S V; Romanov, A S; Masherov, E L; Kuznetsova, O A; Sharova, E V
2012-01-01
The aim of the present pilot study was to analyse changes in short EEG segments recorded from 32 sites while healthy subjects listened to musical melodies, as a function of logical (recognition) and emotional (pleasant/unpleasant) evaluations of the melodies. To this end, changes in event-related synchronization/desynchronization, and in the wavelet synchrony of EEG responses, were compared in 31 healthy subjects aged 18 to 60 years. It is shown that during logical evaluation of music, recognition of a melody is accompanied by event-related desynchronization in the left fronto-parietal-temporal area. During emotional evaluation of a melody, pleasant melodies are characterized by event-related synchronization in the left fronto-temporal area, unpleasant melodies by desynchronization in the temporal area, and melodies evoking no emotional response by desynchronization in the occipital area. Analysis of wavelet synchrony, which characterizes reactive changes in the interaction of cortical zones, revealed that the clearest topographic differences relate to the type of processing of the heard music: logical (recognized/not recognized) or emotional (pleasant/unpleasant). Emotional evaluation was associated mainly with changes in interhemispheric connections between associative cortical zones (central, frontal, temporal), whereas logical evaluation involved inter- and intrahemispheric connections of the projection zones of the auditory analyser (temporal area). It is supposed that the observed event-related synchronization/desynchronization most likely reflects an activation component of the evaluation of musical fragments, whereas the wavelet analysis provides information on the character of processing of the musical stimulus.
NASA Astrophysics Data System (ADS)
Stefaneas, Petros; Vandoulakis, Ioannis M.
2015-12-01
This paper outlines a logical representation of certain aspects of the process of mathematical proving that are important from the point of view of Artificial Intelligence. Our starting-point is the concept of proof-event or proving, introduced by Goguen, instead of the traditional concept of mathematical proof. The reason behind this choice is that in contrast to the traditional static concept of mathematical proof, proof-events are understood as processes, which enables their use in Artificial Intelligence in such contexts, in which problem-solving procedures and strategies are studied. We represent proof-events as problem-centered spatio-temporal processes by means of the language of the calculus of events, which captures adequately certain temporal aspects of proof-events (i.e. that they have history and form sequences of proof-events evolving in time). Further, we suggest a "loose" semantics for the proof-events, by means of Kolmogorov's calculus of problems. Finally, we expose the intended interpretations for our logical model from the fields of automated theorem-proving and Web-based collective proving.
Improvements to Earthquake Location with a Fuzzy Logic Approach
NASA Astrophysics Data System (ADS)
Gökalp, Hüseyin
2018-01-01
In this study, improvements to the earthquake location method were investigated using a fuzzy logic approach proposed by Lin and Sanford (Bull Seismol Soc Am 91:82-93, 2001). The method has certain advantages compared to the inverse methods in terms of eliminating the uncertainties of arrival times and reading errors. In this study, adopting this approach, epicentral locations were determined based on the results of a fuzzy logic space concerning the uncertainties in the velocity models. To map the uncertainties in arrival times into the fuzzy logic space, a trapezoidal membership function was constructed directly from the travel time difference between the two stations for the P- and S-arrival times, rather than from P- and S-wave velocity models, to eliminate the need for information about the velocity structure of the study area. The results showed that this method worked most effectively when earthquakes occurred away from a network or when the arrival time data contained phase reading errors. In this study, to resolve the problems related to determining the epicentral locations of the events, a forward modeling method like the grid search technique was used by applying different logical operations (i.e., intersection, union, and their combination) within the fuzzy logic approach. The locations of the events depended on the fuzzy logic outputs obtained by searching over a gridded region. Defuzzifying only the grid points with a membership value of 1, obtained by normalizing the maximum fuzzy output values, resulted in more reliable epicentral locations for the earthquakes than the other approaches. In addition, throughout the process, the center-of-gravity method was used as the defuzzification operation.
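The trapezoidal membership function mentioned above maps an observed quantity (here, an S-P travel-time difference) onto a degree of membership in [0, 1]. A minimal sketch follows; the corner parameters and the example times are illustrative, not values from the study.

```python
def trapezoid(x, a, b, c, d):
    """Membership: 0 outside [a, d], rising on [a, b], 1 on [b, c], falling on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical observed S-P difference near 4.0 s, with a plausible range of
# [3.5, 4.5] s and a full uncertainty range of [3.0, 5.0] s:
print(trapezoid(4.00, 3.0, 3.5, 4.5, 5.0))  # 1.0  (fully consistent)
print(trapezoid(3.25, 3.0, 3.5, 4.5, 5.0))  # 0.5  (partially consistent)
print(trapezoid(2.90, 3.0, 3.5, 4.5, 5.0))  # 0.0  (inconsistent)
```

Evaluating such a membership at every grid point, then combining stations with fuzzy intersection or union, yields the fuzzy output surface that is defuzzified into an epicentral location.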
2009-10-01
SPACES AND FUZZY LOGIC FOR PASSIVE SPEECH INTERPRETATION Katie T. McConky Research Scientist CUBRC Buffalo, NY, U.S.A. mcconky@cubrc.org...ORGANIZATION NAME(S) AND ADDRESS(ES) CUBRC ,4455 Genesee Street, Suite 106,Buffalo,NY,14225 8. PERFORMING ORGANIZATION REPORT NUMBER 9. SPONSORING
Three ambitious (and rather unorthodox) assignments for the field of biodiversity genetics
Avise, John C.
2008-01-01
The field of molecular genetics has many roles in biodiversity assessment and conservation. I summarize three of those standard roles and propose logical extensions of each. First, many biologists suppose that a comprehensive picture of the Tree of Life will soon emerge from multilocus DNA sequence data interpreted in concert with fossils and other evidence. If nonreticulate trees are indeed valid metaphors for life's history, then a well dated global phylogeny will offer an opportunity to erect a universally standardized scheme of biological classification. If life's history proves to be somewhat reticulate, a web-like phylogenetic pattern should become evident and will offer opportunities to reevaluate the fundamental nature of evolutionary processes. Second, extensive networks of wildlife sanctuaries offer some hope for shepherding appreciable biodiversity through the ongoing extinction crisis, and molecular genetics can assist in park design by helping to identify key species, historically important biotic areas, and biodiversity hotspots. An opportunity centers on the concept of Pleistocene Parks that could protect “legacy biotas” in much the same way that traditional national parks preserve special geological features and historical landmarks honor legacy events in human affairs. Third, genetic perspectives have become an integral part of many focused conservation efforts by unveiling ecological, behavioral, or evolutionary phenomena relevant to population management. They also can open opportunities to educate the public about the many intellectual gifts and aesthetic marvels of the natural world. PMID:18695224
Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae
NASA Technical Reports Server (NTRS)
Rosu, Grigore; Havelund, Klaus
2001-01-01
The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, its constant depending on the size of the LTL formula. The memory needed is constant, also depending on the size of the formula.
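The dynamic programming idea can be illustrated directly: process the finite trace backwards, carrying for each subformula its truth value at the next position, so memory stays constant in the trace length. The sketch below hardcodes one common formula, "always (p implies eventually q)", rather than generating code from an arbitrary LTL formula as the paper's algorithm does; the trace and proposition names are illustrative.

```python
def check_always_implies_eventually(trace, p, q):
    """[](p -> <>q) on a finite trace of event sets, via one backward pass."""
    ev_q = False   # does <>q hold from the current position onward?
    ok = True      # does [](p -> <>q) hold from the current position onward?
    for events in reversed(trace):
        ev_q = q in events or ev_q              # <>q at i = q@i or <>q at i+1
        ok = ((p not in events) or ev_q) and ok  # []f at i = f@i and []f at i+1
    return ok

trace = [{"request"}, set(), {"ack"}, {"request"}, {"ack"}]
print(check_always_implies_eventually(trace, "request", "ack"))      # True
print(check_always_implies_eventually(trace[:4], "request", "ack"))  # False
```

Each pass is linear in the trace and uses two booleans per temporal subformula, matching the linear-time, constant-memory behavior the abstract describes.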
75 FR 66125 - Notice of Public Meeting and Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-27
...--The National Christmas Tree Lighting and the subsequent 23 day event. SUMMARY: The National Park Service is seeking public comments and suggestions on the planning of the 2010 National Christmas Tree... Christmas Tree Lighting and the subsequent 23 day event, which opens on December 9, 2010, on the Ellipse...
NASA Technical Reports Server (NTRS)
Strahler, Alan H.; Jupp, David L. B.
1990-01-01
Geometric-optical discrete-element mathematical models for forest canopies have been developed using the Boolean logic and models of Serra. The geometric-optical approach is considered to be particularly well suited to describing the bidirectional reflectance of forest woodland canopies, where the concentration of leaf material within crowns and the resulting between-tree gaps make plane-parallel, radiative-transfer models inappropriate. The approach leads to invertible formulations, in which the spatial and directional variance provides the means for remote estimation of tree crown size, shape, and total cover from remotely sensed imagery.
Hierarchical Poly Tree Configurations for the Solution of Dynamically Refined Finite Element Models
NASA Technical Reports Server (NTRS)
Gute, G. D.; Padovan, J.
1993-01-01
This paper demonstrates how a multilevel substructuring technique, called the Hierarchical Poly Tree (HPT), can be used to integrate a localized mesh refinement into the original finite element model more efficiently. The optimal HPT configurations for solving isoparametrically square h-, p-, and hp-extensions on single and multiprocessor computers are derived. In addition, the reduced number of stiffness matrix elements that must be stored when employing this type of solution strategy is quantified. Moreover, the HPT inherently provides localized 'error-trapping' and a logical, efficient means with which to isolate physically anomalous and analytically singular behavior.
Dataset for forensic analysis of B-tree file system.
Wani, Mohamad Ahtisham; Bhat, Wasim Ahmad
2018-06-01
Since the B-tree file system (Btrfs) is set to become the de facto standard file system on Linux (and Linux-based) operating systems, a Btrfs dataset for forensic analysis is of great interest and immense value to the forensic community. This article presents a novel dataset for forensic analysis of Btrfs that was collected using a proposed data-recovery procedure. The dataset identifies various generalized and common file system layouts and operations, specific node-balancing mechanisms triggered, logical addresses of various data structures, on-disk records, recovered data such as directory entries and extent data from leaf and internal nodes, and the percentage of data recovered.
NASA Astrophysics Data System (ADS)
Carlyle-Moses, D. E.; Schooling, J. T.
2014-12-01
Urban tree canopy processes affect the volume and biogeochemistry of inputs to the hydrological cycle in cities. We studied stemflow from 37 isolated deciduous trees in an urban park in Kamloops, British Columbia which has a semi-arid climate dominated by small precipitation events. Precipitation and stemflow were measured on an event basis from June 12, 2012 to November 3, 2013. To clarify the effect of canopy traits on stemflow thresholds, rates, yields, percent, and funneling ratios, we analyzed branch angles, bark roughness, tree size, cover, leaf size, and branch and leader counts. High branch angles promoted stemflow in all trees, while bark roughness influenced stemflow differently for single- and multi-leader trees. The association between stemflow and numerous leaders deserves further study. Columnar-form trees often partitioned a large percentage of precipitation into stemflow, with event-scale values as high as 27.9 % recorded for an Armstrong Freeman Maple (Acer x freemanii 'Armstrong'). Under growing-season conditions funneling ratios as high as 196.9 were derived for an American Beech (Fagus grandifolia) individual. Among meteorological variables, rain depth was strongly correlated with stemflow yields; intra-storm break duration, rainfall intensity, rainfall inclination, wind speed, and vapour pressure deficit also played roles. Greater stemflow was associated with leafless canopies and with rain or mixed events versus snow. Results can inform climate-sensitive selection and siting of urban trees towards integrated rainwater management. For example, previous studies suggest that the reduction in storm-water generation by urban trees is accomplished through canopy interception loss alone. 
However, trees that partition large quantities of canopy drainage as stemflow to the base of their trunks, where it has the potential to infiltrate into the soil rather than fall on impervious surfaces as throughfall, may assist in reducing stormwater flow.
Coordination Logic for Repulsive Resolution Maneuvers
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony J.; Munoz, Cesar A.; Dutle, Aaron M.
2016-01-01
This paper presents an algorithm for determining the direction an aircraft should maneuver in the event of a potential conflict with another aircraft. The algorithm is implicitly coordinated, meaning that with perfectly reliable computations and information, it will in- dependently provide directional information that is guaranteed to be coordinated without any additional information exchange or direct communication. The logic is inspired by the logic of TCAS II, the airborne system designed to reduce the risk of mid-air collisions between aircraft. TCAS II provides pilots with only vertical resolution advice, while the proposed algorithm, using a similar logic, provides implicitly coordinated vertical and horizontal directional advice.
Redundant single event upset suppression system
Hoff, James R.
2006-04-04
CMOS transistors are configured to operate as either a redundant, SEU-tolerant, positive-logic, cross-coupled NOR-gate SR flip-flop or a redundant, SEU-tolerant, negative-logic, cross-coupled NAND-gate SR flip-flop. The register can operate as a memory, and further as a memory that can overcome the effects of radiation. As an SR flip-flop, the invention can be altered into any known type of latch or flip-flop by the application of external logic, thereby extending radiation tolerance to devices previously incapable of it. Numerous registers can be logically connected and replicated, thereby being electronically configured to operate as a redundant circuit.
Monitoring Java Programs with Java PathExplorer
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)
2001-01-01
We present recent work on the development of Java PathExplorer (JPAX), a tool for monitoring the execution of Java programs. JPAX can be used during program testing to gain increased information about program executions, and can potentially also be applied during operation to survey safety-critical systems. The tool facilitates automated instrumentation of a program's bytecode, which will then emit events to an observer during execution. The observer checks the events against user-provided high-level requirement specifications, for example temporal logic formulae, and against lower-level error-detection procedures, for example concurrency-related algorithms such as deadlock and data-race detection. High-level requirement specifications, together with their underlying logics, are defined in the Maude rewriting logic, and can then either be checked directly using the Maude rewriting engine, or first be translated to efficient data structures and then checked in Java.
Gating Out Misinformation: Can Young Children Follow Instructions to Ignore False Information?
Schaaf, Jennifer M; Bederian-Gardner, Daniel; Goodman, Gail S
2015-08-01
The current study investigated the effects of misinformation on children's memory reports after practice with the logic-of-opposition instruction at time of test. Four- and 6-year-old children participated in a play event in Session 1. During a two-week delay, parents presented their children with either misinformation or correct information about the play event. Prior to a memory interview in Session 2, some misled children were given a developmentally appropriate logic-of-opposition instruction to not report information provided by their parents. Results indicated that children were misled by the incorrect information, but that the logic-of-opposition instruction aided in the children's retrieval of the original memory, particularly for the 6-year-olds. Implications of the results for memory malleability and social demand effects in children are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
La Spina, Sylvie; de Cannière, Charles; Molenberg, Jean-Marc; Vincke, Caroline; Deman, Déborah; Grégoire, Jean-Claude
2010-05-01
Climate change tends to induce more frequent abiotic and biotic extreme events, with large impacts on tree vitality. Weakened trees are then more susceptible to secondary insect outbreaks, as happened in Belgium in the early 2000s: after an early frost event, attacks by secondary Scolytine ambrosia beetles were observed on beech trees. In this study, we test whether a combination of stresses, i.e. a soil water deficit preceding an early frost, could render trees more attractive to beetles. An experimental study was set up in autumn 2008. Two parcels of a beech forest were covered with plastic tents to induce water stress by rain interception. The parcels were surrounded by 2-meter-deep trenches to prevent water supply by streaming. Soil water content and several indicators of tree water use (sap flow, predawn leaf water potential, tree radial growth) were monitored. In autumn 2010, artificial frost injuries will be inflicted on trees using dry ice. Tree attractiveness to Scolytine insects, and the success of insect colonization, will then be studied. The poster will focus on the experimental setup and first results (impacts of soil water deficit on trees).
An earthquake rate forecast for Europe based on smoothed seismicity and smoothed fault contribution
NASA Astrophysics Data System (ADS)
Hiemer, Stefan; Woessner, Jochen; Basili, Roberto; Wiemer, Stefan
2013-04-01
The main objective of project SHARE (Seismic Hazard Harmonization in Europe) is to develop a community-based seismic hazard model for the Euro-Mediterranean region. The logic tree of earthquake rupture forecasts comprises several methodologies, including smoothed-seismicity approaches. Smoothed seismicity represents an alternative concept to express the degree of spatial stationarity of seismicity and provides results that are more objective, reproducible, and testable. Nonetheless, the smoothed-seismicity approach suffers from the common drawback of being generally based on earthquake catalogs alone, i.e. the wealth of knowledge from geology is completely ignored. We present a model that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults and subduction zones. The result is mainly driven by the data, being independent of subjective delineation of seismic source zones. The core parts of our model are two distinct location probability densities: the first is computed by smoothing past seismicity (using variable kernel smoothing to account for varying data density); the second is obtained by smoothing fault moment-rate contributions. The fault moment rates are calculated by summing the moment rate of each fault patch on a fully parameterized and discretized fault as available from the SHARE fault database. We assume that the regional frequency-magnitude distribution of the entire study area is well known and estimate the a- and b-value of a truncated Gutenberg-Richter magnitude distribution with a maximum-likelihood approach that considers the spatial and temporal completeness history of the seismic catalog. The two location probability densities are linearly weighted as a function of magnitude, assuming that (1) the occurrence of past seismicity is a good proxy for the occurrence of future seismicity and (2) future large-magnitude events are more likely to occur in the vicinity of known faults.
Consequently, the underlying location density of our model depends on the magnitude. We scale the density with the estimated a-value in order to construct a forecast that specifies the earthquake rate in each longitude-latitude-magnitude bin. The model is intended to be one branch of SHARE's logic tree of rupture forecasts and provides rates of events in the magnitude range of 5 <= m <= 8.5 for the entire region of interest and is suitable for comparison with other long-term models in the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP).
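The a- and b-value estimation step can be illustrated with the classic Aki/Utsu maximum-likelihood estimator for a truncated Gutenberg-Richter law. This is a deliberately simplified sketch: it assumes a single completeness magnitude and ignores the spatio-temporal completeness history that the SHARE model accounts for.

```python
import math

def gutenberg_richter_mle(magnitudes, m_c, bin_width=0.1):
    """Estimate (a, b) for log10 N(>=m) = a - b*m from magnitudes >= m_c.

    Uses the Aki (1965) maximum-likelihood estimator with Utsu's
    correction for magnitudes binned to `bin_width` (pass 0.0 for
    continuous magnitudes).
    """
    mags = [m for m in magnitudes if m >= m_c]
    if not mags:
        raise ValueError("no events above completeness magnitude")
    mean_m = sum(mags) / len(mags)
    # b-value from the mean magnitude above completeness.
    b = math.log10(math.e) / (mean_m - (m_c - bin_width / 2.0))
    # a-value fixed so the fitted curve reproduces the observed count at m_c.
    a = math.log10(len(mags)) + b * m_c
    return a, b
```

A forecast rate per longitude-latitude-magnitude bin would then scale the location density by the rate implied by these (a, b) values, as the abstract describes.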
Using Pipelined XNOR Logic to Reduce SEU Risks in State Machines
NASA Technical Reports Server (NTRS)
Le, Martin; Zheng, Xin; Katanyoutant, Sunant
2008-01-01
Single-event upsets (SEUs) pose great threats to avionic systems' state-machine control logic, which is frequently used to control sequences of events and to qualify protocols. The risks of SEUs manifest in two ways: (a) the state machine's state information is changed, causing the state machine to transition unexpectedly to another state; (b) because of the asynchronous nature of an SEU, the state machine's state registers become metastable, consequently causing any combinational logic associated with the metastable registers to malfunction temporarily. Effect (a) can be mitigated with methods such as triple-modular redundancy (TMR). However, effect (b) cannot be eliminated and can degrade the effectiveness of any mitigation of effect (a). Although there is no way to completely eliminate the risk of SEU-induced errors, the risk can be made very small by combining very fast state-machine logic with error-detection logic. Therefore, one of the two main elements of the present method is to design the fastest state-machine logic circuitry by basing it on the fastest generic state-machine design: the one-hot state machine. The other main element is to design fast error-detection logic circuitry and to optimize it for implementation in a field-programmable gate array (FPGA) architecture. In the resulting design, the one-hot state machine is fitted with a multiple-input XNOR gate for detection of illegal states. The XNOR gate is implemented with lookup tables and with pipelines for high speed. In this method, the task of designing all the logic must be performed manually because no currently available logic-synthesis software tool can produce optimal solutions of design problems of this type.
However, some assistance is provided by a script, written for this purpose in the Python language (an object-oriented interpretive computer language) to automatically generate hardware description language (HDL) code from state-transition rules.
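The illegal-state check itself is easy to model in software. A minimal sketch, assuming the multiple-input XNOR reduces to a complemented parity over the one-hot register (my reading of the design, not the authors' HDL): a legal one-hot state has exactly one bit set (odd parity), so any single-bit upset leaves zero or two bits set (even parity) and raises the flag.

```python
def xnor_error_flag(state_bits):
    """Return 1 if the one-hot state register looks corrupted (even parity)."""
    parity = 0
    for bit in state_bits:
        parity ^= bit            # XOR reduction across the register
    return parity ^ 1            # final inversion makes it an XNOR

def upset(state_bits, index):
    """Model an SEU: flip the flip-flop at `index`."""
    flipped = list(state_bits)
    flipped[index] ^= 1
    return flipped

legal = [0, 0, 1, 0]             # a valid one-hot state
assert xnor_error_flag(legal) == 0
assert xnor_error_flag(upset(legal, 0)) == 1   # two bits hot -> flagged
assert xnor_error_flag(upset(legal, 2)) == 1   # all bits clear -> flagged
```

Note that a parity reduction flags every single-bit upset but cannot distinguish a legal state from any other odd-popcount pattern; the FPGA implementation pipelines the reduction tree to keep the check off the critical path.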
Revenue Risk Modelling and Assessment on BOT Highway Project
NASA Astrophysics Data System (ADS)
Novianti, T.; Setyawan, H. Y.
2018-01-01
An infrastructure project delivered through a public-private partnership under a BOT (Build-Operate-Transfer) arrangement, such as a highway, is risky. Assessment of risk factors is therefore essential, as the project has a concession period and is influenced by macroeconomic factors over that period. In this study, pre-construction risks of a highway were examined using a Delphi method to create a space for offline expert discussions; a fault tree analysis to map the intuition of experts and to build a model from the underlying risk events; and fuzzy logic to interpret the linguistic data of the risk models. The losses of revenue due to tariff risk, traffic-volume risk, force majeure, and non-revenue events were then measured. The results showed that the loss of revenue caused by tariff risk was 10.5% of normal total revenue, the loss caused by traffic-volume risk was 21.0% of total revenue, the loss caused by force majeure was 12.2% of normal income, and the loss of income caused by non-revenue events was 6.9% of normal revenue. It was also found that traffic volume is the major risk of a highway project because it relates to customer preferences.
Heuristic and analytic processes in reasoning: an event-related potential study of belief bias.
Banks, Adrian P; Hope, Christopher
2014-03-01
Human reasoning involves both heuristic and analytic processes. This study of belief bias in relational reasoning investigated whether the two processes occur serially or in parallel. Participants evaluated the validity of problems in which the conclusions were either logically valid or invalid and either believable or unbelievable. Problems in which the conclusions presented a conflict between the logically valid response and the believable response elicited a more positive P3 than problems in which there was no conflict. This shows that P3 is influenced by the interaction of belief and logic rather than either of these factors on its own. These findings indicate that belief and logic influence reasoning at the same time, supporting models in which belief-based and logical evaluations occur in parallel but not theories in which belief-based heuristic evaluations precede logical analysis.
Piezoelectric-based self-powered electronic adjustable impulse switches
NASA Astrophysics Data System (ADS)
Rastegar, Jahangir; Kwok, Philip
2018-03-01
Novel piezoelectric-based self-powered impulse-detecting switches are presented. The switches are designed to detect shock-loading events resulting in acceleration or deceleration above prescribed levels and durations. The prescribed acceleration-level and duration thresholds are adjustable. The impulse switches are provided with electronic and logic circuitry, including false-trigger protection logic, to detect prescribed impulse events and to reject events such as high-amplitude but short-duration shocks, as well as transportation vibration and similar low-amplitude, relatively long-duration events. They can be mounted directly onto electronic circuit boards, thereby significantly simplifying the electrical and electronic circuitry, simplifying the assembly process and reducing total cost, significantly reducing the occupied volume, and in some applications eliminating the need for physical wiring to and from the impulse switches. The design of prototypes and testing under realistic conditions are presented.
Evaluating growth models: A case study using PrognosisBC
Peter Marshall; Pablo Parysow; Shadrach Akindele
2008-01-01
The ability of the PrognosisBC (Version 3.0) growth model to predict tree and stand growth was assessed against a series of remeasured permanent sample plots, including some which had been precommercially thinned. In addition, the model was evaluated for logical consistency across a variety of stand structures using simulation. By the end of the...
Classification and evaluation for forest sites in the Cumberland Mountains
Glendon W. Smalley
1984-01-01
This report classifies and evaluates forest sites in the Cumberland Mountains (fig. 1) for the management of several commercially valuable tree species. It provides forest managers with a land classification system that will enable them to subdivide forest land into logical segments (landtypes), allow them to rate productivity, and alert them to any limitations and...
The Two-By-Two Array: An Aid in Conceptualization and Problem Solving
ERIC Educational Resources Information Center
Eberhart, James
2004-01-01
The fields of mathematics, science, and engineering are replete with diagrams of many varieties. They range in nature from the Venn diagrams of symbolic logic to the Periodic Chart of the Elements; and from the fault trees of risk assessment to the flow charts used to describe laboratory procedures, industrial processes, and computer programs. All…
A tree-ring based fire history of riparian reserves in the Klamath Mountains.
Carl N. Skinner
2003-01-01
Surprisingly little fire history information is available for riparian environments despite their ecological importance. Thus, there is a great deal of uncertainty about the ecological role of fire in riparian environments. Considering the Mediterranean climate and the general pattern of frequent low-moderate severity fires in most vegetation types, it is logical to...
Klemen Novak; Martin de Luis; Miguel A. Saz; Luis A. Longares; Roberto Serrano-Notivoli; Josep Raventos; Katarina Cufar; Jozica Gricar; Alfredo Di Filippo; Gianluca Piovesan; Cyrille B.K. Rathgeber; Andreas Papadopoulos; Kevin T. Smith
2016-01-01
Climate predictions for the Mediterranean Basin include increased temperatures, decreased precipitation, and increased frequency of extreme climatic events (ECE). These conditions are associated with decreased tree growth and increased vulnerability to pests and diseases. The anatomy of tree rings responds to these environmental conditions. Quantitatively, the width of...
A diagnosis system using object-oriented fault tree models
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, F. A.
1990-01-01
Spaceborne computing systems must provide reliable, continuous operation for extended periods. Due to weight, power, and volume constraints, these systems must manage resources very effectively. A fault diagnosis algorithm is described which enables fast and flexible diagnoses in the dynamic distributed computing environments planned for future space missions. The algorithm uses a knowledge base that is easily changed and updated to reflect current system status. Augmented fault trees represented in an object-oriented form provide deep system knowledge that is easy to access and revise as a system changes. Given such a fault tree, a set of failure events that have occurred, and a set of failure events that have not occurred, this diagnosis system uses forward and backward chaining to propagate causal and temporal information about other failure events in the system being diagnosed. Once the system has established temporal and causal constraints, it reasons backward from heuristically selected failure events to find a set of basic failure events which are a likely cause of the occurrence of the top failure event in the fault tree. The diagnosis system has been implemented in Common Lisp using Flavors.
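The system knowledge encoded in a fault tree can be illustrated with a toy minimal-cut-set expansion: the sets of basic failure events whose joint occurrence causes the top event, which is exactly the kind of candidate set a backward-chaining diagnoser reasons toward. This sketch assumes a nested-tuple tree of AND/OR gates, far simpler than the object-oriented, temporally augmented trees of the paper.

```python
from itertools import product

def cut_sets(node):
    """Return the minimal cut sets of a fault tree.

    `node` is either a basic-event name (str) or a tuple
    ('and'|'or', child, child, ...). A cut set is a frozenset of basic
    events whose joint occurrence causes the top event.
    """
    if isinstance(node, str):
        return {frozenset([node])}
    gate, *children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == 'or':
        # Any child's cut set suffices.
        combined = set().union(*child_sets)
    else:
        # 'and': one cut set from every child must occur together.
        combined = {frozenset().union(*combo) for combo in product(*child_sets)}
    # Keep only minimal sets (drop any proper superset of another cut set).
    return {s for s in combined if not any(t < s for t in combined)}
```

For example, `('or', ('and', 'A', 'B'), 'A')` reduces to the single cut set `{A}`, since the subsuming set `{A, B}` is discarded; this is the same Boolean-reduction concern raised in the MIRAP program above.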
NASA Technical Reports Server (NTRS)
Ng, Tak-kwong (Inventor); Herath, Jeffrey A. (Inventor)
2010-01-01
An integrated system mitigates the effects of a single event upset (SEU) on a reprogrammable field programmable gate array (RFPGA). The system includes (i) a RFPGA having an internal configuration memory, and (ii) a memory for storing a configuration associated with the RFPGA. Logic circuitry programmed into the RFPGA and coupled to the memory reloads a portion of the configuration from the memory into the RFPGA's internal configuration memory at predetermined times. Additional SEU mitigation can be provided by logic circuitry on the RFPGA that monitors and maintains synchronized operation of the RFPGA's digital clock managers.
David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry Bond
2015-01-01
Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2014-01-01
The field of runtime verification has during the last decade seen a multitude of systems for monitoring event sequences (traces) emitted by a running system. The objective is to ensure correctness of a system by checking its execution traces against formal specifications representing requirements. A special challenge is data-parameterized events, where monitors have to keep track of the combination of control states as well as data constraints, relating events and the data they carry across time points. This poses a challenge with respect to the efficiency of monitors, as well as the expressiveness of logics. Data automata are a form of automata whose states are parameterized with data, supporting the monitoring of data-parameterized events. We describe the full details of a very simple API in the Scala programming language, an internal DSL (Domain-Specific Language), implementing data automata. The small implementation suggests a design pattern. Data automata allow transition conditions to refer to states other than the source state, and allow target states of transitions to be inlined, offering a temporal-logic-flavored notation. An embedding of a logic in a high-level language like Scala additionally allows monitors to be programmed using all of Scala's language constructs, offering the full flexibility of a programming language. The framework is demonstrated on an XML processing scenario previously addressed in related work.
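The flavor of such a data-parameterized monitor can be conveyed outside Scala as well. A minimal Python sketch, with a hypothetical acquire/release property of my own invention standing in for the paper's examples: each `acquire(t)` spawns a parameterized state tracking task `t`, and a surviving state at the end of the trace is a violation.

```python
class AcquireRelease:
    """Toy data-parameterized monitor: every acquire(t) must eventually
    be followed by release(t) within the finite trace."""

    def __init__(self):
        self.held = set()            # active parameterized states, one per task

    def event(self, name, task):
        if name == 'acquire':
            self.held.add(task)      # spawn a state carrying the datum `task`
        elif name == 'release':
            self.held.discard(task)  # the matching state is discharged

    def end(self):
        # At end of trace, any surviving parameterized state is a violation.
        return sorted(self.held)

monitor = AcquireRelease()
for e in [('acquire', 't1'), ('acquire', 't2'), ('release', 't1')]:
    monitor.event(*e)
assert monitor.end() == ['t2']       # t2 was never released
```

The point of the internal-DSL approach is that such monitors sit inside a full programming language, so arbitrary host-language code can appear in conditions and actions.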
Performance bounds on parallel self-initiating discrete-event
NASA Technical Reports Server (NTRS)
Nicol, David M.
1990-01-01
We consider the use of massively parallel architectures to execute discrete-event simulations of what are termed self-initiating models. A logical process in a self-initiating model schedules its own state re-evaluation times, independently of any other logical process, and sends its new state to other logical processes following the re-evaluation. Our interest is in the effects of that communication on synchronization. We consider the performance of various synchronization protocols by deriving upper and lower bounds on optimal performance, upper bounds on Time Warp's performance, and lower bounds on the performance of a new conservative protocol. The analysis of Time Warp includes the overhead costs of state-saving and rollback. The analysis points out sufficient conditions for the conservative protocol to outperform Time Warp, and quantifies the sensitivity of performance to message fan-out, lookahead ability, and the probability distributions underlying the simulation.
Simulation of rare events in quantum error correction
NASA Astrophysics Data System (ADS)
Bravyi, Sergey; Vargo, Alexander
2013-12-01
We consider the problem of calculating the logical error probability for a stabilizer quantum code subject to random Pauli errors. To access the regime of large code distances where logical errors are extremely unlikely we adopt the splitting method widely used in Monte Carlo simulations of rare events and Bennett's acceptance ratio method for estimating the free energy difference between two canonical ensembles. To illustrate the power of these methods in the context of error correction, we calculate the logical error probability P_L for the two-dimensional surface code on a square lattice with a pair of holes for all code distances d ≤ 20 and all error rates p below the fault-tolerance threshold. Our numerical results confirm the expected exponential decay P_L ~ exp[-α(p)d] and provide a simple fitting formula for the decay rate α(p). Both noiseless and noisy syndrome readout circuits are considered.
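Extracting the decay rate α(p) from simulated data is a straightforward line fit on a log scale: under P_L ~ exp[-α(p)d], a least-squares fit of ln P_L against d has slope -α(p). A minimal sketch, with illustrative numbers rather than the paper's surface-code results:

```python
import math

def fit_decay_rate(distances, probabilities):
    """Fit ln P_L = c - alpha * d by least squares; return (alpha, c)."""
    xs = list(distances)
    ys = [math.log(p) for p in probabilities]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return -slope, mean_y - slope * mean_x
```

With logical error probabilities this small, the hard part is not the fit but obtaining the P_L samples at all, which is exactly what the splitting method addresses.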
Event Classification and Identification Based on the Characteristic Ellipsoid of Phasor Measurement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Jian; Diao, Ruisheng; Makarov, Yuri V.
2011-09-23
In this paper, a method to classify and identify power system events based on the characteristic ellipsoid of phasor measurements is presented. The decision tree technique is used to perform the event classification and identification. Event types, event locations, and clearance times are identified by decision trees based on the indices of the characteristic ellipsoid. A sufficiently large number of transient events were simulated on the New England 10-machine, 39-bus system under different system configurations. Transient simulations accounting for different event types, clearance times, and various locations were conducted to emulate phasor measurements. Bus voltage magnitudes and recorded reactive and active power flows are used to build the characteristic ellipsoid. The volume, eccentricity, center, and projection of the longest axis in the parameter-space coordinates of the characteristic ellipsoids are used to classify and identify events. Results demonstrate that the characteristic ellipsoid and the decision tree are capable of detecting the event type, location, and clearance time with very high accuracy.
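The ellipsoid indices can be sketched for the two-dimensional case, where the eigen-decomposition of the measurement covariance has a closed form. The channel choice (say, one voltage magnitude and one power flow) and the window are my assumptions; the paper works in higher dimensions, where the same quantities come from a full eigen-decomposition.

```python
import math

def characteristic_ellipse(xs, ys):
    """Indices of the characteristic ellipse of a two-channel window:
    center, area (the 2-D analogue of volume), eccentricity, and the
    orientation of the longest axis."""
    n = len(xs)
    cx, cy = sum(xs) / n, sum(ys) / n
    sxx = sum((x - cx) ** 2 for x in xs) / n
    syy = sum((y - cy) ** 2 for y in ys) / n
    sxy = sum((x - cx) * (y - cy) for x, y in zip(xs, ys)) / n
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] in closed form.
    mean = (sxx + syy) / 2.0
    spread = math.hypot((sxx - syy) / 2.0, sxy)
    lam1, lam2 = mean + spread, mean - spread        # lam1 >= lam2 >= 0
    a = math.sqrt(lam1)                              # semi-major axis
    b = math.sqrt(max(lam2, 0.0))                    # semi-minor axis
    return {
        'center': (cx, cy),
        'area': math.pi * a * b,
        'eccentricity': math.sqrt(1.0 - (b / a) ** 2) if a > 0 else 0.0,
        'long_axis_angle': 0.5 * math.atan2(2.0 * sxy, sxx - syy),
    }
```

A quiescent system yields a small, nearly circular ellipse, while a disturbance stretches it along the most-excited combination of channels; the decision tree then splits on these indices.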
Upper-Bound Estimates Of SEU in CMOS
NASA Technical Reports Server (NTRS)
Edmonds, Larry D.
1990-01-01
The theory of single-event upsets (SEUs) (changes in logic state caused by energetic charged subatomic particles) in complementary metal oxide/semiconductor (CMOS) logic devices is extended to provide upper-bound estimates of SEU rates when only limited experimental information is available and the configuration and dimensions of the SEU-sensitive regions of the devices are unknown. The approach is based partly on the chord-length-distribution method.
ERIC Educational Resources Information Center
Zarcone, Alessandra; Padó, Sebastian; Lenci, Alessandro
2014-01-01
Logical metonymy resolution ("begin a book" → "begin reading a book" or "begin writing a book") has traditionally been explained either through complex lexical entries (qualia structures) or through the integration of the implicit event via post-lexical access to world knowledge. We propose that recent work within the…
A Black Swan and Sub-continental Scale Dynamics in Humid, Late-Holocene Broadleaf Forests
NASA Astrophysics Data System (ADS)
Pederson, N.; Dyer, J.; McEwan, R.; Hessl, A. E.; Mock, C. J.; Orwig, D.; Rieder, H. E.; Cook, B. I.
2012-12-01
In humid regions with dense broadleaf-dominated forests, where gap dynamics is the prevailing disturbance regime, paleoecological evidence shows regional-scale changes in forest composition associated with climatic change. To investigate the potential for regional events in late-Holocene forests, we use tree-ring data from 76 populations covering 840,000 km2 and 5.3k tree recruitment dates spanning 1.4 million km2 in the eastern US to look for simultaneous forest dynamics across a humid region. We compare regional forest dynamics with an independent, annually resolved tree-ring record of hydroclimate to examine whether climate dynamics might drive forest dynamics in this humid region. In forests where light availability is an important limitation on tree recruitment, we document a pulse of tree recruitment during the mid- to late 1600s across the eastern US. This pulse, which can be interpreted as large-scale canopy opening, occurred during an era that multiple proxies indicate was an extended drought between two intense pluvials. Principal component analysis of the 76 populations indicates a step-change increase in average ring width during the late 1770s, resembling a potential canopy accession event over 42,800 km2 of the southeastern US. Growth-release analysis of populations loading strongly on this eigenvector indicates severe canopy disturbance from 1775-1779 that peaked in 1776. The 1776 event follows a period with extended droughts and a severe large-scale frost event. We hypothesize that these climatic events led to elevated tree mortality in the late 1770s and canopy accession for understory trees. Superposed epoch analysis reveals that spikes of elevated canopy disturbance from 1685-1850 CE are significantly associated with drought. Extreme value theory statistics indicate that the 1776 event lies beyond the 99.9 quantile and nearly 7 sigma above the 1685-1850 mean rate of disturbance.
The time-series of canopy disturbance from 1685-1850 is so poorly described by a Gaussian distribution that it can be considered 'heavy tailed'. Preliminary results show that disturbance events that affect >3-5% of the trees in our dataset occur approximately every 200 years. The most extreme rates (>5%) occur approximately every 500-1000 years. These statistics indicate that the 1775-1779 heavy-tail event can also be considered a 'Black Swan', the rare event that has the potential to alter a system's trajectory further than common events. Our results challenge traditional views regarding characteristic disturbance regime in humid temperate forests, and speak to the importance of punctuated climatic events in shaping forest structure for centuries. Such an understanding is critical given the potential of more frequent extreme climatic events in the future.
Kamneva, Olga K; Rosenberg, Noah A
2017-01-01
Hybridization events generate reticulate species relationships, giving rise to species networks rather than species trees. We report a comparative study of consensus, maximum parsimony, and maximum likelihood methods of species network reconstruction using gene trees simulated assuming a known species history. We evaluate the role of the divergence time between species involved in a hybridization event, the relative contributions of the hybridizing species, and the error in gene tree estimation. When gene tree discordance is mostly due to hybridization and not due to incomplete lineage sorting (ILS), most of the methods can detect even highly skewed hybridization events between highly divergent species. For recent divergences between hybridizing species, when the influence of ILS is sufficiently high, likelihood methods outperform parsimony and consensus methods, which erroneously identify extra hybridizations. The more sophisticated likelihood methods, however, are affected by gene tree errors to a greater extent than are consensus and parsimony. PMID:28469378
Efficient Exploration of the Space of Reconciled Gene Trees
Szöllősi, Gergely J.; Rosikiewicz, Wojciech; Boussau, Bastien; Tannier, Eric; Daubin, Vincent
2013-01-01
Gene trees record the combination of gene-level events, such as duplication, transfer and loss (DTL), and species-level events, such as speciation and extinction. Gene tree–species tree reconciliation methods model these processes by drawing gene trees into the species tree using a series of gene and species-level events. The reconstruction of gene trees based on sequence alone almost always involves choosing between statistically equivalent or weakly distinguishable relationships that could be much better resolved based on a putative species tree. To exploit this potential for accurate reconstruction of gene trees, the space of reconciled gene trees must be explored according to a joint model of sequence evolution and gene tree–species tree reconciliation. Here we present amalgamated likelihood estimation (ALE), a probabilistic approach to exhaustively explore all reconciled gene trees that can be amalgamated as a combination of clades observed in a sample of gene trees. We implement the ALE approach in the context of a reconciliation model (Szöllősi et al. 2013), which allows for the DTL of genes. We use ALE to efficiently approximate the sum of the joint likelihood over amalgamations and to find the reconciled gene tree that maximizes the joint likelihood among all such trees. We demonstrate using simulations that gene trees reconstructed using the joint likelihood are substantially more accurate than those reconstructed using sequence alone. Using realistic gene tree topologies, branch lengths, and alignment sizes, we demonstrate that ALE produces more accurate gene trees even if the model of sequence evolution is greatly simplified. Finally, examining 1099 gene families from 36 cyanobacterial genomes we find that joint likelihood-based inference results in a striking reduction in apparent phylogenetic discord, with 24%, 59%, and 46% reductions, respectively, in the mean numbers of duplications, transfers, and losses per gene family.
The open source implementation of ALE is available from https://github.com/ssolo/ALE.git. [amalgamation; gene tree reconciliation; gene tree reconstruction; lateral gene transfer; phylogeny.] PMID:23925510
Efficient algorithms for dilated mappings of binary trees
NASA Technical Reports Server (NTRS)
Iqbal, M. Ashraf
1990-01-01
The problem addressed is that of finding a 1-1 mapping of the vertices of a binary tree onto those of a target binary tree such that the son of a node in the first binary tree is mapped onto a descendent of the image of that node in the second binary tree. There are two natural measures of the cost of this mapping. The first is the dilation cost, i.e., the maximum distance in the target binary tree between the images of vertices that are adjacent in the original tree. The other measure, the expansion cost, is defined as the number of extra nodes/edges that must be added to the target binary tree in order to ensure a 1-1 mapping. An efficient algorithm to find a mapping of one binary tree onto another is described. It is shown that it is possible to minimize one cost of the mapping at the expense of the other. This problem arises when designing pipelined arithmetic logic units (ALUs) for special-purpose computers. The pipeline is composed of ALU chips connected in the form of a binary tree. The operands to the pipeline can be supplied to the leaf nodes of the binary tree, which then process them and pass the results up to their parents. The final result is available at the root. As each new application may require a distinct nesting of operations, it is useful to be able to find a good mapping of a new binary tree onto the existing ALU tree. Another problem arises if every distinct required binary tree is known beforehand. Here it is useful to hardwire the pipeline in the form of a minimal supertree that contains all required binary trees.
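The dilation cost described above is straightforward to compute for a given mapping: for every edge of the source tree, measure the distance in the target tree between the images of its endpoints, and take the maximum. A small sketch with hypothetical trees and a hand-picked mapping (this only evaluates a mapping; it is not the paper's algorithm for finding good mappings):

```python
from collections import deque

def tree_distance(adj, u, v):
    """BFS distance between two nodes of an undirected tree."""
    seen, frontier = {u}, deque([(u, 0)])
    while frontier:
        node, d = frontier.popleft()
        if node == v:
            return d
        for w in adj[node]:
            if w not in seen:
                seen.add(w)
                frontier.append((w, d + 1))
    raise ValueError("nodes not connected")

def dilation_cost(source_edges, target_adj, mapping):
    """Max target-tree distance between images of adjacent source vertices."""
    return max(tree_distance(target_adj, mapping[a], mapping[b])
               for a, b in source_edges)

# Toy example: a 3-node source path mapped into a small target tree.
source_edges = [("a", "b"), ("b", "c")]
target_adj = {1: [2, 3], 2: [1, 4, 5], 3: [1], 4: [2], 5: [2]}
mapping = {"a": 1, "b": 4, "c": 2}

print(dilation_cost(source_edges, target_adj, mapping))  # distance(1,4)=2, distance(4,2)=1 -> 2
```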
Using Histories to Implement Atomic Objects
NASA Technical Reports Server (NTRS)
Ng, Pui
1987-01-01
In this paper we describe an approach of implementing atomicity. Atomicity requires that computations appear to be all-or-nothing and executed in a serialization order. The approach we describe has three characteristics. First, it utilizes the semantics of an application to improve concurrency. Second, it reduces the complexity of application-dependent synchronization code by analyzing the process of writing it. In fact, the process can be automated with logic programming. Third, our approach hides the protocol used to arrive at a serialization order from the applications. As a result, different protocols can be used without affecting the applications. Our approach uses a history tree abstraction. The history tree captures the ordering relationship among concurrent computations. By determining what types of computations exist in the history tree and their parameters, a computation can determine whether it can proceed.
NASA Astrophysics Data System (ADS)
Caldeira, M. C.; Lobo-do-Vale, R.; Lecomte, X.; David, T. S.; Pinto, J. G.; Bugalho, M. N.; Werner, C.
2016-12-01
Extreme droughts and plant invasions are major drivers of global change that can critically affect ecosystem functioning. Shrub encroachment is increasing in many regions worldwide, and extreme events are projected to increase in frequency and intensity, namely in the Mediterranean region. Nevertheless, little is known about how these drivers may interact and affect ecosystem functioning and resilience. Using a manipulative shrub removal experiment and the co-occurrence of an extreme drought event in a Mediterranean oak woodland, we show that the combination of native shrub invasion and extreme drought reduced ecosystem transpiration and the resilience of the keystone oak tree species. We established six 25 x 25 m paired plots in a shrub-encroached (Cistus ladanifer L.) Mediterranean cork-oak (Quercus suber L.) woodland. We measured sap flow and pre-dawn leaf water potential of trees and shrubs and soil water content in all plots over four years. We determined the resilience of tree transpiration to evaluate to what extent trees recovered from the extreme drought event. From February to November 2011 we conducted baseline measurements for plot comparison. In November 2011 all the shrubs were cut and removed from one plot of each pair. Ecosystem transpiration was dominated by the water use of the invasive shrub, which further increased after the extreme drought. Simultaneously, tree transpiration in invaded plots declined more sharply (67 ± 13%) than in plots cleared of shrubs (31 ± 11%) relative to the pre-drought year (2011). Trees in invaded plots were not able to recover in the following wetter year, showing lower resilience to the extreme drought event. Our results imply that in Mediterranean-type climates, invasion by water-spending species coupled with the projected recurrent extreme droughts will cause critical drought-tolerance thresholds of trees to be exceeded, thus increasing the probability of tree mortality.
Staes, Catherine J; Altamore, Rita; Han, EunGyoung; Mottice, Susan; Rajeev, Deepthi; Bradshaw, Richard
2011-01-01
To control disease, laboratories and providers are required to report conditions to public health authorities. Reporting logic is defined in a variety of resources, but there is no single resource available for reporters to access the list of reportable events and computable reporting logic for any jurisdiction. In order to develop evidence-based requirements for authoring such knowledge, we evaluated reporting logic in the Council of State and Territorial Epidemiologists (CSTE) position statements to assess its readiness for automated systems and identify features that should be considered when designing an authoring interface; we evaluated codes in the Reportable Condition Mapping Tables (RCMT) relative to the nationally defined reporting logic; and we described the high-level business processes and knowledge required to support laboratory-based public health reporting. We focused on logic for viral hepatitis. We found that CSTE tabular logic was unnecessarily complex (sufficient conditions superseded necessary and optional conditions) and was sometimes true for more than one reportable event: we uncovered major overlap in the logic between acute and chronic hepatitis B (52%) and between acute and past and present hepatitis C (90%). We found that the RCMT includes codes for all hepatitis criteria, but also includes additional codes for tests not included in the criteria. The proportion of hepatitis variant-related codes included in RCMT that correspond to a criterion in the hepatitis-related position statements varied among hepatitis A (36%), acute hepatitis B (16%), chronic hepatitis B (64%), acute hepatitis C (96%), and past and present hepatitis C (96%). Public health epidemiologists need to communicate parameters other than just the name of a disease or organism that should be reported, such as the status and specimen sources. Existing knowledge resources should be integrated, harmonized, and made computable.
Our findings identified functionality that should be provided by future knowledge management systems to support epidemiologists as they communicate reporting rules for their jurisdiction. PMID:23569619
Nonbinary Tree-Based Phylogenetic Networks.
Jetten, Laura; van Iersel, Leo
2018-01-01
Rooted phylogenetic networks are used to describe evolutionary histories that contain non-treelike evolutionary events such as hybridization and horizontal gene transfer. In some cases, such histories can be described by a phylogenetic base-tree with additional linking arcs, which can, for example, represent gene transfer events. Such phylogenetic networks are called tree-based. Here, we consider two possible generalizations of this concept to nonbinary networks, which we call tree-based and strictly-tree-based nonbinary phylogenetic networks. We give simple graph-theoretic characterizations of tree-based and strictly-tree-based nonbinary phylogenetic networks. Moreover, we show for each of these two classes that it can be decided in polynomial time whether a given network is contained in the class. Our approach also provides a new view on tree-based binary phylogenetic networks. Finally, we discuss two examples of nonbinary phylogenetic networks in biology and show how our results can be applied to them.
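The definition of a tree-based network can be checked directly, if inefficiently, by trying every way of keeping one incoming arc per reticulation node and testing whether the surviving arcs form a spanning tree whose leaves are exactly the network's leaves. The brute-force sketch below only illustrates the definition; the paper's contribution is polynomial-time decision procedures, which this is not:

```python
from itertools import product

def is_tree_based(nodes, edges):
    """Brute-force (exponential) check: a rooted network is tree-based if
    deleting all but one incoming arc at every reticulation node can leave
    a spanning tree whose leaves are exactly the network's leaves.
    Assumes a rooted, acyclic network given as directed (parent, child) arcs."""
    incoming = {v: [e for e in edges if e[1] == v] for v in nodes}
    reticulations = [v for v in nodes if len(incoming[v]) >= 2]
    net_leaves = {v for v in nodes if not any(e[0] == v for e in edges)}

    for choice in product(*(incoming[r] for r in reticulations)):
        keep = set(choice)
        # Non-reticulation nodes keep their single parent; reticulations keep
        # the chosen one -- so the kept arcs always form a spanning tree.
        kept = [e for e in edges if len(incoming[e[1]]) < 2 or e in keep]
        tree_leaves = {v for v in nodes if not any(e[0] == v for e in kept)}
        if tree_leaves == net_leaves:   # no new leaves created by the deletions
            return True
    return False

# Toy network: root r, reticulation h with parents a and b, leaves x and y.
nodes = {"r", "a", "b", "h", "x", "y"}
edges = [("r", "a"), ("r", "b"), ("a", "h"), ("b", "h"),
         ("a", "x"), ("h", "y")]
print(is_tree_based(nodes, edges))
```

Keeping the arc (b, h) leaves a spanning tree with leaves {x, y}, so this toy network is tree-based; the exponential loop over reticulation choices is exactly what the polynomial characterizations in the paper avoid.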
Linares, Juan Carlos; Camarero, Jesús Julio; Bowker, Matthew A; Ochoa, Victoria; Carreira, José Antonio
2010-12-01
Climate change may affect tree-pathogen interactions. This possibility has important implications for drought-prone forests, where stand dynamics and disease pathogenicity are especially sensitive to climatic stress. In addition, stand structural attributes including density-dependent tree-to-tree competition may modulate the stands' resistance to drought events and pathogen outbreaks. To assess the effects of stand structure on root-rot-related mortality after severe droughts, we focused on Heterobasidion abietinum mortality in relict Spanish stands of Abies pinsapo, a drought-sensitive fir. We compared stand attributes and tree spatial patterns in three plots with H. abietinum root-rot disease and three plots without root-rot. Point-pattern analyses were used to investigate the scale and extent of mortality patterns and to test hypotheses related to the spread of the disease. Dendrochronology was used to date the year of death and to assess the association between droughts and growth decline. We applied a structural equation modelling approach to test whether tree mortality occurs more rapidly than predicted by a simple distance model when trees are subjected to high tree-to-tree competition and following drought events. Contrary to expectations of drought mortality, the effect of precipitation on the year of death was strong and negative, indicating that a period of high precipitation induced earlier tree death. Competition intensity, related to the size and density of neighbour trees, also induced earlier tree death. The effect of distance to the disease focus was negligible except in combination with intensive competition. Our results indicate that infected trees have decreased ability to withstand drought stress, and demonstrate that tree-to-tree competition and fungal infection act as predisposing factors of forest decline and mortality.
2014-07-01
[DSTO-TR-2992] Unified Theory of Acceptance and Use of Technology, Structuration Model of Technology, Adaptive Structuration Theory, Model of Mutual Adaptation, Model of Technology Appropriation, Diffusion/Implementation Model, and Tri-core Model, among others [11]... simulation gaming, essay/scenario writing, genius forecasting, role play/acting, backcasting, SWOT, brainstorming, relevance tree/logic chart, scenario workshop
Specific gravity relationships in plantation-grown red pine
Gregory Baker; James E. Shottafer
1968-01-01
Norway or red pine (Pinus resinosa Ait.) has been popular in Maine for forest planting because it will rapidly convert grass and weed cover to a forest floor and because it is relatively free from attack by insects and diseases. Since the first commercial thinnings consist of small-sized trees, the most logical market outlet is for pulpwood. Yield of...
T.Z. Ye; K.J.S. Jayawickrama; G.R. Johnson
2004-01-01
The BLUP (best linear unbiased prediction) method has been widely used in forest tree improvement programs. Since one of the properties of BLUP is that related individuals contribute to the predictions of each other, it seems logical that integrating data from all generations and from all populations would improve both the precision and accuracy in predicting genetic...
Automatic Configuration of Programmable Logic Controller Emulators
2015-03-01
Figures include an example tree generated using UPGMA [Edw13] and an example sequence alignment for two... ...appearance in the session, and then they are clustered again using the Unweighted Pair Group Method with Arithmetic Mean (UPGMA) with a distance matrix based
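UPGMA, mentioned in the excerpt above, is agglomerative clustering with size-weighted average linkage: repeatedly merge the closest pair of clusters and average their distances to every remaining cluster, weighting by cluster size. A minimal pure-Python sketch on a hypothetical distance matrix:

```python
def upgma(dist, labels):
    """Naive UPGMA: repeatedly merge the closest pair of clusters, averaging
    distances weighted by cluster size. Returns a nested-tuple tree."""
    clusters = {i: (labels[i], 1) for i in range(len(labels))}  # id -> (tree, size)
    d = {(i, j): dist[i][j] for i in clusters for j in clusters if i < j}
    next_id = len(labels)
    while len(clusters) > 1:
        i, j = min(d, key=d.get)                 # closest pair of clusters
        ti, ni = clusters.pop(i)
        tj, nj = clusters.pop(j)
        merged = {}
        for k in clusters:                       # size-weighted average linkage
            dik = d.pop((min(i, k), max(i, k)))
            djk = d.pop((min(j, k), max(j, k)))
            merged[k] = (ni * dik + nj * djk) / (ni + nj)
        d.pop((i, j))
        for k, v in merged.items():
            d[(min(k, next_id), max(k, next_id))] = v
        clusters[next_id] = ((ti, tj), ni + nj)
        next_id += 1
    return next(iter(clusters.values()))[0]

# Hypothetical distance matrix over three items (e.g., emulator sessions).
labels = ["A", "B", "C"]
dist = [[0, 2, 6],
        [2, 0, 6],
        [6, 6, 0]]
print(upgma(dist, labels))  # A and B merge first: ('C', ('A', 'B'))
```

In practice one would hand the condensed distance matrix to a library routine (average linkage is UPGMA); the point here is only to show how the distance matrix drives the merge order.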
2011-01-01
Background Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). Results We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology when using SVM performs significantly better than some of the state-of-the-art methods and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. Conclusions The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions. PMID:21429187
Bernardes, Juliana S; Carbone, Alessandra; Zaverucha, Gerson
2011-03-23
Remote homology detection is a hard computational problem. Most approaches have trained computational models by using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when we deal with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent physico-chemical properties of sequences, conserved amino acid positions and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology when using SVM performs significantly better than some of the state-of-the-art methods and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein function. The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes essential features of protein functions.
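The core idea of mining the most frequent patterns and using them as propositional features can be illustrated crudely with shared k-mers standing in for ILP-discovered motifs (the actual method learns first-order rules over physico-chemical properties, which this sketch does not attempt; the sequences below are invented):

```python
from collections import Counter

def frequent_kmers(seqs, k, min_frac=0.8):
    """Return k-mers present in at least min_frac of the sequences -- a crude
    stand-in for the 'most frequent patterns (motifs)' used as features."""
    counts = Counter()
    for s in seqs:
        counts.update({s[i:i + k] for i in range(len(s) - k + 1)})  # once per seq
    return {m for m, c in counts.items() if c / len(seqs) >= min_frac}

def featurize(seq, motifs):
    """Binary feature vector: does each motif occur in the sequence?"""
    return [int(m in seq) for m in sorted(motifs)]

# Hypothetical superfamily members sharing a conserved 'GKT' segment.
family = ["MAGKTQL", "VPGKTRS", "LLGKTAA", "GAGKTWE"]
motifs = frequent_kmers(family, k=3, min_frac=1.0)
print(motifs)                        # {'GKT'} is shared by all four sequences
print(featurize("XXGKTXX", motifs))  # [1]
```

Vectors like these are what a downstream propositional learner (a decision tree or SVM, as in the paper) would consume.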
Kevin T. Smith
2009-01-01
Trees and tree care can capture the best of people's motivations and intentions. Trees are living memorials that help communities heal at sites of national tragedy, such as Oklahoma City and the World Trade Center. We mark the places of important historical events by the trees that grew nearby even if the original tree, such as the Charter Oak in Connecticut or...
SEE Sensitivity Analysis of 180 nm NAND CMOS Logic Cell for Space Applications
NASA Astrophysics Data System (ADS)
Sajid, Muhammad
2016-07-01
This paper focuses on single event effects caused by energetic particle strikes on sensitive locations in a CMOS NAND logic cell designed in the 180 nm technology node to be operated in the space radiation environment. The generation of SE transients as well as upsets as a function of the LET of the incident particle has been determined for logic devices onboard LEO and GEO satellites. The minimum pulse magnitude and pulse width at the threshold LET were determined to estimate the vulnerability/susceptibility of the device to a heavy-ion strike. The impact of temperature, strike location, and logic state of the NAND circuit on the total SEU/SET rate was estimated with physical-mechanism simulations using Visual TCAD, Genius, the runSEU program, and Crad computer codes.
PS1-41: Just Add Data: Implementing an Event-Based Data Model for Clinical Trial Tracking
Fuller, Sharon; Carrell, David; Pardee, Roy
2012-01-01
Background/Aims Clinical research trials often have similar fundamental tracking needs, despite being quite variable in their specific logic and activities. A model tracking database that can be quickly adapted by a variety of studies has the potential to achieve significant efficiencies in database development and maintenance. Methods Over the course of several different clinical trials, we have developed a database model that is highly adaptable to a variety of projects. Rather than hard-coding each specific event that might occur in a trial, along with its logical consequences, this model considers each event and its parameters to be a data record in its own right. Each event may have related variables (metadata) describing its prerequisites, subsequent events due, associated mailings, or events that it overrides. The metadata for each event is stored in the same record with the event name. When changes are made to the study protocol, no structural changes to the database are needed. One has only to add or edit events and their metadata. Changes in the event metadata automatically determine any related logic changes. In addition to streamlining application code, this model simplifies communication between the programmer and other team members. Database requirements can be phrased as changes to the underlying data, rather than to the application code. The project team can review a single report of events and metadata and easily see where changes might be needed. In addition to benefiting from streamlined code, the front-end database application can also implement useful standard features such as automated mail merges and to-do lists. Results The event-based data model has proven itself to be robust, adaptable and user-friendly in a variety of study contexts. We have chosen to implement it as a SQL Server back end and distributed Access front end. Interested readers may request a copy of the Access front end and scripts for creating the back end database.
Discussion An event-based database with a consistent, robust set of features has the potential to significantly reduce development time and maintenance expense for clinical trial tracking databases.
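The events-as-data idea above can be sketched in a few lines: each event record carries its own scheduling metadata (prerequisites, subsequent events due), and the application logic simply interprets that metadata, so protocol changes mean editing rows rather than restructuring tables. The event names and fields below are invented for illustration and are not the authors' actual SQL Server schema:

```python
# Hypothetical event table: name -> metadata driving all scheduling logic.
EVENTS = {
    "consent_signed":  {"prerequisites": [],                  "next_due": ["baseline_survey"]},
    "baseline_survey": {"prerequisites": ["consent_signed"],  "next_due": ["month3_followup"]},
    "month3_followup": {"prerequisites": ["baseline_survey"], "next_due": []},
}

def can_proceed(event, completed):
    """An event may occur once all of its prerequisite events are completed."""
    return all(p in completed for p in EVENTS[event]["prerequisites"])

def to_do(completed):
    """Events now due: successors of completed events whose prerequisites hold."""
    due = {nxt for e in completed for nxt in EVENTS[e]["next_due"]}
    return sorted(n for n in due - set(completed) if can_proceed(n, completed))

print(to_do({"consent_signed"}))  # ['baseline_survey']
```

Adding a protocol step is a new row in `EVENTS`; the `to_do` logic never changes, which is the maintenance saving the abstract describes.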
FPGA-based gating and logic for multichannel single photon counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pooser, Raphael C; Earl, Dennis Duncan; Evans, Philip G
2012-01-01
We present results characterizing multichannel InGaAs single photon detectors utilizing gated passive quenching circuits (GPQC), self-differencing techniques, and field programmable gate array (FPGA)-based logic for both diode gating and coincidence counting. Utilizing FPGAs for the diode-gating frontend and the logic-counting backend has the advantage of low cost compared to custom-built logic circuits and current off-the-shelf detector technology. Further, FPGA logic counters have been shown to work well in quantum key distribution (QKD) test beds. Our setup combines multiple independent detector channels in a reconfigurable manner via an FPGA backend and post-processing in order to perform coincidence measurements between any two or more detector channels simultaneously. Using this method, states from a multi-photon polarization-entangled source are detected and characterized via coincidence counting on the FPGA. Photon detection events are also processed by the quantum information toolkit for application testing (QITKAT).
The role of hybridization in facilitating tree invasion
USDA-ARS?s Scientific Manuscript database
Hybridization events can generate additional genetic diversity on which natural selection can act and at times enhance invasiveness of the species. Invasive tree species are a growing ecological concern worldwide, and some of these invasions involve hybridization events pre- or post-introduction. Th...
Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Woo, Gordon
2017-04-01
For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process, and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Based on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Starting with the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes for the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
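The kernel-smoothing alternative to polygonal source zones replaces a uniform rate per zone with a sum of kernels centred on past epicentres. A minimal fixed-bandwidth Gaussian sketch with invented coordinates (real applications, including those described here, use magnitude-dependent or adaptive bandwidths):

```python
import math

def kernel_rate(epicentres, x, y, bandwidth=50.0):
    """Smoothed event-rate density at (x, y), in events per unit area, from
    past epicentre coordinates (km), using an isotropic Gaussian kernel
    instead of uniform polygonal source zones."""
    h2 = bandwidth ** 2
    norm = 1.0 / (2 * math.pi * h2)
    return sum(norm * math.exp(-((x - ex) ** 2 + (y - ey) ** 2) / (2 * h2))
               for ex, ey in epicentres)

# Hypothetical epicentres (km grid); density is higher near the cluster.
quakes = [(10, 12), (14, 9), (11, 15), (200, 180)]
near = kernel_rate(quakes, 12, 12)
far = kernel_rate(quakes, 500, 500)
print(near > far)  # True
```

No zone boundaries are drawn: the smoothed rate field follows the data, which is the advantage claimed for regions where seismotectonic zones are poorly delineated.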
Independent effects of relevance and arousal on deductive reasoning.
Caparos, Serge; Blanchette, Isabelle
2017-08-01
Emotional content can have either a deleterious or a beneficial impact on logicality. Using standard deductive-reasoning tasks, we tested the hypothesis that the interplay of two factors - personal relevance and arousal - determines the nature of the effect of emotional content on logicality. Arousal was assessed using measures of skin conductance. Personal relevance was manipulated by asking participants to reason about semantic contents linked to an emotional event that they had experienced or not. Findings showed that (1) personal relevance exerts a positive effect on logicality while arousal exerts a negative effect, and that (2) these effects are independent of each other.
The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis
NASA Technical Reports Server (NTRS)
Johnson, C. W.; Holloway, C. M.
2002-01-01
The increasing complexity of many safety critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.
RecPhyloXML - a format for reconciled gene trees.
Duchemin, Wandrille; Gence, Guillaume; Arigon Chifolleau, Anne-Muriel; Arvestad, Lars; Bansal, Mukul S; Berry, Vincent; Boussau, Bastien; Chevenet, François; Comte, Nicolas; Davín, Adrián A; Dessimoz, Christophe; Dylus, David; Hasic, Damir; Mallo, Diego; Planel, Rémi; Posada, David; Scornavacca, Celine; Szöllosi, Gergely; Zhang, Louxin; Tannier, Éric; Daubin, Vincent
2018-05-14
A reconciliation is an annotation of the nodes of a gene tree with evolutionary events (for example, speciation, gene duplication, transfer, or loss), along with a mapping onto a species tree. Many algorithms and software packages produce or use reconciliations, but often in different reconciliation formats, varying in the type of events considered or in whether the species tree is dated or not. This complicates the comparison and communication between different programs. Here, we gather a consortium of software developers in gene tree/species tree reconciliation to propose and endorse a format that aims to promote an integrative, albeit flexible, specification of phylogenetic reconciliations. This format, named recPhyloXML, is accompanied by several tools such as a reconciled tree visualizer and conversion utilities. http://phylariane.univ-lyon1.fr/recphyloxml/. wandrille.duchemin@univ-lyon1.fr. There is no supplementary data associated with this publication.
Answering Questions about Complex Events
2008-12-19
in their environment. To reason about events requires a means of describing, simulating, and analyzing their underlying dynamic processes. For our...that are relevant to our goal of connecting inference and reasoning about processes to answering questions about events. We start with a...different event and process descriptions, ontologies, and models. 2.1.1 Logical AI In AI, formal approaches to model the ability to reason about
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
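Once a fault tree's structure and base-event probabilities are fixed, the top-event probability under the usual independence assumption follows from two rules: multiply probabilities at AND gates, and take one minus the product of complements at OR gates. A minimal sketch on an invented tree (not the AS-II algorithm, which targets minimal cut sets at scale):

```python
def top_event_probability(gate):
    """Recursively evaluate a fault tree given as nested ('AND'|'OR', children)
    tuples with base-event probabilities (floats) at the leaves. Assumes
    independent base events -- the usual textbook simplification."""
    if isinstance(gate, float):          # base (initiating) event
        return gate
    op, children = gate
    probs = [top_event_probability(c) for c in children]
    if op == "AND":                      # all sub-events must occur
        p = 1.0
        for q in probs:
            p *= q
        return p
    # OR: at least one sub-event occurs = 1 - P(none occurs)
    p_none = 1.0
    for q in probs:
        p_none *= 1.0 - q
    return 1.0 - p_none

# Hypothetical tree: top event = (pump fails OR valve fails) AND alarm fails.
tree = ("AND", [("OR", [0.1, 0.05]), 0.01])
print(round(top_event_probability(tree), 5))  # 0.00145
```

For repeated base events this naive recursion overcounts, which is precisely why practical FTA tools work with minimal cut sets and Boolean reduction instead.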
Large-scale wind disturbances promote tree diversity in a Central Amazon forest.
Marra, Daniel Magnabosco; Chambers, Jeffrey Q; Higuchi, Niro; Trumbore, Susan E; Ribeiro, Gabriel H P M; Dos Santos, Joaquim; Negrón-Juárez, Robinson I; Reu, Björn; Wirth, Christian
2014-01-01
Canopy gaps created by wind-throw events, or blowdowns, create a complex mosaic of forest patches varying in disturbance intensity and recovery in the Central Amazon. Using field and remote sensing data, we investigated the short-term (four-year) effects of large (>2000 m²) blowdown gaps created during a single storm event in January 2005 near Manaus, Brazil, to study (i) how forest structure and composition vary with disturbance gradients and (ii) whether tree diversity is promoted by niche differentiation related to wind-throw events at the landscape scale. In the forest area affected by the blowdown, tree mortality ranged from 0 to 70%, and was highest on plateaus and slopes. Less impacted areas in the region affected by the blowdown had overlapping characteristics with a nearby unaffected forest in tree density (583 ± 46 trees ha⁻¹) (mean ± 99% Confidence Interval) and basal area (26.7 ± 2.4 m² ha⁻¹). Highly impacted areas had tree density and basal area as low as 120 trees ha⁻¹ and 14.9 m² ha⁻¹, respectively. In general, these structural measures correlated negatively with an index of tree mortality intensity derived from satellite imagery. Four years after the blowdown event, differences in size-distribution, fraction of resprouters, floristic composition and species diversity still correlated with disturbance measures such as tree mortality and gap size. Our results suggest that the gradients of wind disturbance intensity encompassed in large blowdown gaps (>2000 m²) promote tree diversity. Specialists for particular disturbance intensities existed along the entire gradient. The existence of species or genera taking an intermediate position between undisturbed and gap specialists led to a peak of rarefied richness and diversity at intermediate disturbance levels.
A diverse set of species differing widely in requirements and recruitment strategies forms the initial post-disturbance cohort, thus lending a high resilience towards wind disturbances at the community level. PMID:25099118
Large-Scale Wind Disturbances Promote Tree Diversity in a Central Amazon Forest
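The "rarefied richness" peak reported in the abstract above comes from rarefaction, which compares species counts at a common sample size. A minimal sketch of the classic hypergeometric rarefaction estimator (Hurlbert's formulation) is shown below; the function name and example abundances are illustrative, not from the study.

```python
from math import comb

def rarefied_richness(abundances, n):
    """Expected number of species in a random subsample of n individuals,
    given per-species abundance counts (hypergeometric rarefaction).
    Note: math.comb(k, n) returns 0 when n > k, which correctly gives
    inclusion probability 1 for species that cannot be missed."""
    N = sum(abundances)
    denom = comb(N, n)
    return sum(1 - comb(N - Ni, n) / denom for Ni in abundances)

# Sampling every individual recovers the full species count:
full = rarefied_richness([3, 2, 1], 6)   # 3 species, 6 individuals total
```

Plotting rarefied richness at a fixed n across plots ordered by disturbance intensity is how a mid-gradient diversity peak like the one described above would typically be visualized.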
NASA Astrophysics Data System (ADS)
Akinci, A.; Pace, B.
2017-12-01
In this study, we discuss the variability of seismic hazard, expressed as peak ground acceleration (PGA) at a 475-year return period, in the Southern Apennines of Italy. Uncertainty and parametric sensitivity analyses quantify the impact of several fault parameters on ground-motion predictions for 10% probability of exceedance in 50 years. A time-independent PSHA model is constructed from the long-term recurrence behavior of seismogenic faults, adopting the characteristic earthquake model for those sources capable of rupturing an entire fault segment in a single maximum-magnitude event. The fault-based source model uses the dimensions and slip rates of mapped faults to develop magnitude-frequency estimates for characteristic earthquakes. Variability of each fault parameter is represented by a truncated normal distribution, defined by a standard deviation about a mean value. A Monte Carlo approach, based on random balanced sampling of the logic tree, is used to capture the uncertainty in the seismic hazard calculations. For both the uncertainty and the sensitivity maps, we perform 200 simulations for each fault parameter. The results are synthesized both as frequency-magnitude distributions of the modeled faults and as maps: the overall uncertainty maps provide a confidence interval for the PGA values, and the parameter uncertainty maps determine the sensitivity of the hazard assessment to the variability of each logic tree branch. The branches of the logic tree analyzed through the Monte Carlo approach are maximum magnitude, fault length, fault width, fault dip and slip rate. The overall variability of these parameters is determined by varying them simultaneously in the hazard calculations, while the sensitivity to each parameter is determined by varying that fault parameter while fixing the others. 
However, in this study we do not investigate the sensitivity of the mean hazard results to the choice of different GMPEs. The distribution of possible seismic hazard results is illustrated by a 95% confidence factor map, which indicates the dispersion about the mean value, and a coefficient of variation map, which shows percent variability. The results of our study clearly illustrate the influence of active fault parameters on probabilistic seismic hazard maps.
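The sampling scheme described above — drawing fault parameters from truncated normal distributions and propagating them into recurrence rates — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the parameter values, fault area, and rigidity are invented, and the characteristic-earthquake rate is obtained by standard seismic-moment balancing with the Hanks–Kanamori moment-magnitude relation.

```python
import random
import statistics

random.seed(42)

def trunc_normal(mean, sd, lo, hi):
    """Rejection-sample one draw from a normal truncated to [lo, hi]."""
    while True:
        x = random.gauss(mean, sd)
        if lo <= x <= hi:
            return x

RIGIDITY = 3.0e10  # crustal shear modulus, Pa (assumed value)

def char_eq_rate(slip_mm_yr, m_max, area_km2):
    """Characteristic-earthquake rate (events/yr) from moment balancing:
    moment accumulation rate divided by the moment of the Mmax event
    (Hanks-Kanamori: M0 = 10**(1.5*Mw + 9.05) N*m)."""
    moment_rate = RIGIDITY * area_km2 * 1e6 * slip_mm_yr * 1e-3  # N*m / yr
    m0_char = 10 ** (1.5 * m_max + 9.05)                         # N*m
    return moment_rate / m0_char

# 200 simulations per parameter, as in the abstract (all numbers illustrative).
slips = [trunc_normal(0.5, 0.1, 0.2, 0.8) for _ in range(200)]   # slip, mm/yr
mmaxs = [trunc_normal(6.6, 0.2, 6.0, 7.2) for _ in range(200)]   # Mw
rates = [char_eq_rate(s, m, area_km2=300.0) for s, m in zip(slips, mmaxs)]

# Coefficient of variation summarizes the spread fed into the hazard maps.
cov = statistics.stdev(rates) / statistics.mean(rates)
```

In the full analysis each sampled parameter set would feed a hazard calculation; here the recurrence rate stands in for that downstream computation.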
NASA Astrophysics Data System (ADS)
Fan, Yuanchao; Koukal, Tatjana; Weisberg, Peter J.
2014-10-01
Canopy shadowing mediated by topography is an important source of radiometric distortion in remote sensing images of rugged terrain. Topographic correction based on the sun-canopy-sensor (SCS) model improves significantly on correction based on the sun-terrain-sensor (STS) model for surfaces with high forest canopy cover, because the SCS model considers and preserves the geotropic nature of trees. The SCS model accounts for sub-pixel canopy shadowing effects and normalizes the sunlit canopy area within a pixel. However, it does not account for mutual shadowing between neighboring pixels. Pixel-to-pixel shadowing is especially apparent in fine-resolution satellite images in which individual tree crowns are resolved. This paper proposes a new topographic correction model, the sun-crown-sensor (SCnS) model, based on high-resolution satellite imagery (IKONOS) and a high-precision LiDAR digital elevation model. An improvement on the C-correction logic, with a radiance partitioning method to address the effects of diffuse irradiance, is also introduced (SCnS + C). In addition, we incorporate a weighting variable, based on pixel shadow fraction, on the direct and diffuse radiance portions to enhance the retrieval of at-sensor radiance and reflectance of highly shadowed tree pixels, forming another variety of the SCnS model (SCnS + W). Model evaluation with IKONOS test data showed that the new SCnS model outperformed the STS and SCS models in quantifying the correlation between the terrain-regulated illumination factor and at-sensor radiance. Our adapted C-correction logic based on the sun-crown-sensor geometry and radiance partitioning better represented the general additive effects of diffuse radiation than C parameters derived from the STS or SCS models. The weighting factor Wt also significantly enhanced correction results by reducing within-class standard deviation and balancing the mean pixel radiance between sunlit and shaded slopes. 
We analyzed these improvements with model comparison on the red and near infrared bands. The advantages of SCnS + C and SCnS + W on both bands are expected to facilitate forest classification and change detection applications.
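The paper's SCnS + C model builds on the standard SCS + C correction; the exact SCnS formulation is not given in the abstract, so the sketch below shows the underlying SCS + C logic it extends (Soenen et al.'s published form), with the local illumination geometry computed from slope, aspect, and sun angles. Function names are invented.

```python
import math

def cos_incidence(slope, aspect, sun_zenith, sun_azimuth):
    """Cosine of the local solar incidence angle (all angles in radians):
    cos(i) = cos(theta_s)cos(a) + sin(theta_s)sin(a)cos(phi_s - phi_a)."""
    return (math.cos(sun_zenith) * math.cos(slope)
            + math.sin(sun_zenith) * math.sin(slope)
              * math.cos(sun_azimuth - aspect))

def scs_c_correct(radiance, slope, sun_zenith, cos_i, c):
    """SCS + C correction:
    L_corr = L * (cos(slope)*cos(theta_s) + C) / (cos(i) + C).
    The additive C term (from regressing radiance on cos(i)) models
    diffuse irradiance and damps overcorrection of deeply shaded pixels."""
    return radiance * (math.cos(slope) * math.cos(sun_zenith) + c) / (cos_i + c)

# On flat terrain the correction leaves radiance unchanged:
ci_flat = cos_incidence(0.0, 0.0, math.radians(40), 0.0)
flat = scs_c_correct(100.0, 0.0, math.radians(40), ci_flat, c=0.3)

# A slope facing away from the sun is brightened:
ci_shade = cos_incidence(math.radians(30), math.pi, math.radians(40), 0.0)
shaded = scs_c_correct(100.0, math.radians(30), math.radians(40), ci_shade, c=0.3)
```

The paper's contribution replaces the pixel-level sun-canopy geometry here with sun-crown geometry resolved from LiDAR, and adds the shadow-fraction weighting (SCnS + W).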
Extended GTST-MLD for aerospace system safety analysis.
Guo, Chiming; Gong, Shiyu; Tan, Lin; Guo, Bo
2012-06-01
The hazards caused by complex interactions in aerospace systems have become a problem that urgently needs to be addressed. This article introduces a method for identifying hazard interactions in aerospace systems during the design stage, based on an extended GTST-MLD (goal tree-success tree-master logic diagram). GTST-MLD is a functional modeling framework with a simple architecture. An ontology is used to extend GTST-MLD's ability to describe system interactions by adding system design knowledge and past accident experience. Working at the levels of function and equipment, respectively, the approach helps the technician detect potential hazard interactions. Finally, a case study illustrates the method. © 2011 Society for Risk Analysis.
Radiation Characteristics of a 0.11 Micrometer Modified Commercial CMOS Process
NASA Technical Reports Server (NTRS)
Poivey, Christian; Kim, Hak; Berg, Melanie D.; Forney, Jim; Seidleck, Christina; Vilchis, Miguel A.; Phan, Anthony; Irwin, Tim; LaBel, Kenneth A.; Saigusa, Rajan K.
2006-01-01
We present radiation data, Total Ionizing Dose and Single Event Effects, on the LSI Logic 0.11 micron commercial process and two modified versions of this process. Modified versions include a buried layer to guarantee Single Event Latchup immunity.
Single event upset sensitivity of low power Schottky devices
NASA Technical Reports Server (NTRS)
Price, W. E.; Nichols, D. K.; Measel, P. R.; Wahlin, K. L.
1982-01-01
Data from heavy-ion tests at the Berkeley 88-inch cyclotron on low-power Schottky barrier devices are reported. The tests also included trials at the Harvard cyclotron with 130 MeV protons and at the U.C. Davis cyclotron with 56 MeV protons. The experiments were performed to study single-event upsets in MSI logic devices containing flip-flops. Results are presented for single-event upsets (SEUs) causing functional degradation observed in post-exposure tests of six different devices. The effectiveness of the particles in producing SEUs in logic device functioning was found to be directly proportional to the proton energy. Shielding was determined to offer negligible protection from the particle bombardment. The results are considered significant for the design and fabrication of LS devices for space applications.
Mitchell, Patrick J; O'Grady, Anthony P; Hayes, Keith R; Pinkard, Elizabeth A
2014-01-01
Increases in drought and temperature stress in forest and woodland ecosystems are thought to be responsible for the rise in episodic mortality events observed globally. However, key climatic drivers common to mortality events and the impacts of future extreme droughts on tree survival have not been evaluated. Here, we characterize climatic drivers associated with documented tree die-off events across Australia using standardized climatic indices to represent the key dimensions of drought stress for a range of vegetation types. We identify a common probabilistic threshold associated with an increased risk of die-off across all the sites that we examined. We show that observed die-off events occur when water deficits and maximum temperatures are high and exist outside 98% of the observed range in drought intensity; this threshold was evident at all sites regardless of vegetation type and climate. The observed die-off events also coincided with at least one heat wave (three consecutive days above the 90th percentile for maximum temperature), emphasizing a pivotal role of heat stress in amplifying tree die-off and mortality processes. The joint drought intensity and maximum temperature distributions were modeled for each site to describe the co-occurrence of both hot and dry conditions and evaluate future shifts in climatic thresholds associated with the die-off events. Under a relatively dry and moderate warming scenario, the frequency of droughts capable of inducing significant tree die-off across Australia could increase from 1 in 24 years to 1 in 15 years by 2050, accompanied by a doubling in the occurrence of associated heat waves. By defining commonalities in drought conditions capable of inducing tree die-off, we show a strong interactive effect of water and high temperature stress and provide a consistent approach for assessing changes in the exposure of ecosystems to extreme drought events. PMID:24772285
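The heat-wave criterion used above (at least three consecutive days above the 90th percentile of maximum temperature) is straightforward to operationalize. A minimal sketch, with a hand-rolled linear-interpolation percentile so the block is self-contained; the threshold convention ("at or above") and function names are assumptions for illustration.

```python
def percentile(values, pct):
    """Percentile with linear interpolation between closest ranks."""
    v = sorted(values)
    rank = (len(v) - 1) * pct / 100.0
    lo = int(rank)
    frac = rank - lo
    if lo + 1 < len(v):
        return v[lo] + frac * (v[lo + 1] - v[lo])
    return v[lo]

def heat_waves(tmax, pct=90.0, min_run=3):
    """Return (start, end) index pairs of spells with at least min_run
    consecutive days at or above the pct-th percentile of tmax."""
    thresh = percentile(tmax, pct)
    runs, start = [], None
    for i, t in enumerate(tmax):
        if t >= thresh:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_run:
                runs.append((start, i - 1))
            start = None
    if start is not None and len(tmax) - start >= min_run:
        runs.append((start, len(tmax) - 1))
    return runs

# Synthetic month: one 3-day spell qualifies, an isolated hot day does not.
tmax = [20.0] * 30
tmax[10:13] = [40.0, 40.0, 40.0]
tmax[20] = 40.0
spells = heat_waves(tmax)
```

The same percentile machinery, applied to a drought-intensity index at the 98th percentile, would flag the extreme joint water-deficit/heat conditions the study associates with die-off.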
Widespread Amazon forest tree mortality from a single cross-basin squall line event
NASA Astrophysics Data System (ADS)
Negrón-Juárez, Robinson I.; Chambers, Jeffrey Q.; Guimaraes, Giuliano; Zeng, Hongcheng; Raupp, Carlos F. M.; Marra, Daniel M.; Ribeiro, Gabriel H. P. M.; Saatchi, Sassan S.; Nelson, Bruce W.; Higuchi, Niro
2010-08-01
Climate change is expected to increase the intensity of extreme precipitation events in Amazonia that in turn might produce more forest blowdowns associated with convective storms. Yet quantitative tree mortality associated with convective storms has never been reported across Amazonia, representing an important additional source of carbon to the atmosphere. Here we demonstrate that a single squall line (aligned cluster of convective storm cells) propagating across Amazonia in January, 2005, caused widespread forest tree mortality and may have contributed to the elevated mortality observed that year. Forest plot data demonstrated that the same year represented the second highest mortality rate over a 15-year annual monitoring interval. Over the Manaus region, disturbed forest patches generated by the squall followed a power-law distribution (scaling exponent α = 1.48) and produced a mortality of 0.3-0.5 million trees, equivalent to 30% of the observed annual deforestation reported in 2005 over the same area. Basin-wide, potential tree mortality from this one event was estimated at 542 ± 121 million trees, equivalent to 23% of the mean annual biomass accumulation estimated for these forests. Our results highlight the vulnerability of Amazon trees to wind-driven mortality associated with convective storms. Storm intensity is expected to increase with a warming climate, which would result in additional tree mortality and carbon release to the atmosphere, with the potential to further warm the climate system.
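The power-law size distribution of disturbed patches (scaling exponent α = 1.48) reported above can be checked with the standard maximum-likelihood estimator for a continuous power law, α̂ = 1 + n / Σ ln(xᵢ/x_min) (Clauset, Shalizi and Newman). The sketch below validates the estimator on synthetic patch sizes drawn by inverse-transform sampling; x_min and sample count are illustrative, not values from the study.

```python
import math
import random

def powerlaw_alpha(sizes, x_min):
    """Maximum-likelihood scaling exponent for a continuous power law,
    fitted to the tail x >= x_min: alpha = 1 + n / sum(ln(x / x_min))."""
    tail = [x for x in sizes if x >= x_min]
    return 1.0 + len(tail) / sum(math.log(x / x_min) for x in tail)

# Inverse-transform sampling from p(x) ~ x^-alpha with alpha = 1.48:
# X = x_min * (1 - U)^(-1 / (alpha - 1)) for U uniform on (0, 1).
random.seed(1)
x_min = 100.0  # m^2, illustrative minimum patch size
samples = [x_min * (1.0 - random.random()) ** (-1.0 / 0.48)
           for _ in range(50000)]
alpha_hat = powerlaw_alpha(samples, x_min)
```

With 50,000 samples the estimate lands very close to the true exponent 1.48, which is the kind of fit the study applied to satellite-derived patch areas.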
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donker, H.C., E-mail: h.donker@science.ru.nl; Katsnelson, M.I.; De Raedt, H.
2016-09-15
The logical inference approach to quantum theory, proposed earlier by De Raedt et al. (2014), is considered in a relativistic setting. It is shown that the Klein–Gordon equation for a massive, charged, and spinless particle derives from the combination of the requirements that the space–time data collected by probing the particle is obtained from the most robust experiment and that, on average, the classical relativistic equation of motion of a particle holds. - Highlights: • Logical inference applied to relativistic, massive, charged, and spinless particle experiments leads to the Klein–Gordon equation. • The relativistic Hamilton–Jacobi equation is scrutinized by employing a field description for the four-velocity. • Logical inference allows analysis of experiments with uncertainty in detection events and experimental conditions.
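For reference, the equation the abstract says is derived — the Klein–Gordon equation for a massive, charged, spinless particle minimally coupled to an electromagnetic four-potential — has the standard form below (Gaussian units; this is the textbook statement of the result, not the paper's logical-inference derivation):

```latex
\left[\left(i\hbar\,\partial^{\mu}-\tfrac{q}{c}A^{\mu}\right)
      \left(i\hbar\,\partial_{\mu}-\tfrac{q}{c}A_{\mu}\right)
      -m^{2}c^{2}\right]\psi(x)=0
```

Setting $A^{\mu}=0$ recovers the free Klein–Gordon equation $(\hbar^{2}\,\partial^{\mu}\partial_{\mu}+m^{2}c^{2})\psi=0$.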
Global interrupt and barrier networks
Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.; Gara, Alan G.; Giampapa, Mark E; Heidelberger, Philip; Kopcsay, Gerard V.; Steinmacher-Burow, Burkhard D.; Takken, Todd E.
2008-10-28
A system and method for generating global asynchronous signals in a computing structure. In particular, a global interrupt and barrier network implements logic for generating global interrupt and barrier signals that control global asynchronous operations performed by processing elements at selected processing nodes of a computing structure in accordance with a processing algorithm; it also includes the physical interconnections among the processing nodes for communicating the global interrupt and barrier signals to the elements via low-latency paths. The global asynchronous signals respectively initiate interrupt and barrier operations at the processing nodes at times selected to optimize performance of the processing algorithms. In one embodiment, the global interrupt and barrier network is implemented in a scalable, massively parallel supercomputing device structure comprising a plurality of processing nodes interconnected by multiple independent networks, with each node including one or more processing elements for performing computation or communication activity as required when performing parallel algorithm operations. One of the independent networks is a global tree network enabling high-speed global tree communications among global tree network nodes or sub-trees thereof. The global interrupt and barrier network may operate in parallel with the global tree network, providing global asynchronous sideband signals.
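The core of such a network is a log-depth reduction: per-node signals are combined pairwise up a tree and the root value is broadcast back down. A software analogue of that combining logic (purely illustrative; the hardware performs this with dedicated wires and gates, not code):

```python
def tree_combine(signals, op):
    """Pairwise-combine per-node signals up a binary tree, as a
    software analogue of a hardware global combining network.
    Depth is O(log n); an odd element is carried up unchanged."""
    level = list(signals)
    while len(level) > 1:
        level = [op(level[i], level[i + 1]) if i + 1 < len(level) else level[i]
                 for i in range(0, len(level), 2)]
    return level[0]

# Global interrupt = OR over all nodes' interrupt flags;
# barrier completion = AND over all nodes' "arrived" flags.
def global_interrupt(flags):
    return tree_combine(flags, lambda a, b: a or b)

def barrier_complete(arrived):
    return tree_combine(arrived, lambda a, b: a and b)
```

One raised flag anywhere interrupts every node, while the barrier releases only when every node has arrived — the asymmetry between the OR and AND reductions.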
Caldeira, Maria C.; Lecomte, Xavier; David, Teresa S.; Pinto, Joaquim G.; Bugalho, Miguel N.; Werner, Christiane
2015-01-01
Extreme drought events and plant invasions are major drivers of global change that can critically affect ecosystem functioning and alter ecosystem-atmosphere exchange. Invaders are expanding worldwide, and extreme drought events are projected to increase in frequency and intensity. However, very little is known about how these drivers may interact to affect the functioning and resilience of ecosystems to extreme events. Using a manipulative shrub removal experiment and the co-occurrence of an extreme drought event (2011/2012) in a Mediterranean woodland, we show that native shrub invasion and extreme drought synergistically reduced ecosystem transpiration and the resilience of keystone oak tree species. Ecosystem transpiration was dominated by the water use of the invasive shrub Cistus ladanifer, which further increased after the extreme drought event. Meanwhile, the transpiration of keystone tree species decreased, indicating a competitive advantage in favour of the invader. Our results suggest that in Mediterranean-type climates the invasion of water-spending species and projected recurrent extreme drought events may synergistically cause critical drought tolerance thresholds of keystone tree species to be surpassed, corroborating the observed higher tree mortality in the invaded ecosystems. Ultimately, this may shift seasonally water-limited ecosystems into less desirable alternative states dominated by water-spending invasive shrubs. PMID:26461978
Zarcone, Alessandra; Padó, Sebastian; Lenci, Alessandro
2014-06-01
Logical metonymy resolution (begin a book → begin reading a book or begin writing a book) has traditionally been explained either through complex lexical entries (qualia structures) or through the integration of the implicit event via post-lexical access to world knowledge. We propose that recent work within the words-as-cues paradigm can provide a more dynamic model of logical metonymy, accounting for early and dynamic integration of complex event information depending on previous contextual cues (agent and patient). We first present a self-paced reading experiment on German subordinate sentences, where metonymic sentences and their paraphrased version differ only in the presence or absence of the clause-final target verb (Der Konditor begann die Glasur → Der Konditor begann, die Glasur aufzutragen/The baker began the icing → The baker began spreading the icing). Longer reading times at the target verb position in a high-typicality condition (baker + icing → spread ) compared to a low-typicality (but still plausible) condition (child + icing → spread) suggest that we make use of knowledge activated by lexical cues to build expectations about events. The early and dynamic integration of event knowledge in metonymy interpretation is bolstered by further evidence from a second experiment using the probe recognition paradigm. Presenting covert events as probes following a high-typicality or a low-typicality metonymic sentence (Der Konditor begann die Glasur → AUFTRAGEN/The baker began the icing → SPREAD), we obtain an analogous effect of typicality at 100 ms interstimulus interval. © 2014 Cognitive Science Society, Inc.
Nickel, Daniela Alba; Calvo, Maria Cristina Marino; Natal, Sonia; Freitas, Sérgio Fernando Torres de; Hartz, Zulmira Maria de Araújo
2014-04-01
This article analyzes evaluation capacity-building based on the case study of a State Health Secretariat participating in the Project to Strengthen the Technical Capacity of State Health Secretariats in Monitoring and Evaluating Primary Healthcare. The case study adopted a mixed design with information from documents, semi-structured interviews, and evaluation of primary care by the State Health Secretariat in 2008-2011. Process analysis was used to identify the logical events that contributed to evaluation capacity-building, with two categories: evaluation capacity-building events and events for building organizational structure. The logical chain of events was formed by negotiation and agreement on the decision-making levels for the continuity of evaluation, data collection and analysis by the State Health Secretariat, a change in key indicators, restructuring of the evaluation matrix, and communication of the results to the municipalities. The three-way analysis showed that the aim of developing evaluation capacity was achieved.
NASA Astrophysics Data System (ADS)
Lanzalaco, Felix; Pissanetzky, Sergio
2013-12-01
A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "action functional", as a theory in terms of its ability to provide superior explanatory power for the current neuroscientific data used to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of non-explicitly-programmed complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this, we review whether causal logic is consistent with, can explain, and can predict the function of complete perceptual processes associated with intelligence, primarily the range of event-related potentials (ERP), including their primary subcomponents, event-related desynchronization (ERD) and event-related synchronization (ERS). This approach aims for a universal and predictive logic for neurosimulation and AGI. The investigation has produced a general "information engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset: an information theory consistent with fundamental physics can serve as an AGI, can operate within genetic information space, and provides a roadmap to understanding the live biophysical operation of the phenotype.
Zang, Christian; Hartl-Meier, Claudia; Dittmar, Christoph; Rothe, Andreas; Menzel, Annette
2014-12-01
The future performance of native tree species under climate change conditions is frequently discussed, since increasingly severe and more frequent drought events are expected to become a major risk for forest ecosystems. To improve our understanding of the drought tolerance of the three common European temperate forest tree species Norway spruce, silver fir and common beech, we tested the influence of climate and tree-specific traits on the inter- and intrasite variability in drought responses of these species. Basal area increment data from a large tree-ring network in Southern Germany and Alpine Austria, along a climatic cline from warm-dry to cool-wet conditions, were used to calculate indices of tolerance to drought events and their variability at the level of individual trees and populations. General patterns of tolerance indicated a high vulnerability of Norway spruce in comparison to fir and beech, and a strong influence of bioclimatic conditions on drought response for all species. At the level of individual trees, low growth rates prior to drought events, high competitive status and low age favored resilience in growth response to drought. Consequently, drought events led to heterogeneous and variable response patterns in forest stands. These findings may support the idea of deliberately using spontaneous selection and adaptation effects as a passive strategy of forest management under climate change conditions, especially strong directional selection for more tolerant individuals as the frequency and intensity of summer droughts increase in the course of global climate change. © 2014 John Wiley & Sons Ltd.
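Tree-ring drought-tolerance indices of the kind described above are commonly the resistance/recovery/resilience ratios of Lloret et al. (2011), computed from basal area increment (BAI) around an event year. The abstract does not state which indices were used, so the sketch below is an assumption; function name, window length, and example values are illustrative.

```python
def drought_indices(bai, event_year, window=3):
    """Lloret-style drought indices from annual BAI keyed by year:
      resistance = growth in the event year / mean pre-event growth
      recovery   = mean post-event growth  / growth in the event year
      resilience = mean post-event growth  / mean pre-event growth"""
    pre = sum(bai[event_year - k] for k in range(1, window + 1)) / window
    post = sum(bai[event_year + k] for k in range(1, window + 1)) / window
    during = bai[event_year]
    return {"resistance": during / pre,
            "recovery": post / during,
            "resilience": post / pre}

# A tree that halves its growth in a drought year but fully recovers:
bai = {year: 2.0 for year in range(2000, 2011)}
bai[2003] = 1.0
idx = drought_indices(bai, 2003)
```

Comparing these ratios across trees and sites is what exposes the species- and site-level tolerance patterns the study reports.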
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
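The sampling step of such a procedure can be illustrated with plain Monte Carlo over the bottom events; this is a simplified sketch only (the tree structure, event names, and probabilities below are hypothetical, and the paper's adaptive importance sampling is more sophisticated than uniform sampling):

```python
import random

# Plain Monte Carlo estimate of a fault tree's top-event probability:
# sample each bottom event as an independent Bernoulli trial, then
# evaluate the gate logic. Hypothetical tree: TOP = (B1 OR B2) AND B3.
def sample_top(p_bottom, trials=100_000, seed=1):
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        b = {e: random.random() < p for e, p in p_bottom.items()}
        if (b["B1"] or b["B2"]) and b["B3"]:   # gate logic of the toy tree
            hits += 1
    return hits / trials

est = sample_top({"B1": 0.1, "B2": 0.2, "B3": 0.5})
# exact value for comparison: (1 - 0.9 * 0.8) * 0.5 = 0.14
```

Importance sampling would replace the uniform draws with a biased distribution concentrated on the dominant failure sequences, then reweight each sample accordingly.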
A Computation Infrastructure for Knowledge-Based Development of Reliable Software Systems
2006-11-10
Grant number: F045-023-0029 * Principal Investigator: David Gaspari, ATC-NY * Duration: May 2007 (assuming a successful review in 2005) * Source of... David Guaspari, Verifying Chain Replication in Event Logic, Cornell University Technical Report, to be published 2006 * Eli Barzilay, Implementing...and Reasoning, volume 2452 of Lecture Notes in Computer Science, pages 449-465, 2005. * Mark Bickford and David Guaspari, A Programming Logic for
Agent Based Modeling and Simulation Framework for Supply Chain Risk Management
2012-03-01
Christopher and Peck 2004) macroeconomic, policy, competition, and resource (Ghoshal 1987) value chain, operational, event, and recurring (Shi 2004...clustering algorithms in agent logic to protect company privacy (da Silva et al. 2006), aggregation of domain context in agent data analysis logic (Xiang...Operational Availability (OA) for FMC and PMC. 75 Mission Capable (MICAP) Hours is the measure of total time (in a month) consumable or reparable
Sequencing Events: Exploring Art and Art Jobs.
ERIC Educational Resources Information Center
Stephens, Pamela Geiger; Shaddix, Robin K.
2000-01-01
Presents an activity for upper-elementary students that correlates the actions of archaeologists, patrons, and artists with the sequencing of events in a logical order. Features ancient Egyptian art images. Discusses the preparation of materials, motivation, a pre-writing activity, and writing a story in sequence. (CMK)
SIGMA--A Graphical Approach to Teaching Simulation.
ERIC Educational Resources Information Center
Schruben, Lee W.
1992-01-01
SIGMA (Simulation Graphical Modeling and Analysis) is a computer graphics environment for building, testing, and experimenting with discrete event simulation models on personal computers. It uses symbolic representations (computer animation) to depict the logic of large, complex discrete event systems for easier understanding and has proven itself…
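The event-list mechanics that such tools animate can be sketched in a few lines; the event names, delays, and handler table below are illustrative inventions, not part of SIGMA:

```python
import heapq

# Minimal discrete event simulation core: a time-ordered future-event
# list processed in order, with each event scheduling follow-up events.
def simulate(initial_events, handlers, until=100):
    future = list(initial_events)          # (time, event_name) pairs
    heapq.heapify(future)
    log = []
    while future and future[0][0] <= until:
        t, name = heapq.heappop(future)
        log.append((t, name))
        for delay, nxt in handlers.get(name, ()):   # schedule follow-ups
            heapq.heappush(future, (t + delay, nxt))
    return log

# toy model: an arrival every 5 time units triggers a service done 2 later
handlers = {"arrive": [(5, "arrive"), (2, "done")]}
log = simulate([(0, "arrive")], handlers, until=10)
```

Graphical tools like SIGMA essentially let the user draw the `handlers` table as an event graph, with nodes as events and edges as scheduling relations.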
Probabilistically Bounded Staleness for Practical Partial Quorums
2012-01-03
probability of non-intersection between any two quorums decreases. To the best of our knowledge, probabilistic quorums have only been used to study the...Practice In practice, many distributed data management systems use quorums as a replication mechanism. Amazon’s Dynamo [21] is the progenitor of a...Abbadi. Resilient logical structures for efficient management of replicated data. In VLDB 1992. [9] D. Agrawal and A. E. Abbadi. The tree quorum
NASA Astrophysics Data System (ADS)
Gagen, Mary; McCarroll, Danny; Loader, Neil; Young, Giles; Robertson, Iain
2015-04-01
Stable carbon isotope (δ13C) measurements from the annual rings of trees are increasingly used to explore long-term changes in plant-carbon-water relations via changes in intrinsic water use efficiency (iWUE), the ratio of photosynthetic rate to stomatal conductance. Many studies report a significant increase in iWUE since industrialisation, which tracks rising global atmospheric CO2. Such changes are logical, as trees are known to change their stomatal geometry, number, and action in response to rising CO2. However, while increasing iWUE suggests physiological changes that should lead to increased growth, increasing iWUE is rarely matched by enhanced tree growth when tree rings are measured, despite increases of up to 30% in iWUE over the recent past (van der Sleen et al 2015). Explanations for the mismatch between iWUE and tree growth records encompass questions over the veracity of δ13C records for recording physiological change (Silva and Howarth 2013), suggestions that moisture stress in warming climates becomes a limit to growth and prevents opportunistic use of rising CO2 by trees (Andreu-Hayles et al 2011), and questions regarding the use of tree ring width, which does not record tree height gain, to record growth. Here we present an extensive range of long-term iWUE records, derived broadly from temperate, high-latitude, and one tropical forest site, to explore the palaeoclimatic perspective on the iWUE-fertilization conundrum in a spatiotemporally extensive manner.
Origins of Chaos in Autonomous Boolean Networks
NASA Astrophysics Data System (ADS)
Socolar, Joshua; Cavalcante, Hugo; Gauthier, Daniel; Zhang, Rui
2010-03-01
Networks with nodes consisting of ideal Boolean logic gates are known to display either steady states, periodic behavior, or an ultraviolet catastrophe where the number of logic-transition events circulating in the network per unit time grows as a power-law. In an experiment, non-ideal behavior of the logic gates prevents the ultraviolet catastrophe and may lead to deterministic chaos. We identify certain non-ideal features of real logic gates that enable chaos in experimental networks. We find that short-pulse rejection and the asymmetry between the logic states tends to engender periodic behavior. On the other hand, a memory effect termed ``degradation'' can generate chaos. Our results strongly suggest that deterministic chaos can be expected in a large class of experimental Boolean-like networks. Such devices may find application in a variety of technologies requiring fast complex waveforms or flat power spectra. The non-ideal effects identified here also have implications for the statistics of attractors in large complex networks.
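The discrete-time dynamics underlying such networks can be sketched with a synchronous Boolean network that classifies a trajectory as a steady state or a periodic orbit; the node rules below are illustrative, not those of the experimental autonomous (continuous-time) networks in the study:

```python
# Synchronous Boolean network: the state is a tuple of bits, and each
# node updates from a Boolean function of the others. A simplified
# discrete-time stand-in for the autonomous networks described above.
def step(state):
    a, b, c = state
    return (b ^ c, a ^ c, 1 - (a & b))   # node rules: XOR, XOR, NAND

def find_period(state, max_steps=100):
    seen = {}
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]       # period 1 = steady state
        seen[state] = t
        state = step(state)
    return None                          # no attractor found in max_steps

period = find_period((1, 0, 0))          # this trajectory hits a fixed point
```

In the autonomous (unclocked) experimental setting, the non-ideal gate effects the abstract identifies (short-pulse rejection, state asymmetry, degradation) determine whether the dynamics settle into such attractors or become chaotic.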
Selective Tree-ring Models: A Novel Method for Reconstructing Streamflow Using Tree Rings
NASA Astrophysics Data System (ADS)
Foard, M. B.; Nelson, A. S.; Harley, G. L.
2017-12-01
Surface water is among the most instrumental and vulnerable resources in the Northwest United States (NW). Recent observations show that overall water quantity is declining in streams across the region, while extreme flooding events occur more frequently. Historical streamflow models inform probabilities of extreme flow events (flood or drought) by describing the frequency and duration of past events. There are numerous examples of tree rings being utilized to reconstruct streamflow in the NW. These models confirm that tree rings are highly accurate at predicting streamflow; however, there are many nuances that limit their applicability through time and space. For example, most models predict streamflow from hydrologically altered rivers (e.g. dammed, channelized), which may hinder our ability to predict natural prehistoric flow. They also have a tendency to over- or under-predict extreme flow events. Moreover, they often neglect to capture the changing relationships between tree growth and streamflow over time and space. To address these limitations, we utilized national tree-ring and streamflow archives to investigate the relationships between the growth of multiple coniferous species and free-flowing streams across the NW using novel species- and site-specific streamflow models - a term we coined "selective tree-ring models." Correlation function analysis and regression modeling were used to evaluate the strengths and directions of the flow-growth relationships. Species with significant relationships in the same direction were identified as strong candidates for selective models. Temporal and spatial patterns of these relationships were examined using running correlations and inverse distance weighting interpolation, respectively. Our early results indicate that (1) species adapted to extreme climates (e.g. hot-dry, cold-wet) exhibit the most consistent relationships across space, (2) these relationships weaken in locations with mild climatic variability, and (3) some species appear to be strong candidates for predicting high flow events, while others may be better at predicting drought. These findings indicate that selective models may outperform traditional models when reconstructing distinctive aspects of streamflow.
NASA Astrophysics Data System (ADS)
Šilhán, Karel; Stoffel, Markus
2015-05-01
Different approaches and thresholds have been utilized in the past to date landslides with growth ring series of disturbed trees. Past work was mostly based on conifer species because of their well-defined ring boundaries and the easy identification of compression wood after stem tilting. More recently, work has been expanded to include broad-leaved trees, which are thought to produce fewer and less evident reactions after landsliding. This contribution reviews recent progress made in dendrogeomorphic landslide analysis and introduces a new approach in which landslides are dated via the ring eccentricity formed after tilting. We compare results of this new approach and the more conventional approaches. In addition, the paper also addresses tree sensitivity to landslide disturbance as a function of tree age and trunk diameter, using 119 common beech (Fagus sylvatica L.) and 39 Crimean pine (Pinus nigra ssp. pallasiana) trees growing on two landslide bodies. The landslide events reconstructed with the classical approach (reaction wood) also appear as events in the eccentricity analysis, but the inclusion of eccentricity clearly allowed more landslides (162%) to be detected in the tree-ring series. With respect to tree sensitivity, conifers and broad-leaved trees show the strongest reactions to landslides at ages between 40 and 60 years, with a second phase of increased sensitivity in P. nigra at ages of ca. 120-130 years. These phases of highest sensitivity correspond to trunk diameters at breast height of 6-8 and 18-22 cm, respectively (P. nigra). This study thus calls for the inclusion of eccentricity analyses in future landslide reconstructions, as well as for the selection of trees belonging to different age and diameter classes, to allow for a well-balanced and more complete reconstruction of past events.
Rymer, M.J.
2000-01-01
The Coachella Valley area was strongly shaken by the 1992 Joshua Tree (23 April) and Landers (28 June) earthquakes, and both events caused triggered slip on active faults within the area. Triggered slip associated with the Joshua Tree earthquake was on a newly recognized fault, the East Wide Canyon fault, near the southwestern edge of the Little San Bernardino Mountains. Slip associated with the Landers earthquake formed along the San Andreas fault in the southeastern Coachella Valley. Surface fractures formed along the East Wide Canyon fault in association with the Joshua Tree earthquake. The fractures extended discontinuously over a 1.5-km stretch of the fault, near its southern end. Sense of slip was consistently right-oblique, west side down, similar to the long-term style of faulting. Measured offset values were small, with right-lateral and vertical components of slip ranging from 1 to 6 mm and 1 to 4 mm, respectively. This is the first documented historic slip on the East Wide Canyon fault, which was first mapped only months before the Joshua Tree earthquake. Surface slip associated with the Joshua Tree earthquake most likely developed as triggered slip given its 5 km distance from the Joshua Tree epicenter and aftershocks. As revealed in a trench investigation, slip formed in an area with only a thin (<3 m thick) veneer of alluvium in contrast to earlier documented triggered slip events in this region, all in the deep basins of the Salton Trough. A paleoseismic trench study in an area of 1992 surface slip revealed evidence of two and possibly three surface faulting events on the East Wide Canyon fault during the late Quaternary, probably latest Pleistocene (first event) and mid- to late Holocene (second two events). About two months after the Joshua Tree earthquake, the Landers earthquake then triggered slip on many faults, including the San Andreas fault in the southeastern Coachella Valley. 
Surface fractures associated with this event formed discontinuous breaks over a 54-km-long stretch of the fault, from the Indio Hills southeastward to Durmid Hill. Sense of slip was right-lateral; only locally was there a minor (~1 mm) vertical component of slip. Measured dextral displacement values ranged from 1 to 20 mm, with the largest amounts found in the Mecca Hills where large slip values have been measured following past triggered-slip events.
The Inference of Gene Trees with Species Trees
Szöllősi, Gergely J.; Tannier, Eric; Daubin, Vincent; Boussau, Bastien
2015-01-01
This article reviews the various models that have been used to describe the relationships between gene trees and species trees. Molecular phylogeny has focused mainly on improving models for the reconstruction of gene trees based on sequence alignments. Yet, most phylogeneticists seek to reveal the history of species. Although the histories of genes and species are tightly linked, they are seldom identical, because genes duplicate, are lost or horizontally transferred, and because alleles can coexist in populations for periods that may span several speciation events. Building models describing the relationship between gene and species trees can thus improve the reconstruction of gene trees when a species tree is known, and vice versa. Several approaches have been proposed to solve the problem in one direction or the other, but in general neither gene trees nor species trees are known. Only a few studies have attempted to jointly infer gene trees and species trees. These models account for gene duplication and loss, transfer or incomplete lineage sorting. Some of them consider several types of events together, but none exists currently that considers the full repertoire of processes that generate gene trees along the species tree. Simulations as well as empirical studies on genomic data show that combining gene tree–species tree models with models of sequence evolution improves gene tree reconstruction. In turn, these better gene trees provide a more reliable basis for studying genome evolution or reconstructing ancestral chromosomes and ancestral gene sequences. We predict that gene tree–species tree methods that can deal with genomic data sets will be instrumental to advancing our understanding of genomic evolution. PMID:25070970
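The simplest of the reconciliation models the review covers, the classic LCA mapping for inferring gene duplications against a known species tree, can be sketched as follows (trees as nested tuples and the example genes are illustrative; real methods also handle losses, transfers, and incomplete lineage sorting):

```python
# LCA-mapping reconciliation: map each gene-tree node to the smallest
# species-tree clade containing its genes; an internal gene-tree node is
# a duplication if it maps to the same clade as one of its children.
def leaves(t):
    return {t} if isinstance(t, str) else leaves(t[0]) | leaves(t[1])

def lca(species_tree, taxa):
    node = species_tree
    while not isinstance(node, str):
        for child in node:
            if taxa <= leaves(child):   # descend while one child covers taxa
                node = child
                break
        else:
            return node                 # taxa span both children: LCA found
    return node

def duplications(gene_tree, species_tree):
    if isinstance(gene_tree, str):
        return 0
    m = lca(species_tree, leaves(gene_tree))
    dup = any(lca(species_tree, leaves(c)) == m for c in gene_tree)
    return int(dup) + sum(duplications(c, species_tree) for c in gene_tree)

stree = (("A", "B"), "C")
gtree = (("A", "B"), ("A", "C"))   # a second copy of A implies a duplication
ndup = duplications(gtree, stree)
```

Joint gene tree-species tree inference, as discussed above, effectively searches over gene trees while scoring reconciliations like this one together with the sequence likelihood.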
Reconciliation of Gene and Species Trees
Rusin, L. Y.; Lyubetskaya, E. V.; Gorbunov, K. Y.; Lyubetsky, V. A.
2014-01-01
The first part of the paper briefly overviews the problem of gene and species trees reconciliation with the focus on defining and algorithmic construction of the evolutionary scenario. Basic ideas are discussed for the aspects of mapping definitions, costs of the mapping and evolutionary scenario, imposing time scales on a scenario, incorporating horizontal gene transfers, binarization and reconciliation of polytomous trees, and construction of species trees and scenarios. The review does not intend to cover the vast diversity of literature published on these subjects. Instead, the authors strove to overview the problem of the evolutionary scenario as a central concept in many areas of evolutionary research. The second part provides detailed mathematical proofs for the solutions of two problems: (i) inferring a gene evolution along a species tree accounting for various types of evolutionary events and (ii) trees reconciliation into a single species tree when only gene duplications and losses are allowed. All proposed algorithms have a cubic time complexity and are mathematically proved to find exact solutions. The algorithms solving problem (ii) can be naturally extended to incorporate horizontal transfers, other evolutionary events, and time scales on the species tree. PMID:24800245
NASA Astrophysics Data System (ADS)
Caldeira, Maria; Lecomte, Xavier; David, Teresa; Pinto, Joaquim; Bugalho, Miguel; Werner, Christiane
2016-04-01
Extreme droughts and plant invasions are major drivers of global change that can critically affect ecosystem functioning. Shrub encroachment is increasing in many regions worldwide, and extreme events are projected to increase in frequency and intensity, namely in the Mediterranean region. Nevertheless, little is known about how these drivers may interact and affect ecosystem functioning and resilience to extreme droughts. Using a manipulative shrub removal experiment and the co-occurrence of an extreme drought event (2011/2012) in a Mediterranean woodland, we show that the native shrub invasion and extreme drought combined to reduce ecosystem transpiration and the resilience of the keystone oak tree species. We established six 25 x 25 m paired plots in a shrub (Cistus ladanifer L.) encroached Mediterranean cork-oak (Quercus suber L.) woodland. We measured sapflow and pre-dawn leaf water potential of trees and shrubs and soil water content in all plots during three years. We determined the resilience of tree transpiration to evaluate to what extent trees recovered from the extreme drought event. From February to November 2011 we conducted baseline measurements for plot comparison. In November 2011 all the shrubs were cut and removed from one of each pair of plots. Ecosystem transpiration was dominated by the water use of the invasive shrub, which further increased after the extreme drought. Simultaneously, tree transpiration in invaded plots declined much more strongly (67 ± 13%) than in plots cleared of shrubs (31 ± 11%) relative to the pre-drought year. Trees in invaded plots were not able to recover in the following wetter year, showing lower resilience to the extreme drought event. Our results imply that in Mediterranean-type climates invasion by water-spending species can combine with projected recurrent extreme droughts, causing critical drought tolerance thresholds of trees to be overcome and increasing the probability of tree mortality (Caldeira et al. 2015). 
Caldeira M.C., Lecomte X., David T.S., Pinto J.G., Bugalho M.N. & Werner C. (2015). Synergy of extreme drought and shrub invasion reduce ecosystem functioning and resilience in water-limited climates. Scientific Reports, 5, 15110.
Logic, Probability, and Human Reasoning
2015-01-01
Reasoning with exceptions: an event-related brain potentials study. J. Cogn. Neurosci. 23, 471-480 40 Baggio, G. et al. (2014) Logic as Marr’s...Johnson-Laird, P.N. (2013) Strategic changes in problem solving. J. Cogn. Psychol. 25, 165-173 5 Khemlani, S.S. et al. (2013) Kinematic mental simulations...and its application to Boolean systems. J. Cogn. Psychol. 25, 365-389 7 Beth, E.W. and Piaget, J. (1966) Mathematical Epistemology and Psychology
GIGA: a simple, efficient algorithm for gene tree inference in the genomic age
2010-01-01
Background Phylogenetic relationships between genes are not only of theoretical interest: they enable us to learn about human genes through the experimental work on their relatives in numerous model organisms from bacteria to fruit flies and mice. Yet the most commonly used computational algorithms for reconstructing gene trees can be inaccurate for numerous reasons, both algorithmic and biological. Additional information beyond gene sequence data has been shown to improve the accuracy of reconstructions, though at great computational cost. Results We describe a simple, fast algorithm for inferring gene phylogenies, which makes use of information that was not available prior to the genomic age: namely, a reliable species tree spanning much of the tree of life, and knowledge of the complete complement of genes in a species' genome. The algorithm, called GIGA, constructs trees agglomeratively from a distance matrix representation of sequences, using simple rules to incorporate this genomic age information. GIGA makes use of a novel conceptualization of gene trees as being composed of orthologous subtrees (containing only speciation events), which are joined by other evolutionary events such as gene duplication or horizontal gene transfer. An important innovation in GIGA is that, at every step in the agglomeration process, the tree is interpreted/reinterpreted in terms of the evolutionary events that created it. Remarkably, GIGA performs well even when using a very simple distance metric (pairwise sequence differences) and no distance averaging over clades during the tree construction process. Conclusions GIGA is efficient, allowing phylogenetic reconstruction of very large gene families and determination of orthologs on a large scale. It is exceptionally robust to adding more gene sequences, opening up the possibility of creating stable identifiers for referring to not only extant genes, but also their common ancestors. 
We compared trees produced by GIGA to those in the TreeFam database, and they were very similar in general, with most differences likely due to poor alignment quality. However, some remaining differences are algorithmic, and can be explained by the fact that GIGA tends to put a larger emphasis on minimizing gene duplication and deletion events. PMID:20534164
GIGA: a simple, efficient algorithm for gene tree inference in the genomic age.
Thomas, Paul D
2010-06-09
Phylogenetic relationships between genes are not only of theoretical interest: they enable us to learn about human genes through the experimental work on their relatives in numerous model organisms from bacteria to fruit flies and mice. Yet the most commonly used computational algorithms for reconstructing gene trees can be inaccurate for numerous reasons, both algorithmic and biological. Additional information beyond gene sequence data has been shown to improve the accuracy of reconstructions, though at great computational cost. We describe a simple, fast algorithm for inferring gene phylogenies, which makes use of information that was not available prior to the genomic age: namely, a reliable species tree spanning much of the tree of life, and knowledge of the complete complement of genes in a species' genome. The algorithm, called GIGA, constructs trees agglomeratively from a distance matrix representation of sequences, using simple rules to incorporate this genomic age information. GIGA makes use of a novel conceptualization of gene trees as being composed of orthologous subtrees (containing only speciation events), which are joined by other evolutionary events such as gene duplication or horizontal gene transfer. An important innovation in GIGA is that, at every step in the agglomeration process, the tree is interpreted/reinterpreted in terms of the evolutionary events that created it. Remarkably, GIGA performs well even when using a very simple distance metric (pairwise sequence differences) and no distance averaging over clades during the tree construction process. GIGA is efficient, allowing phylogenetic reconstruction of very large gene families and determination of orthologs on a large scale. It is exceptionally robust to adding more gene sequences, opening up the possibility of creating stable identifiers for referring to not only extant genes, but also their common ancestors. 
We compared trees produced by GIGA to those in the TreeFam database, and they were very similar in general, with most differences likely due to poor alignment quality. However, some remaining differences are algorithmic, and can be explained by the fact that GIGA tends to put a larger emphasis on minimizing gene duplication and deletion events.
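The agglomerative core of such an algorithm, building a tree topology by repeatedly joining the closest pair in a distance matrix, can be sketched as below. This is a generic single-linkage sketch with invented gene names, not GIGA itself, which additionally reinterprets each join in terms of speciation, duplication, and transfer events against the species tree:

```python
# Agglomerative tree construction from a symmetric dict-of-dicts matrix
# of pairwise sequence differences: repeatedly join the closest pair.
def agglomerate(dist):
    nodes = list(dist)
    tree = {n: n for n in nodes}
    while len(nodes) > 1:
        a, b = min(((x, y) for x in nodes for y in nodes if x != y),
                   key=lambda p: dist[p[0]][p[1]])
        new = a + b                        # label for the joined cluster
        tree[new] = (tree[a], tree[b])
        dist[new] = {}
        for other in nodes:
            if other not in (a, b):
                d = min(dist[a][other], dist[b][other])  # single linkage
                dist[new][other] = d
                dist[other][new] = d
        nodes = [n for n in nodes if n not in (a, b)] + [new]
    return tree[nodes[0]]

d = {"g1": {"g2": 1, "g3": 4},
     "g2": {"g1": 1, "g3": 5},
     "g3": {"g1": 4, "g2": 5}}
topology = agglomerate(d)                  # g1 and g2 join first
```

GIGA's event-aware rules replace the plain `min` joining criterion here, which is how the species tree and genome content information enter the reconstruction.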
Object-oriented fault tree evaluation program for quantitative analyses
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1988-01-01
Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
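The object representation described above can be sketched in Python rather than Flavors-extended LISP: each event object stores its reliability data and caches intermediate results on itself. The class design and the independence assumption are illustrative, not the paper's actual implementation:

```python
# Object representation of a fault tree: each event stores its children,
# its probability, and a cached intermediate result, loosely analogous
# to the Flavors-based design (statistically independent events assumed).
class Event:
    def __init__(self, name, prob=None, gate=None, children=()):
        self.name, self.prob, self.gate = name, prob, gate
        self.children = list(children)
        self._cached = None            # intermediate result kept on the object

    def probability(self):
        if self._cached is None:
            if not self.children:      # basic event: stored reliability data
                self._cached = self.prob
            elif self.gate == "AND":   # all children must fail
                p = 1.0
                for c in self.children:
                    p *= c.probability()
                self._cached = p
            else:                      # OR gate: any child failure suffices
                q = 1.0
                for c in self.children:
                    q *= 1.0 - c.probability()
                self._cached = 1.0 - q
        return self._cached

b1, b2 = Event("pump", 0.01), Event("valve", 0.02)
top = Event("loss_of_flow", gate="AND", children=[b1, b2])
p_top = top.probability()              # 0.01 * 0.02
```

Caching on the event object mirrors the paper's point that intermediate reduction results can be stored with the tree and reused as the design evolves.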
Intelligent fuzzy controller for event-driven real time systems
NASA Technical Reports Server (NTRS)
Grantner, Janos; Patyra, Marek; Stachowicz, Marian S.
1992-01-01
Most of the known linguistic models are essentially static, that is, time is not a parameter in describing the behavior of the object's model. In this paper we show a model for synchronous finite state machines based on fuzzy logic. Such finite state machines can be used to build both event-driven, time-varying, rule-based systems and the control unit section of a fuzzy logic computer. The architecture of a pipelined intelligent fuzzy controller is presented, and the linguistic model is represented by an overall fuzzy relation stored in a single rule memory. A VLSI integrated circuit implementation of the fuzzy controller is suggested. At a clock rate of 30 MHz, the controller can perform 3 MFLIPS on multi-dimensional fuzzy data.
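One step of such a machine can be sketched as max-min composition of a fuzzy state vector with a fuzzy transition relation, echoing the "overall fuzzy relation stored in a single rule memory" idea; the relation and membership values below are toy numbers, not from the paper:

```python
# Fuzzy finite state machine step: the state is a membership vector over
# crisp states, and one transition is max-min composition with a fuzzy
# relation R, where R[i][j] is the strength of the rule "state i -> j".
def maxmin_step(state, R):
    n = len(R[0])
    return [max(min(state[i], R[i][j]) for i in range(len(state)))
            for j in range(n)]

R = [[0.2, 0.9],       # rule memory: one overall fuzzy relation
     [0.7, 0.1]]
state = [1.0, 0.3]     # current fuzzy state membership
state = maxmin_step(state, R)
```

A pipelined hardware controller like the one proposed would evaluate all the `min` terms in parallel and reduce them with a max network, one composition per clock cycle.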
Defining and Enforcing Hardware Security Requirements
2011-12-01
Computer-Aided Design CPU Central Processing Unit CTL Computation Tree Logic DARPA The Defense Advanced Projects Research Agency DFF D-type Flip-Flop DNF...They too have no global knowledge of what is going on, nor any meaning to attach to any bit, whether storage or gating... it is we who attach...This option is prohibitively expensive with the current trends in the global distribution of the steps in IC design and fabrication. The second option
Evolutionary Data Mining Approach to Creating Digital Logic
2010-01-01
To deal with this problem a genetic program (GP) based data mining (DM) procedure has been invented (Smith 2005). A genetic program is an algorithm...that can operate on the variables. When a GP was used as a DM function in the past to automatically create fuzzy decision trees, the Report...rules represents an approach to determining the effect of linguistic imprecision, i.e., the inability of experts to provide crisp rules. The
Formal modeling of a system of chemical reactions under uncertainty.
Ghosh, Krishnendu; Schlipf, John
2014-10-01
We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
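A CTL query of the kind posed on such an abstraction can be sketched as a reachability (EF) check over an explicit-state transition system; the three-state model below is an invented stand-in for an abstracted reaction network, not the ERK pathway prototype:

```python
# CTL model checking of "EF phi" on an explicit-state abstraction:
# compute the least fixed point of states that satisfy phi or can
# reach a phi-state in one transition.
def ef(states, trans, phi):
    sat = {s for s in states if phi(s)}
    changed = True
    while changed:
        changed = False
        for s in states:
            if s not in sat and any(t in sat for t in trans.get(s, ())):
                sat.add(s)
                changed = True
    return sat

# toy abstraction: concentration levels as abstract states
states = {"low", "mid", "high"}
trans = {"low": ["mid"], "mid": ["low", "high"], "high": ["high"]}
reach_high = ef(states, trans, lambda s: s == "high")
```

Interval and midpoint approximations of uncertain rates, as in the paper, change which transitions exist in `trans`; the fixed-point evaluation of the query is unchanged.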
Psycho-logic: some thoughts and after-thoughts.
Smedslund, J
2012-08-01
The main features of the system of psycho-logic and its historical origins, especially in the writings of Heider and Piaget, are briefly reviewed. An updated version of the axioms of psycho-logic, and a list of the semantic primitives of Wierzbicka are presented. Some foundational questions are discussed, including the genetically determined limitations of human knowledge, the constructive, moral, and political nature of the approach, the role of fortuitous events, the ultimate limitations of psychological knowledge (the "balloon" to be inflated from the inside), the role of the subjective unconscious, and the implications of the approach for practice. © 2012 The Author. Scandinavian Journal of Psychology © 2012 The Scandinavian Psychological Associations.
Remote Detection and Modeling of Abrupt and Gradual Tree Mortality in the Southwestern USA
NASA Astrophysics Data System (ADS)
Muss, J. D.; Xu, C.; McDowell, N. G.
2014-12-01
Current climate models predict a warming and drying trend that has a high probability of increasing the frequency and spatial extent of tree mortality events. Field surveys can be used to identify, date, and attribute a cause of mortality to specific trees, but monetary and time constraints prevent broad-scale surveys, which are necessary to establish regional or global trends in tree mortality. This is significant because widespread forest mortality will likely lead to radical changes in evapotranspiration and surface albedo, which could compound climate change. While understanding the causes and mechanisms of tree mortality events is crucial, it is equally important to be able to detect and monitor mortality and subsequent changes to the ecosystem at broad spatial and temporal scales. Over the past five years our ability to remotely detect abrupt forest mortality events has improved greatly, but gradual events—such as those caused by drought or certain types of insects—are still difficult to identify. Moreover, it is virtually impossible to quantify the amount of mortality that has occurred within a mixed pixel. We have developed a system that fuses climate and satellite-derived spectral data to identify both the date and the agent of forest mortality events. This system has been used with Landsat time series data to detect both abrupt and general trends in tree loss that have occurred during the past quarter-century in northern New Mexico. It has also been used with MODIS data to identify pixels with a high likelihood of drought-caused tree mortality in the Southwestern US. These candidate pixels were then fed to ED-FRT, a coupled forest dynamics-radiative transfer model, to generate estimates of drought-induced mortality. We demonstrate a multi-scale approach that can produce results that will be instrumental in advancing our understanding of tree mortality-climate feedbacks and improving our ability to predict what forests could look like in the future.
Fragments of Science: Festschrift for Mendel Sachs
NASA Astrophysics Data System (ADS)
Ram, Michael
1999-11-01
The Table of Contents for the full book PDF is as follows: * Preface * Sketches at a Symposium * For Mendel Sachs * The Constancy of an Angular Point of View * Information-Theoretic Logic and Transformation-Theoretic Logic * The Invention of the Transistor and the Realization of the Hole * Mach's Principle, Newtonian Gravitation, Absolute Space, and Einstein * The Sun, Our Variable Star * The Inconstant Sun: Symbiosis of Time Variations of Sunspots, Atmospheric Radiocarbon, Aurorae, and Tree Ring Growth * Other Worlds * Super-Classical Quantum Mechanics * A Probabilistic Approach to the Phase Problem of X-Ray Crystallography * A Nonlinear Twist on Inertia Gives Unified Electroweak Gravitation * Neutrino Oscillations * On an Incompleteness in the General-Relativistic Description of Gravitation * All Truth is One * Ideas of Physics: Correspondence between Colleagues * The Influence of the Physics and Philosophy of Einstein's Relativity on My Attitudes in Science: An Autobiography
Query Language for Location-Based Services: A Model Checking Approach
NASA Astrophysics Data System (ADS)
Hoareau, Christian; Satoh, Ichiro
We present a model checking approach to the rationale, implementation, and applications of a query language for location-based services. Such query mechanisms are necessary so that users, objects, and/or services can effectively benefit from the location-awareness of their surrounding environment. The underlying data model is founded on a symbolic model of space organized in a tree structure. Once extended to a semantic model for modal logic, we regard location query processing as a model checking problem, and thus define location queries as hybrid logic-based formulas. Our approach is unique among existing research because it explores the connection between location models and query processing in ubiquitous computing systems, relies on a sound theoretical basis, and provides modal logic-based query mechanisms for expressive searches over a decentralized data structure. A prototype implementation is also presented and discussed.
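The tree-structured location model described above lends itself to a compact illustration. The sketch below (location names and structure invented for the example, not taken from the paper's prototype) reduces one basic kind of location query, "is an entity somewhere inside a given location?", to a recursive check over the tree, which is the essence of the model-checking view:

```python
# Minimal sketch of query evaluation over a symbolic, tree-structured
# space model. A containment query is answered by checking satisfaction
# over the subtree rooted at a location (illustrative only).

class Location:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

    def contains(self, name):
        """Does this subtree contain a location/entity named `name`?"""
        if self.name == name:
            return True
        return any(child.contains(name) for child in self.children)

# Hypothetical space model: building -> floor1 -> room101 -> badge
badge = Location("badge")
room = Location("room101", [badge])
building = Location("building", [Location("floor1", [room])])

print(building.contains("badge"))  # True: the badge is inside the building
print(room.contains("floor1"))     # False: floor1 is not below room101
```

Richer hybrid-logic queries (naming specific nodes, combining modalities) would build on the same subtree traversal.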
Survey of critical failure events in on-chip interconnect by fault tree analysis
NASA Astrophysics Data System (ADS)
Yokogawa, Shinji; Kunii, Kyousuke
2018-07-01
In this paper, a framework based on reliability physics is proposed for applying fault tree analysis (FTA) to the on-chip interconnect system of a semiconductor. By integrating expert knowledge and experience regarding the possibilities of failure of basic events, critical issues of on-chip interconnect reliability are evaluated by FTA. In particular, FTA is used to identify the minimal cut sets with high risk priority. Critical events affecting on-chip interconnect reliability are identified and discussed from the viewpoint of long-term reliability assessment. The moisture impact is evaluated as an external event.
Vegetation optical depth measured by microwave radiometry as an indicator of tree mortality risk
NASA Astrophysics Data System (ADS)
Rao, K.; Anderegg, W.; Sala, A.; Martínez-Vilalta, J.; Konings, A. G.
2017-12-01
Increased drought-related tree mortality has been observed across several regions in recent years. Vast spatial extent and high temporal variability make field monitoring of tree mortality cumbersome and expensive. With global coverage and high temporal revisit, satellite remote sensing offers an unprecedented tool to monitor terrestrial ecosystems and identify areas at risk of large drought-driven tree mortality events. To date, studies that use remote sensing data to monitor tree mortality have focused on external climatic thresholds such as temperature and evapotranspiration. However, this approach fails to consider internal water stress in vegetation, which can vary across trees even under similar climatic conditions due to differences in hydraulic behavior, soil type, etc., and may therefore be a poor basis for measuring mortality events. There is a consensus that xylem hydraulic failure often precedes drought-induced mortality, suggesting depleted canopy water content shortly before onset of mortality. Observations of vegetation optical depth (VOD) derived from passive microwave radiometry are proportional to canopy water content. In this study, we propose to use variations in VOD as an indicator of potential tree mortality. Since VOD accounts for intrinsic water stress undergone by vegetation, it is expected to be more accurate than external climatic stress indicators. Analysis of tree mortality events in California, USA observed by airborne detection shows a consistent relationship between mortality and the proposed VOD metric. Although this approach is limited by the kilometer-scale resolution of passive microwave radiometry, our results nevertheless demonstrate that microwave-derived estimates of vegetation water content can be used to study drought-driven tree mortality, and may be a valuable tool for mortality predictions if they can be combined with higher-resolution variables.
NASA Astrophysics Data System (ADS)
Liu, Tianqi; Yang, Zhenlei; Guo, Jinlong; Du, Guanghua; Tong, Teng; Wang, Xiaohui; Su, Hong; Liu, Wenjing; Liu, Jiande; Wang, Bin; Ye, Bing; Liu, Jie
2017-08-01
The heavy-ion imaging of single event upset (SEU) in a flash-based field programmable gate array (FPGA) device was carried out for the first time at the Heavy Ion Research Facility in Lanzhou (HIRFL). Three shift register chains with separate input and output configurations in the device under test (DUT) were used to rapidly identify the corresponding logical area once an upset occurred. The logic units in the DUT were partly configured in order to distinguish the registers in SEU images. Based on the above settings, the partial architecture of the shift register chains in the DUT was imaged using a microbeam of 86Kr ions with an energy of 25 MeV/u in air. The results showed that the physical distribution of registers in the DUT was highly consistent with their logical arrangement, as shown by comparing the SEU image with the logic configuration in the scanned area.
Users Guide to Direct Digital Control of Heating, Ventilating, and Air Conditioning Equipment,
1985-01-01
cycles, reset, load shedding, chiller optimization, VAV fan synchronization, and optimum start/stop. The prospective buyer of a DDC system should...in Figure 4. Data on setpoints, reset schedules, and event timing, such as that presented in Figure 6, are often even more difficult to find. In con...control logic, setpoint and other data are readily available. Program logic, setpoint and schedule data, and other information stored in a DDC unit
Anger biting. The hidden impulse.
Walter, R D
1985-09-01
Based upon the paralogical reasoning of the anger-impulsive biter, this paper addresses the overload of emotional catharsis which can block a full memory of the biting event and suspend the logical infrastructure of rational behavior. In an effort to overcome these types of investigative difficulties, the paper suggests an approach to resolving the dilemma by decompressing the emotional content into pathways of logical understanding. By offering a network of rational hooks, the perpetrator becomes better equipped to acknowledge the deed.
The spatiotemporal order of signaling events unveils the logic of development signaling.
Zhu, Hao; Owen, Markus R; Mao, Yanlan
2016-08-01
Animals from worms and insects to birds and mammals show distinct body plans; however, the embryonic development of diverse body plans with tissues and organs within is controlled by surprisingly few signaling pathways. It is well recognized that combinatorial use of and dynamic interactions among signaling pathways follow specific logic to control complex and accurate developmental signaling and patterning, but it remains elusive what such logic is, or even, what it looks like. We have developed a computational model for Drosophila eye development with innovative methods to reveal how interactions among multiple pathways control the dynamically generated hexagonal array of R8 cells. We obtained two novel findings. First, the coupling between the long-range inductive signals produced by the proneural Hh signaling and the short-range restrictive signals produced by the antineural Notch and EGFR signaling is essential for generating accurately spaced R8s. Second, the spatiotemporal orders of key signaling events reveal a robust pattern of lateral inhibition conducted by Ato-coordinated Notch and EGFR signaling to collectively determine R8 patterning. This pattern, stipulating the orders of signaling and comparable to the protocols of communication, may help decipher the well-appreciated but poorly defined logic of developmental signaling. The model is available upon request. hao.zhu@ymail.com Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
The Construction of Impossibility: A Logic-Based Analysis of Conjuring Tricks
Smith, Wally; Dignum, Frank; Sonenberg, Liz
2016-01-01
Psychologists and cognitive scientists have long drawn insights and evidence from stage magic about human perceptual and attentional errors. We present a complementary analysis of conjuring tricks that seeks to understand the experience of impossibility that they produce. Our account is first motivated by insights about the constructional aspects of conjuring drawn from magicians' instructional texts. A view is then presented of the logical nature of impossibility as an unresolvable contradiction between a perception-supported belief about a situation and a memory-supported expectation. We argue that this condition of impossibility is constructed not simply through misperceptions and misattentions, but rather it is an outcome of a trick's whole structure of events. This structure is conceptualized as two parallel event sequences: an effect sequence that the spectator is intended to believe; and a method sequence that the magician understands as happening. We illustrate the value of this approach through an analysis of a simple close-up trick, Martin Gardner's Turnabout. A formalism called propositional dynamic logic is used to describe some of its logical aspects. This elucidates the nature and importance of the relationship between a trick's effect sequence and its method sequence, characterized by the careful arrangement of four evidence relationships: similarity, perceptual equivalence, structural equivalence, and congruence. The analysis further identifies two characteristics of magical apparatus that enable the construction of apparent impossibility: substitutable elements and stable occlusion. PMID:27378959
The spatiotemporal order of signaling events unveils the logic of development signaling
Zhu, Hao; Owen, Markus R.; Mao, Yanlan
2016-01-01
Motivation: Animals from worms and insects to birds and mammals show distinct body plans; however, the embryonic development of diverse body plans with tissues and organs within is controlled by surprisingly few signaling pathways. It is well recognized that combinatorial use of and dynamic interactions among signaling pathways follow specific logic to control complex and accurate developmental signaling and patterning, but it remains elusive what such logic is, or even, what it looks like. Results: We have developed a computational model for Drosophila eye development with innovative methods to reveal how interactions among multiple pathways control the dynamically generated hexagonal array of R8 cells. We obtained two novel findings. First, the coupling between the long-range inductive signals produced by the proneural Hh signaling and the short-range restrictive signals produced by the antineural Notch and EGFR signaling is essential for generating accurately spaced R8s. Second, the spatiotemporal orders of key signaling events reveal a robust pattern of lateral inhibition conducted by Ato-coordinated Notch and EGFR signaling to collectively determine R8 patterning. This pattern, stipulating the orders of signaling and comparable to the protocols of communication, may help decipher the well-appreciated but poorly defined logic of developmental signaling. Availability and implementation: The model is available upon request. Contact: hao.zhu@ymail.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153573
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Boerschlein, David P.
1993-01-01
Fault-Tree Compiler (FTC) program is software tool used to calculate probability of top event in fault tree. Gates of five different types allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language easy to understand and use. In addition, program supports hierarchical fault-tree definition feature, which simplifies tree-description process and reduces execution time. Set of programs created forming basis for reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions available upon request.
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Martensen, Anna L.
1992-01-01
FTC, Fault-Tree Compiler program, is reliability-analysis software tool used to calculate probability of top event of fault tree. Five different types of gates allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language of FTC easy to understand and use. Program supports hierarchical fault-tree-definition feature, simplifying tree description and reducing execution time. Solution technique implemented in FORTRAN, and user interface in Pascal. Written to run on DEC VAX computer operating under VMS operating system.
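The gate set listed for FTC (AND, OR, EXCLUSIVE OR, INVERT, and M OF N) maps to simple probability formulas when the basic events are assumed independent, the textbook setting, which the record does not state explicitly. The sketch below illustrates those formulas only; it is not FTC's actual solution technique:

```python
from itertools import combinations
from math import prod

def gate_prob(kind, ps, m=None):
    """Probability of a gate's output event, given independent input-event
    probabilities `ps`. Gate types mirror those listed for FTC."""
    if kind == "AND":
        return prod(ps)
    if kind == "OR":
        return 1.0 - prod(1.0 - p for p in ps)
    if kind == "XOR":  # exactly one input occurs (one common convention)
        return sum(p * prod(1.0 - q for j, q in enumerate(ps) if j != i)
                   for i, p in enumerate(ps))
    if kind == "INVERT":
        (p,) = ps
        return 1.0 - p
    if kind == "MOFN":  # at least m of the n inputs occur (assumed here)
        n = len(ps)
        return sum(prod(ps[i] if i in idx else 1.0 - ps[i] for i in range(n))
                   for k in range(m, n + 1)
                   for idx in combinations(range(n), k))
    raise ValueError(kind)

# Hypothetical top event = OR(AND(a, b), c), independent basic events
a, b, c = 0.1, 0.2, 0.05
top = gate_prob("OR", [gate_prob("AND", [a, b]), c])
print(round(top, 6))  # 0.069
```

Evaluating a full tree is then a bottom-up traversal applying `gate_prob` at each gate, provided no basic event feeds more than one gate (shared events need the cut-set treatment).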
Perception or Fact: Measuring the Effectiveness of the Terrorism Early Warning (TEW) Group
2005-09-01
alternatives” (Campbell 2005). The logic model process is a tool that has been used by evaluators for many years to identify performance measures and...pertinent information is obtained, this cell is responsible for the development (pre-event) and use (trans- and post-event) of playbooks and...
Adapting current Arden Syntax knowledge for an object oriented event monitor.
Choi, Jeeyae; Lussier, Yves A; Mendoça, Eneida A
2003-01-01
Arden Syntax for Medical Logic Modules (MLMs) [1] was designed for writing and sharing task-specific health knowledge in 1989. Several researchers have developed frameworks to improve the sharability and adaptability of Arden Syntax MLMs, an issue known as the "curly braces" problem. Karadimas et al proposed an Arden Syntax MLM-based decision support system that uses an object oriented model and the dynamic linking features of the Java platform [2]. Peleg et al proposed creating a Guideline Expression Language (GEL) based on Arden Syntax's logic grammar [3]. The New York Presbyterian Hospital (NYPH) has a collection of about 200 MLMs. In the process of adapting the current MLMs for an object-oriented event monitor, we identified two problems that may contribute to the "curly braces" one: (1) the query expressions within the curly braces of Arden Syntax used in our institution are cryptic to physicians, institution dependent, and written ineffectively (unpublished results), and (2) the events are coded individually within curly braces, sometimes resulting in a large number of events, up to 200.
Risk Analysis of Return Support Material on Gas Compressor Platform Project
NASA Astrophysics Data System (ADS)
Silvianita; Aulia, B. U.; Khakim, M. L. N.; Rosyid, Daniel M.
2017-07-01
Fixed platform projects are often carried out not by a single contractor but by two or more. Cooperation in the construction of fixed platforms frequently does not go according to plan, owing to several factors, and good synergy among the contractors is needed to avoid miscommunication that may cause problems on the project. One example concerns the support material (sea fastening, skid shoes, and shipping supports) used when a jacket structure is shipped to its operating site, which often is not returned to the contractor. A systematic method is needed to address this problem. This paper analyses the causes and effects of unreturned support material on the Gas Compressor Platform project, using Fault Tree Analysis (FTA) and Event Tree Analysis (ETA). From the fault tree analysis, the probability of the top event is 0.7783. From the event tree analysis diagram, the contractors lose between Rp 350,000,000 and Rp 10,000,000,000.
Mines Systems Safety Improvement Using an Integrated Event Tree and Fault Tree Analysis
NASA Astrophysics Data System (ADS)
Kumar, Ranjan; Ghosh, Achyuta Krishna
2017-04-01
Mine systems such as the ventilation system, strata support system, and flameproof safety equipment are exposed to dynamic operational conditions such as stress, humidity, dust, temperature, etc., and safety improvement of such systems is preferably done during the planning and design stage. However, the existing safety analysis methods do not handle the accident initiation and progression of mine systems explicitly. To bridge this gap, this paper presents an integrated Event Tree (ET) and Fault Tree (FT) approach for safety analysis and improvement of mine systems design. This approach includes ET and FT modeling coupled with a redundancy allocation technique. In this method, a concept of top hazard probability is introduced for identifying system failure probability, and redundancy is allocated to the system either at the component or the system level. A case study on mine methane explosion safety with two initiating events is performed. The results demonstrate that the presented method can reveal the accident scenarios and improve the safety of complex mine systems simultaneously.
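The integrated ET/FT idea, an initiating event followed by safety barriers whose failure probabilities would each come from a fault-tree top event, can be sketched as follows. The initiating event, barriers, and all numbers are hypothetical, not taken from the paper's methane case study:

```python
# Illustrative event-tree enumeration: each barrier's failure probability
# would be supplied by a fault-tree analysis; the event tree then
# enumerates accident sequences from the initiating event.

def event_tree_outcomes(init_freq, barrier_fail_probs):
    """Enumerate sequences as (outcome_bits, frequency); bit True means
    that barrier failed."""
    seqs = [((), init_freq)]
    for pf in barrier_fail_probs:
        seqs = [(bits + (failed,), f * (pf if failed else 1.0 - pf))
                for bits, f in seqs
                for failed in (False, True)]
    return seqs

# Hypothetical: methane release at 0.01/yr; ventilation barrier fails
# with probability 0.05, ignition-control barrier with probability 0.1.
seqs = event_tree_outcomes(0.01, [0.05, 0.1])
worst = dict(seqs)[(True, True)]  # both barriers fail: explosion scenario
print(round(worst, 10))  # 5e-05
```

Redundancy allocation then amounts to recomputing a barrier's failure probability with added redundant components and observing the reduction in the worst-sequence frequency.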
Improving the flash flood frequency analysis applying dendrogeomorphological evidences
NASA Astrophysics Data System (ADS)
Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.
2009-09-01
Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they provoke high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events when they are damaged or the discharge exceeds the recordable level. In this case, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historic documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait.) 
influenced by flash flood events were sampled using an increment borer. For each tree sampled, additional information was recorded, including the geographical position (GPS measure), the geomorphological situation based on a detailed geomorphological map, the social position within neighbouring trees, a description of the external disturbances, and information on tree diameter, tree height, and the position of the cores extracted. 265 cores were collected. In the laboratory, the 265 samples were analyzed using the standard methods: surface preparation, counting of tree rings, and measuring of ring widths using a digital LINTAB positioning table and TSAP 4.6 software. Increment curves of the disturbed trees were then crossdated with a reference chronology in order to correct faulty tree-ring series derived from disturbed samples and to determine initiation of abrupt growth suppression or release. The trees at this field site are between 50 and 100 years old. In the field most of the trees were tilted (93%) and showed exposed roots (64%). In the laboratory, growth suppressions were detected in 165 samples. Based on the number of trees showing disturbances, the intensity of the disturbance, and the spatial distribution of the trees in the field, seven well represented events were dated for the last 50 years: 2005, 2000, 1996, 1976, 1973, 1966 and 1963. The second field site was a 2 km long reach along the Arenal River, where the stream is channelized. Here stumps from previously felled trees could be analyzed directly in the field. 100 Alnus glutinosa (L.) Gaertn. and Fraxinus angustifolia (Vahl.) cross sections were investigated in order to date internal wounds. Different carpenter tools, sanding paper, and magnifying glasses were used to count tree rings and to date the wounds in the field. In addition to the dating in the field, 22 cross sections were sampled and analyzed in the laboratory using the standard methods. 
The age of the trees ranges between 30 and 50 years. Based on the injuries dated in the field and in the laboratory, and based on the location of the trees, 8 main events were dated for the last 30 years: 2005, 2003, 2000, 1998, 1997, 1995, 1993 and 1978. Additional results are in progress, such as the amount of rainfall responsible for triggering the events, estimation of the magnitude, and the influence of the channelization in the case of the Arenal River. The strength of dendrogeomorphology in flood analysis has been demonstrated, especially in areas where the lack of historical documents, rainfall, and flow data limits the use of traditional methods.
Priority Queues for Computer Simulations
NASA Technical Reports Server (NTRS)
Steinman, Jeffrey S. (Inventor)
1998-01-01
The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, the highest-priority item is removed, and the rest of the list is inserted into the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
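The Qheap's deferred-sorting idea described above can be sketched briefly. This is an illustration of the concept only, not the patented linked-list implementation; a binary heap stands in for the sorted backbone:

```python
import heapq

class QheapSketch:
    """Illustrative sketch of the Qheap idea: insertions go to a cheap
    unsorted buffer; only when the next event is demanded is the buffer
    sorted and merged into the main priority structure."""

    def __init__(self):
        self._heap = []    # sorted backbone (stand-in for the Qheap)
        self._buffer = []  # temporary unsorted list

    def push(self, timestamp, event):
        self._buffer.append((timestamp, event))  # O(1), no ordering work

    def pop(self):
        if self._buffer:  # lazily sort the buffer, then merge it in
            self._buffer.sort()
            for item in self._buffer:
                heapq.heappush(self._heap, item)
            self._buffer.clear()
        return heapq.heappop(self._heap)  # highest-priority (earliest) event

q = QheapSketch()
for t, e in [(5.0, "e5"), (1.0, "e1"), (3.0, "e3")]:
    q.push(t, e)
print(q.pop())  # (1.0, 'e1')
```

Deferring the sort is what the event-horizon idea exploits in a discrete-event simulation: events enqueued within one horizon can be batched and ordered together instead of paying per-insert ordering cost.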
Pedologic and geomorphic impacts of a tornado blowdown event in a mixed pine-hardwood forest
Jonathan D. Phillips; Daniel A. Marion; Alice V. Turkington
2008-01-01
Biomechanical effects of trees on soils and surface processes may be extensive in forest environments. Two blowdown sites caused by a November 2005 tornado in the Ouachita National Forest, Arkansas allowed a case study examination of bioturbation associated with a specific forest blowdown event, as well as detailed examination of relationships between tree root systems...
A fast bottom-up algorithm for computing the cut sets of noncoherent fault trees
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corynen, G.C.
1987-11-01
An efficient procedure for finding the cut sets of large fault trees has been developed. Designed to address coherent or noncoherent systems, dependent events, and shared or common-cause events, the method, called SHORTCUT, is based on a fast algorithm for transforming a noncoherent tree into a quasi-coherent tree (COHERE), and on a new algorithm for reducing cut sets (SUBSET). To assure sufficient clarity and precision, the procedure is discussed in the language of simple sets, which is also developed in this report. Although the new method has not yet been fully implemented on the computer, we report theoretical worst-case estimates of its computational complexity. 12 refs., 10 figs.
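Cut-set reduction of the kind SUBSET performs rests on subsumption: a cut set that contains another cut set is redundant, because the smaller set already fails the system. A minimal sketch of such a reduction, assuming simple monotone (coherent) semantics rather than the report's noncoherent treatment:

```python
def minimize_cut_sets(cut_sets):
    """Remove subsumed cut sets. Processing sets in order of increasing
    size means a set can only be subsumed by an earlier, smaller set that
    was already kept (illustrative sketch, not the report's algorithm)."""
    minimal = []
    for cs in sorted(map(frozenset, cut_sets), key=len):
        if not any(kept <= cs for kept in minimal):  # kept is a subset?
            minimal.append(cs)
    return minimal

cuts = [{"A", "B"}, {"A", "B", "C"}, {"C"}, {"C", "D"}]
print(sorted(sorted(c) for c in minimize_cut_sets(cuts)))
# [['A', 'B'], ['C']]
```

The quadratic pairwise comparison here is the naive version; the point of algorithms like SUBSET is to avoid generating and comparing the many subsumed sets in the first place.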
A fuzzy Petri-net-based mode identification algorithm for fault diagnosis of complex systems
NASA Astrophysics Data System (ADS)
Propes, Nicholas C.; Vachtsevanos, George
2003-08-01
Complex dynamical systems such as aircraft, manufacturing systems, chillers, motor vehicles, submarines, etc. exhibit continuous and event-driven dynamics. These systems undergo several discrete operating modes from startup to shutdown. For example, a certain shipboard system may be operating at half load or full load, or may be at start-up or shutdown. Of particular interest are extreme or "shock" operating conditions, which tend to severely impact fault diagnosis or the progression of a fault leading to a failure. Fault conditions are strongly dependent on the operating mode. Therefore, it is essential that in any diagnostic/prognostic architecture, the operating mode be identified as accurately as possible so that such functions as feature extraction, diagnostics, prognostics, etc. can be correlated with the predominant operating conditions. This paper introduces a mode identification methodology that incorporates both time- and event-driven information about the process. A fuzzy Petri net is used to represent the possible successive mode transitions and to detect events from processed sensor signals signifying a mode change. The operating mode is initialized and verified by analysis of the time-driven dynamics through a fuzzy logic classifier. An evidence combiner module is used to combine the results from both the fuzzy Petri net and the fuzzy logic classifier to determine the mode. Unlike most event-driven mode identifiers, this architecture provides automatic mode initialization through the fuzzy logic classifier and robustness through combining the evidence from the two algorithms. The mode identification methodology is applied to an AC plant typically found as a component of a shipboard system.
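The time-driven side of mode identification, a fuzzy logic classifier scoring candidate modes from processed sensor features, can be illustrated with a toy example. The mode names, the single "load" feature, and the triangular membership functions below are invented for illustration; the paper's classifier, and its evidence combination with the fuzzy Petri net, are richer:

```python
# Toy fuzzy classifier: score each candidate operating mode by the
# membership of a sensor feature, then pick the best-supported mode.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

MODES = {  # membership of a normalized "load" feature in each mode (hypothetical)
    "shutdown":  lambda x: tri(x, -1.0, 0.0, 0.2),
    "half_load": lambda x: tri(x, 0.2, 0.5, 0.8),
    "full_load": lambda x: tri(x, 0.7, 1.0, 2.0),
}

def classify(load):
    scores = {mode: f(load) for mode, f in MODES.items()}
    return max(scores, key=scores.get), scores

mode, scores = classify(0.55)
print(mode)  # half_load
```

In the paper's architecture these scores would be one input to the evidence combiner, alongside the event-driven mode estimate from the fuzzy Petri net.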
SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis Smith; James Knudsen
As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven mission phases. The mission requires that the propulsion system operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and how they are linked into the event tree. 
The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
Research frontiers for improving our understanding of drought‐induced tree and forest mortality
Hartmann, Henrik; Moura, Catarina; Anderegg, William R. L.; Ruehr, Nadine; Salmon, Yann; Allen, Craig D.; Arndt, Stefan K.; Breshears, David D.; Davi, Hendrik; Galbraith, David; Ruthrof, Katinka X.; Wunder, Jan; Adams, Henry D.; Bloemen, Jasper; Cailleret, Maxime; Cobb, Richard; Gessler, Arthur; Grams, Thorsten E. E.; Jansen, Steven; Kautz, Markus; Lloret, Francisco; O’Brien, Michael
2018-01-01
Accumulating evidence highlights increased mortality risks for trees during severe drought, particularly under warmer temperatures and increasing vapour pressure deficit (VPD). Resulting forest die‐off events have severe consequences for ecosystem services and for biophysical and biogeochemical land–atmosphere processes. Despite advances in monitoring, modelling and experimental studies of the causes and consequences of tree death from individual tree to ecosystem and global scale, a general mechanistic understanding and realistic predictions of drought mortality under future climate conditions are still lacking. We update a global tree mortality map and present a roadmap to a more holistic understanding of forest mortality across scales. We highlight priority research frontiers that promote: (1) new avenues for research on key tree ecophysiological responses to drought; (2) scaling from the tree/plot level to the ecosystem and region; (3) improvements of mortality risk predictions based on both empirical and mechanistic insights; and (4) a global monitoring network of forest mortality. In light of recent and anticipated large forest die‐off events, such a research agenda is timely and needed to achieve scientific understanding for realistic predictions of drought‐induced tree mortality. The implementation of a sustainable network will require support by stakeholders and political authorities at the international level.
NASA Astrophysics Data System (ADS)
Tonini, Roberto; Sandri, Laura; Rouwet, Dmitri; Caudron, Corentin; Marzocchi, Warner; Suparjan
2016-07-01
Although most volcanic hazard studies focus on magmatic eruptions, hazardous volcanic events can also occur when no migration of magma can be recognized. Examples are tectonic and hydrothermal unrest that may lead to phreatic eruptions. Recent events (e.g., the Ontake eruption in September 2014) have demonstrated that phreatic eruptions are still hard to forecast, despite being potentially very hazardous. For these reasons, it is of paramount importance to identify indicators that define the condition of nonmagmatic unrest, in particular for hydrothermal systems. Often, this type of unrest is driven by the movement of fluids, requiring alternative monitoring setups beyond the classical seismic-geodetic-geochemical architectures. Here we present a new version of the probabilistic BET (Bayesian Event Tree) model, specifically developed to include the forecasting of nonmagmatic unrest and related hazards. The structure of the new event tree differs from previous schemes by adding a specific branch to detail nonmagmatic unrest outcomes. A further goal of this work is to provide a user-friendly, open-access, and straightforward tool to handle the probabilistic forecast and visualize the results as possible support during a volcanic crisis. The new event tree and tool are applied to Kawah Ijen stratovolcano, Indonesia, as an illustrative application. In particular, the tool is set up on the basis of monitoring data for the learning period 2000-2010 and is then blindly applied to the test period 2010-2012, during which significant unrest phases occurred.
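The event-tree arithmetic underlying BET-style forecasting can be sketched with point probabilities; the real BET model treats node probabilities as uncertain (Bayesian) and updates them with monitoring data. All node names and values below are hypothetical:

```python
# Illustrative volcanic event tree: the probability of an outcome is the
# product of conditional probabilities along its branch. The nonmagmatic
# branch added in the new BET scheme appears as the complement of the
# "magmatic given unrest" node.

TREE = {  # hypothetical conditional probabilities at each node
    "unrest": 0.3,              # P(unrest in the forecast window)
    "magmatic|unrest": 0.4,     # complement: non-magmatic unrest
    "eruption|magmatic": 0.2,
    "phreatic|nonmagmatic": 0.1,
}

def path_prob(*keys):
    """Multiply conditional probabilities along a branch of the tree."""
    p = 1.0
    for key in keys:
        p *= TREE[key]
    return p

# P(phreatic) = P(unrest) * P(non-magmatic | unrest) * P(phreatic | non-magmatic)
p_phreatic = (path_prob("unrest")
              * (1.0 - TREE["magmatic|unrest"])
              * TREE["phreatic|nonmagmatic"])
print(round(p_phreatic, 4))  # 0.018
```

In BET each of these node probabilities would be a distribution informed by monitoring parameters rather than a fixed number, so the output is itself a probability distribution.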
[Application of root cause analysis in healthcare].
Hsu, Tsung-Fu
2007-12-01
The main purpose of this study was to explore various aspects of root cause analysis (RCA), including its definition, underlying rationale, main objective, implementation procedures, most common analysis methodology (fault tree analysis, FTA), and advantages and methodologic limitations in regard to healthcare. Several adverse events that occurred at a certain hospital were also analyzed by the author using FTA as part of this study. RCA is a process employed to identify basic and contributing causal factors underlying performance variations associated with adverse events. The underlying rationale of RCA is a systemic approach to improving patient safety that does not assign blame or liability to individuals. The four-step process involved in conducting an RCA includes: RCA preparation, proximate cause identification, root cause identification, and recommendation generation and implementation. FTA is a logical, structured process that can help identify potential causes of system failure before actual failures occur. Some advantages and significant methodologic limitations of RCA were discussed. Finally, we emphasized that errors stem principally from faults attributable to system design, practice guidelines, work conditions, and other human factors, which lead health professionals to commit negligence or make mistakes in healthcare. We must explore the root causes of medical errors to eliminate potential system failure factors. Also, a systemic approach is needed to resolve medical errors and move beyond a current culture centered on assigning fault to individuals. In constructing a truly patient-centered safe healthcare environment, we can encourage clients to accept state-of-the-art healthcare services.
The fundamental theorem of asset pricing under default and collateral in finite discrete time
NASA Astrophysics Data System (ADS)
Alvarez-Samaniego, Borys; Orrillo, Jaime
2006-08-01
We consider a financial market where time and uncertainty are modeled by a finite event-tree. The event-tree has a length of N, a unique initial node at the initial date, and a continuum of branches at each node of the tree. Prices and returns of J assets are modeled, respectively, by an ℝ^(2J) × ℝ^(2J)-valued stochastic process. In this framework we prove a version of the Fundamental Theorem of Asset Pricing which applies to defaultable securities backed by exogenous collateral suffering a contingent linear depreciation.
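For orientation, the classical finite discrete-time statement that this paper generalizes can be recalled as a hedged reference point; the paper's version with default and collateral modifies both sides of the equivalence, so the following is only the standard frictionless form:

```latex
\text{no arbitrage} \;\Longleftrightarrow\; \exists\, \mathbb{Q}\sim\mathbb{P}\ \text{such that}\quad
\frac{S_t}{B_t} \;=\; \mathbb{E}^{\mathbb{Q}}\!\left[\left.\frac{S_{t+1}}{B_{t+1}} \,\right|\, \mathcal{F}_t\right],
\qquad t = 0,\dots,N-1,
```

where $S_t$ denotes asset prices, $B_t$ a numeraire, and $(\mathcal{F}_t)$ the filtration generated by the event-tree. With defaultable, collateral-backed securities, the martingale property must be adjusted to account for recovery through the (possibly depreciated) collateral.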
A spatio-temporal model for probabilistic seismic hazard zonation of Tehran
NASA Astrophysics Data System (ADS)
Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza
2013-08-01
A precondition for all disaster management steps, building damage prediction, and construction code developments is a hazard assessment that shows the exceedance probabilities of different ground motion levels at a site considering different near- and far-field earthquake sources. The seismic sources are usually categorized as time-independent area sources and time-dependent fault sources. While the former incorporates small and medium events, the latter takes into account only large characteristic earthquakes. In this article, a probabilistic approach is proposed to aggregate the effects of time-dependent and time-independent sources on seismic hazard. The methodology is then applied to generate three probabilistic seismic hazard maps of Tehran for 10%, 5%, and 2% exceedance probabilities in 50 years. The results indicate an increase in peak ground acceleration (PGA) values toward the southeastern part of the study area, and the PGA variations are mostly controlled by the shear wave velocities across the city. In addition, the implementation of the methodology takes advantage of GIS capabilities, especially raster-based analyses and representations. During the estimation of the PGA exceedance rates, the emphasis has been placed on incorporating the effects of different attenuation relationships and seismic source models by using a logic tree.
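The logic-tree aggregation mentioned above can be sketched numerically: each branch (a choice of attenuation relationship and source model) yields its own annual exceedance rate for a given PGA level, and branch weights combine them into a single hazard estimate. All rates and weights below are invented for illustration; the Tehran study uses its own branches and values.

```python
import math

def combined_exceedance_rate(branch_rates, weights):
    """Weighted mean of per-branch annual exceedance rates; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(branch_rates, weights))

# Four hypothetical branches: two attenuation relationships, each paired
# with two seismic source models (values are illustrative only).
rates = [0.0021, 0.0015, 0.0030, 0.0024]   # annual rates of exceeding some PGA level
weights = [0.3, 0.2, 0.3, 0.2]             # logic-tree branch weights

rate = combined_exceedance_rate(rates, weights)

# Assuming Poisson occurrence, convert the annual rate to the probability
# of at least one exceedance in a 50-year window.
p50 = 1.0 - math.exp(-rate * 50.0)
```

The 10%, 5%, and 2% maps in the abstract correspond to fixing `p50` and solving for the PGA level instead, but the weighting step is the same.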
Starting Circuit For Erasable Programmable Logic Device
NASA Technical Reports Server (NTRS)
Cole, Steven W.
1990-01-01
Voltage regulator bypassed to supply starting current. Starting or "pullup" circuit supplies large inrush of current required by erasable programmable logic device (EPLD) while being turned on. Operates only during such intervals of high demand for current and has little effect at any other time. Performs needed bypass, acting as current-dependent shunt connecting battery or other source of power more nearly directly to EPLD. Input capacitor of regulator removed when starting circuit installed, reducing probability of damage to transistor in event of short circuit in or across load.
Conditional Inference and Logic for Intelligent Systems: A Theory of Measure-Free Conditioning
1991-08-01
work in this direction was executed. Bruno and Gilio (1985), inspired by DeFinetti's much earlier work, proposed an abbreviated algebra of measure...See also Section 1.5.) Based on DeFinetti's work, but independent of Bruno and Gilio, Darigelli and Scozzafava (1984) mentioned the lack of apparent...1.5 below. 1.5 Logical operations among conditional events Schay (1968), Bruno and Gilio (1985) and Calabrese (1987) contain developments
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
TRACE Model for Simulation of Anticipated Transients Without Scram in a BWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng L. Y.; Baek J.; Cuadra,A.
2013-11-10
A TRACE model has been developed for using the TRACE/PARCS computational package [1, 2] to simulate anticipated transients without scram (ATWS) events in a boiling water reactor (BWR). The model represents a BWR/5 housed in a Mark II containment. The reactor and the balance-of-plant systems are modeled in sufficient detail to enable the evaluation of plant responses and the effectiveness of automatic and operator actions to mitigate this beyond-design-basis accident. The TRACE model implements features that facilitate the simulation of ATWS events initiated by turbine trip and closure of the main steam isolation valves (MSIV). It also incorporates control logic to initiate actions to mitigate the ATWS events, such as water level control, emergency depressurization, and injection of boron via the standby liquid control system (SLCS). Two different approaches have been used to model boron mixing in the lower plenum of the reactor vessel: modulating coolant flow in the lower plenum by a flow valve, and using control logic to modulate the flow.
Effects of biotic and abiotic factors on resistance versus resilience of Douglas fir to drought.
Carnwath, Gunnar; Nelson, Cara
2017-01-01
Significant increases in tree mortality due to drought-induced physiological stress have been documented worldwide. This trend is likely to continue with increased frequency and severity of extreme drought events in the future. Therefore, understanding the factors that influence variability in drought responses among trees will be critical to predicting ecosystem responses to climate change and developing effective management actions. In this study, we used hierarchical mixed-effects models to analyze drought responses of Pseudotsuga menziesii in 20 unmanaged forest stands across a broad range of environmental conditions in northeastern Washington, USA. We aimed to 1) identify the biotic and abiotic attributes most closely associated with the responses of individual trees to drought and 2) quantify the variability in drought responses at different spatial scales. We found that growth rates and competition for resources significantly affected resistance to a severe drought event in 2001: slow-growing trees and trees growing in subordinate canopy positions and/or with more neighbors suffered greater declines in radial growth during the drought event. In contrast, the ability of a tree to return to normal growth when climatic conditions improved (resilience) was unaffected by competition or relative growth rates. Drought responses were significantly influenced by tree age: older trees were more resistant but less resilient than younger trees. Finally, we found differences between resistance and resilience in spatial scale: a significant proportion (approximately 50%) of the variability in drought resistance across the study area was at broad spatial scales (i.e. 
among different forest types), most likely due to differences in the total amount of precipitation received at different elevations; in contrast, variation in resilience was overwhelmingly (82%) at the level of individual trees within stands and there was no difference in drought resilience among forest types. Our results suggest that for Pseudotsuga menziesii resistance and resilience to drought are driven by different factors and vary at different spatial scales.
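Resistance and resilience as used above are commonly quantified from ring-width series as growth ratios (in the style of the widely used Lloret-type indices): resistance compares growth during the drought with pre-drought growth, and resilience compares post-drought growth with pre-drought growth. The sketch below uses those standard ratios with invented ring widths; the study's exact formulation may differ.

```python
def mean(xs):
    return sum(xs) / len(xs)

# Invented ring widths (mm) around a drought year such as 2001.
pre = [1.2, 1.1, 1.3]     # years before the drought
during = [0.6]            # the drought year itself
post = [1.0, 1.1, 1.2]    # recovery years after the drought

# Resistance: how much growth is retained during the drought.
resistance = mean(during) / mean(pre)

# Resilience: how fully growth returns to pre-drought levels afterward.
resilience = mean(post) / mean(pre)
```

A tree with `resistance` near 1 barely reduced growth during the drought; `resilience` near 1 means growth recovered to pre-drought levels, regardless of how deep the dip was.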
A synthesis of radial growth patterns preceding tree mortality
Cailleret, Maxime; Jansen, Steven; Robert, Elisabeth M.R.; Desoto, Lucia; Aakala, Tuomas; Antos, Joseph A.; Beikircher, Barbara; Bigler, Christof; Bugmann, Harald; Caccianiga, Marco; Cada, Vojtech; Camarero, Jesus J.; Cherubini, Paolo; Cochard, Herve; Coyea, Marie R.; Cufar, Katarina; Das, Adrian J.; Davi, Hendrik; Delzon, Sylvain; Dorman, Michael; Gea-Izquierdo, Guillermo; Gillner, Sten; Haavik, Laurel J.; Hartmann, Henrik; Heres, Ana-Maria; Hultine, Kevin R.; Janda, Pavel; Kane, Jeffrey M.; Kharuk, Vyacheslav I.; Kitzberger, Thomas; Klein, Tamir; Kramer, Koen; Lens, Frederic; Levanic, Tom; Calderon, Juan C. Linares; Lloret, Francisco; Lobo-Do-Vale, Raquel; Lombardi, Fabio; Lopez Rodriguez, Rosana; Makinen, Harri; Mayr, Stefan; Meszaros, IIona; Metsaranta, Juha M.; Minunno, Francesco; Oberhuber, Walter; Papadopoulos, Andreas; Peltoniemi, Mikko; Petritan, Any M.; Rohner, Brigitte; Sanguesa-Barreda, Gabriel; Sarris, Dimitrios; Smith, Jeremy M.; Stan, Amanda B.; Sterck, Frank; Stojanovic, Dejan B.; Suarez, Maria L.; Svoboda, Miroslav; Tognetti, Roberto; Torres-Ruiz, Jose M.; Trotsiuk, Volodymyr; Villalba, Ricardo; Vodde, Floor; Westwood, Alana R.; Wyckoff, Peter H.; Zafirov, Nikolay; Martinez-Vilalta, Jordi
2017-01-01
Tree mortality is a key factor influencing forest functions and dynamics, but our understanding of the mechanisms leading to mortality and the associated changes in tree growth rates are still limited. We compiled a new pan-continental tree-ring width database from sites where both dead and living trees were sampled (2970 dead and 4224 living trees from 190 sites, including 36 species), and compared early and recent growth rates between trees that died and those that survived a given mortality event. We observed a decrease in radial growth before death in ca. 84% of the mortality events. The extent and duration of these reductions were highly variable (1–100 years in 96% of events) due to the complex interactions among study species and the source(s) of mortality. Strong and long-lasting declines were found for gymnosperms, shade- and drought-tolerant species, and trees that died from competition. Angiosperms and trees that died due to biotic attacks (especially bark-beetles) typically showed relatively small and short-term growth reductions. Our analysis did not highlight any universal trade-off between early growth and tree longevity within a species, although this result may also reflect high variability in sampling design among sites. The intersite and interspecific variability in growth patterns before mortality provides valuable information on the nature of the mortality process, which is consistent with our understanding of the physiological mechanisms leading to mortality. Abrupt changes in growth immediately before death can be associated with generalized hydraulic failure and/or bark-beetle attack, while long-term decrease in growth may be associated with a gradual decline in hydraulic performance coupled with depletion in carbon reserves. 
Our results imply that growth-based mortality algorithms may be a powerful tool for predicting gymnosperm mortality induced by chronic stress, but not necessarily so for angiosperms and in cases of intense drought or bark-beetle outbreaks.
Windthrown trees on the Kings River Ranger District, Sierra National Forest: meteorological aspects
Michael A. Fosberg
1986-01-01
Blowdown in shelterwood, sanitation cuts, and other partial cuts on the Kings River Ranger District, Sierra National Forest, is due to Mono winds. Both winter storm and Mono winds were considered as causes of winter blowdown. All evidence, e.g., direction of tree-fall and occurrence of high wind events, points to Mono wind events as the cause of blowdown. Only 12...
Object-Oriented Algorithm For Evaluation Of Fault Trees
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1992-01-01
Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
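The difficulty posed by repeated events, which the abstract's object-oriented algorithm is designed to handle efficiently, can be seen in a toy evaluation. The sketch below is illustrative and is not the authors' algorithm: naive gate-by-gate probability propagation treats the two references to the same basic event as independent and overestimates the top-event probability, whereas representing each basic event as a single shared object at least makes the repetition explicit.

```python
class BasicEvent:
    """A basic failure event; defined once even if referenced by many gates."""
    def __init__(self, p):
        self.p = p
    def probability(self):
        return self.p

class AndGate:
    """AND gate: product of child probabilities (assumes independence)."""
    def __init__(self, *children):
        self.children = children
    def probability(self):
        prob = 1.0
        for c in self.children:
            prob *= c.probability()
        return prob

class OrGate:
    """OR gate: complement of product of complements (assumes independence)."""
    def __init__(self, *children):
        self.children = children
    def probability(self):
        prob = 1.0
        for c in self.children:
            prob *= 1.0 - c.probability()
        return 1.0 - prob

pump = BasicEvent(0.01)    # shared basic event: appears under two gates
valve = BasicEvent(0.02)
top = OrGate(AndGate(pump, valve), pump)

naive = top.probability()  # gate-by-gate result: treats the two 'pump' uses as independent
exact = pump.probability() # by absorption, the top event occurs iff the pump fails
```

Here `naive > exact`, which is exactly why fault-tree codes need special handling (e.g. cut-set logic) for trees with repeated events rather than blind bottom-up multiplication.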
Carrer, Marco; Brunetti, Michele; Castagneri, Daniele
2016-01-01
Extreme climate events are of key importance for forest ecosystems. However, the inherent infrequency, stochasticity, and multiplicity of extreme climate events, together with the array of biological responses, challenge investigations. To cope with the long life cycle of trees and the paucity of the extreme events themselves, our inferences should be based on long-term observations. In this context, tree rings and the related xylem anatomical traits represent promising sources of information, due to the wide time perspective and quality of the information they can provide. Here we test, on two high-elevation conifers (Larix decidua and Picea abies sampled at 2100 m a.s.l. in the Eastern Alps), the associations between temperature extremes during the growing season and xylem anatomical traits, specifically the number of cells per ring (CN), cell wall thickness (CWT), and cell diameter (CD). To better track the effect of extreme events over the growing season, tree rings were partitioned into 10 sectors. Climate variability was reconstructed, for 1800–2011 at monthly resolution and for 1926–2011 at daily resolution, by exploiting the excellent availability of very long and high-quality instrumental records for the surrounding area, and taking into account the relationship between meteorological variables and site topographical settings. Summer temperature influenced anatomical traits of both species, and tree-ring anatomical profiles proved to be associated with temperature extremes. Most of the extreme values in anatomical traits occurred under warm (positive extremes) or cold (negative) conditions. However, 0–34% of occurrences did not match a temperature extreme event. Specifically, CWT and CN extremes were more clearly associated with climate than CD, which showed a bias toward tracking cold extremes. 
Dendroanatomical analysis, coupled with high-quality daily-resolved climate records, seems a promising approach to studying the effects of extreme events on trees, but further investigations are needed to improve our comprehension of the critical role of such elusive events in forest ecosystems. PMID:27242880
Susan J. Prichard; Maureen C. Kennedy
2012-01-01
Fuel reduction treatments are increasingly used to mitigate future wildfire severity in dry forests, but few opportunities exist to assess their effectiveness. We evaluated the influence of fuel treatment, tree size and species on tree mortality following a large wildfire event in recent thin-only, thin and prescribed burn (thin-Rx) units. Of the trees that died within...
Michael J. Clifford; Patrick D. Royer; Neil S. Cobb; David D. Breshears; Paulette L. Ford
2013-01-01
Recent regional tree die-off events appear to have been triggered by a combination of drought and heat, referred to as 'global-change-type drought'. To complement experiments focused on resolving mechanisms of drought-induced tree mortality, an evaluation of how patterns of tree die-off relate to highly spatially variable precipitation is needed....
Describing and recognizing patterns of events in smart environments with description logic.
Scalmato, Antonello; Sgorbissa, Antonio; Zaccaria, Renato
2013-12-01
This paper describes a system for context awareness in smart environments, which is based on an ontology expressed in description logic and implemented in OWL 2 EL, which is a subset of the Web Ontology Language that allows for reasoning in polynomial time. The approach is different from all other works in the literature since the proposed system requires only the basic reasoning mechanisms of description logic, i.e., subsumption and instance checking, without any additional external reasoning engine. Experiments performed with data collected in three different scenarios are described, i.e., the CASAS Project at Washington State University, the assisted living facility Villa Basilea in Genoa, and the Merry Porter mobile robot at the Polyclinic of Modena.
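Subsumption and instance checking, the two description-logic services the abstract relies on, can be illustrated over a toy class hierarchy. This is plain Python rather than OWL 2 EL, and the class names are invented, not taken from the authors' ontology; it only shows what the two reasoning questions mean.

```python
# Toy terminology: each class has at most one direct superclass.
subclass_of = {
    "CookingEvent": "KitchenActivity",
    "KitchenActivity": "HomeActivity",
    "HomeActivity": "Activity",
}

def subsumes(general, specific):
    """Subsumption: is 'specific' (transitively) a subclass of 'general'?"""
    while specific != general:
        if specific not in subclass_of:
            return False
        specific = subclass_of[specific]
    return True

# Toy assertions: each observed individual is typed with its most specific class.
instances = {"event42": "CookingEvent"}

def instance_of(individual, cls):
    """Instance checking: does the individual belong to the class?"""
    return subsumes(cls, instances[individual])

ok = instance_of("event42", "HomeActivity")
```

In an OWL 2 EL ontology the hierarchy is richer (conjunctions, existential restrictions), but both services remain decidable in polynomial time, which is the property the abstract exploits.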
Spin-Polarization Control in a Two-Dimensional Semiconductor
NASA Astrophysics Data System (ADS)
Appelbaum, Ian; Li, Pengke
2016-05-01
Long carrier spin lifetimes are a double-edged sword for the prospect of constructing "spintronic" logic devices: Preservation of the logic variable within the transport channel or interconnect is essential to successful completion of the logic operation, but any spins remaining past this event will pollute the environment for subsequent clock cycles. Electric fields can be used to manipulate these spins on a fast time scale by careful interplay of spin-orbit effects, but efficient controlled depolarization can only be completely achieved with amenable materials properties. Taking III-VI monochalcogenide monolayers as an example 2D semiconductor, we use symmetry analysis, perturbation theory, and ensemble calculation to show how this longstanding problem can be solved by suitable manipulation of conduction electrons.
Laser Scanner Tests For Single-Event Upsets
NASA Technical Reports Server (NTRS)
Kim, Quiesup; Soli, George A.; Schwartz, Harvey R.
1992-01-01
Microelectronic advanced laser scanner (MEALS) is opto/electro/mechanical apparatus for nondestructive testing of integrated memory circuits, logic circuits, and other microelectronic devices. Multipurpose diagnostic system used to determine ultrafast time response, leakage, latchup, and electrical overstress. Used to simulate some of effects of heavy ions accelerated to high energies to determine susceptibility of digital device to single-event upsets.
The Conjunction Fallacy and the Many Meanings of "And"
ERIC Educational Resources Information Center
Hertwig, Ralph; Benz, Bjorn; Krauss, Stefan
2008-01-01
According to the conjunction rule, the probability of A "and" B cannot exceed the probability of either single event. This rule reads "and" in terms of the logical operator ∧, interpreting A and B as an intersection of two events. As linguists have long argued, in natural language "and" can convey a wide range of relationships between…
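The conjunction rule itself is easy to verify mechanically; the sketch below checks it for two illustrative events over a fair die roll (the events are invented for the example):

```python
from fractions import Fraction

omega = set(range(1, 7))            # sample space: fair six-sided die
A = {2, 4, 6}                       # event "even"
B = {4, 5, 6}                       # event "greater than 3"

def prob(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

p_a, p_b, p_ab = prob(A), prob(B), prob(A & B)

# Conjunction rule: P(A and B) <= min(P(A), P(B)).
assert p_ab <= min(p_a, p_b)
```

The conjunction *fallacy* is judging P(A ∧ B) above min(P(A), P(B)); the abstract's point is that such judgments may reflect non-intersective readings of "and" rather than a probabilistic error.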
Interpretation of Verb Phrase Telicity: Sensitivity to Verb Type and Determiner Type
ERIC Educational Resources Information Center
Ogiela, Diane A.; Schmitt, Cristina; Casby, Michael W.
2014-01-01
Purpose: The authors examine how adults use linguistic information from verbs, direct objects, and particles to interpret an event description as encoding a logical endpoint to the event described (in which case, it is telic) or not (in which case, it is atelic). Current models of aspectual composition predict that quantity-sensitive verbs…
When the Future Feels Worse than the Past: A Temporal Inconsistency in Moral Judgment
ERIC Educational Resources Information Center
Caruso, Eugene M.
2010-01-01
Logically, an unethical behavior performed yesterday should also be unethical if performed tomorrow. However, the present studies suggest that the timing of a transgression has a systematic effect on people's beliefs about its moral acceptability. Because people's emotional reactions tend to be more extreme for future events than for past events,…
NASA Astrophysics Data System (ADS)
Cominelli, Alessandro; Acconcia, Giulia; Ghioni, Massimo; Rech, Ivan
2018-03-01
Time-correlated single-photon counting (TCSPC) is a powerful optical technique, which permits recording fast luminous signals with picosecond precision. Unfortunately, given its repetitive nature, TCSPC is recognized as a relatively slow technique, especially when a large time-resolved image has to be recorded. In recent years, there has been a fast trend toward the development of TCSPC imagers. Unfortunately, present systems still suffer from a trade-off between number of channels and performance. Even worse, the overall measurement speed is still limited well below the saturation of the transfer bandwidth toward the external processor. We present a routing algorithm that enables a smart connection between a 32×32 detector array and five shared high-performance converters able to provide an overall conversion rate up to 10 Gbit/s. The proposed solution exploits a fully digital logic circuit distributed in a tree structure to limit the number and length of interconnections, which is a major issue in densely integrated circuits. The behavior of the logic has been validated by means of a field-programmable gate array, while a fully integrated prototype has been designed in 180-nm technology and analyzed by means of postlayout simulations.
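The resource-sharing idea behind the routing logic (many detector channels, few high-performance converters) can be mimicked in a few lines. This toy model is not the paper's tree-distributed circuit: it simply assigns firing pixels to converters per frame and drops events that exceed the converter budget, which is the trade-off such routers are built to manage.

```python
N_CONVERTERS = 5   # shared converters, as in the 32x32-array / 5-converter scheme

def route(firing_pixels, n_converters=N_CONVERTERS):
    """Assign up to n_converters firing pixels to converter indices.

    Returns (assignment, dropped): a pixel->converter map for the pixels
    that win a converter this frame, and the sorted list of pixels whose
    events are lost because all converters are busy.
    """
    ordered = sorted(firing_pixels)            # deterministic, toy priority rule
    assignment = {pixel: conv for conv, pixel in enumerate(ordered[:n_converters])}
    dropped = ordered[n_converters:]
    return assignment, dropped

# Seven pixels fire in the same frame; only five converters are available.
assignment, dropped = route({3, 7, 17, 44, 256, 512, 901})
```

A real routing circuit arbitrates in hardware and spreads the arbitration through a tree to keep wiring short; the point here is only the shared-converter bookkeeping.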
Gene-Tree Reconciliation with MUL-Trees to Resolve Polyploidy Events.
Gregg, W C Thomas; Ather, S Hussain; Hahn, Matthew W
2017-11-01
Polyploidy can have a huge impact on the evolution of species, and it is a common occurrence, especially in plants. The two types of polyploids, autopolyploids and allopolyploids, differ in the level of divergence between the genes that are brought together in the new polyploid lineage. Because allopolyploids are formed via hybridization, the homoeologous copies of genes within them are at least as divergent as orthologs in the parental species that came together to form them. This means that common methods for estimating the parental lineages of allopolyploidy events are not accurate, and can lead to incorrect inferences about the number of gene duplications and losses. Here, we have adapted an algorithm for topology-based gene-tree reconciliation to work with multi-labeled trees (MUL-trees). By definition, MUL-trees have some tips with identical labels, which makes them a natural representation of the genomes of polyploids. Using this new reconciliation algorithm we can: accurately place allopolyploidy events on a phylogeny, identify the parental lineages that hybridized to form allopolyploids, distinguish between allo-, auto-, and (in most cases) no polyploidy, and correctly count the number of duplications and losses in a set of gene trees. We validate our method using gene trees simulated with and without polyploidy, and revisit the history of polyploidy in data from the clades including both baker's yeast and bread wheat. Our re-analysis of the yeast data confirms the allopolyploid origin and parental lineages previously identified for this group. The method presented here should find wide use in the growing number of genomes from species with a history of polyploidy. [Polyploidy; reconciliation; whole-genome duplication.]
Tree mortality in the eastern Mediterranean, causes and implications under climatic change
NASA Astrophysics Data System (ADS)
Sarris, Dimitrios; Iacovou, Valentina; Hoch, Guenter; Vennetier, Michel; Siegwolf, Rolf; Christodoulakis, Dimitrios; Koerner, Christian
2015-04-01
The eastern Mediterranean has experienced repeated incidents of forest mortality related to drought in recent decades. Such events may become more frequent in the future as drought conditions are projected to further intensify due to global warming. We have been investigating the causes behind such forest mortality events in Pinus halepensis (the most drought-tolerant pine in the Mediterranean). We cored tree stems and sampled various tissue types from dry habitats close to sea level, and explored growth responses, stable isotope signals, and non-structural carbohydrate (NSC) concentrations. Under intense drought that coincided with pine desiccation events in natural populations, our results indicate a significant reduction in tree growth, the most significant in more than a century, despite the increase in atmospheric CO2 concentrations in recent decades. This has been accompanied by a lengthening of the integration periods of rainfall needed for pine growth, reaching 5-6 years before and including the year of mortality occurrence. Oxygen stable isotopes indicate that these signals were associated with a shift in tree water utilization from deeper moisture pools related to past rainfall events. Furthermore, where the driest conditions occur, pine carbon reserves were found to increase in stem tissue, indicating that mortality in these pines cannot be explained by carbon starvation. Our findings suggest that for pine populations that are already water limited (i) a further atmospheric CO2 increase will not compensate for the reduction in growth because of a drier climate, (ii) hydraulic failure appears to be the most likely cause of pine desiccation, as no shortage occurs in tree carbon reserves, and (iii) a further increase in mortality events may cause these systems to become carbon sources.
A study of fuzzy logic ensemble system performance on face recognition problem
NASA Astrophysics Data System (ADS)
Polyakova, A.; Lipinskiy, L.
2017-02-01
Some problems are difficult to solve using a single intelligent information technology (IIT). An ensemble of various data mining (DM) techniques is a set of models, each of which can solve the problem by itself, but whose combination increases the efficiency of the system as a whole. Using IIT ensembles can improve the reliability and efficiency of the final decision, since the approach emphasizes the diversity of its components. A new method of designing intelligent information technology ensembles is considered in this paper. It is based on fuzzy logic and is designed to solve classification and regression problems. The ensemble consists of several data mining algorithms: an artificial neural network, a support vector machine, and decision trees. These algorithms and their ensemble have been tested on face recognition problems. Principal components analysis (PCA) is used for feature selection.
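A minimal sketch of the fuzzy-weighted aggregation such an ensemble might perform is shown below. The member scores, class labels, and weights are all invented, and the paper's actual combination rule may differ; the sketch only shows how graded per-class confidences from diverse members can be fused into one decision.

```python
def fuzzy_vote(member_scores, weights):
    """Combine per-member class-confidence dicts by weighted averaging."""
    classes = member_scores[0].keys()
    total = sum(weights)
    return {c: sum(w * s[c] for s, w in zip(member_scores, weights)) / total
            for c in classes}

# Hypothetical confidences from three base learners (stand-ins for a
# neural network, an SVM, and a decision tree) on a two-class face task.
scores = [
    {"face_A": 0.7, "face_B": 0.3},   # "neural network"
    {"face_A": 0.6, "face_B": 0.4},   # "SVM"
    {"face_A": 0.2, "face_B": 0.8},   # "decision tree"
]

combined = fuzzy_vote(scores, weights=[0.5, 0.3, 0.2])
decision = max(combined, key=combined.get)
```

Because the members disagree, the fused confidence (0.57 vs 0.43 here) is softer than any single member's vote, which is where the reliability gain of diverse ensembles comes from.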
Methodology for the systems engineering process. Volume 2: Technical parameters
NASA Technical Reports Server (NTRS)
Nelson, J. H.
1972-01-01
A scheme based on starting the logic networks from the development and mission factors that are of primary concern in an aerospace system is described. This approach required identifying the primary states (design, design verification, premission, mission, postmission), identifying the attributes within each state (performance capability, survival, evaluation, operation, etc), and then developing the generic relationships of variables for each branch. To illustrate this concept, a system was used that involved a launch vehicle and payload for an earth orbit mission. Examination showed that this example was sufficient to illustrate the concept. A more complicated mission would follow the same basic approach, but would have more extensive sets of generic trees and more correlation points between branches. It has been shown that in each system state (production, test, and use), a logic could be developed to order and classify the parameters involved in the translation from general requirements to specific requirements for system elements.
Modeling uncertainty in computerized guidelines using fuzzy logic.
Jaulent, M. C.; Joyaux, C.; Colombet, I.; Gillois, P.; Degoulet, P.; Chatellier, G.
2001-01-01
Computerized clinical practice guidelines (CPGs) improve quality of care by assisting physicians in their decision making. However, problems emerge when patients with close characteristics are given contradictory recommendations. In this article, we propose using fuzzy logic to model the uncertainty introduced by thresholds in CPGs. A fuzzy classification procedure was developed that provides, for each message of the CPG, a strength of recommendation that rates the appropriateness of the recommendation for the patient under consideration. This work was done in the context of a CPG for the diagnosis and management of hypertension, published in 1997 by the French agency ANAES. A population of 82 patients with mild to moderate hypertension was selected, and the results of the classification system were compared to those given by a classical decision tree. Observed agreement is 86.6%, and the variability of recommendations for patients with close characteristics is reduced. PMID:11825196
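A minimal sketch of the thresholding idea, with hypothetical membership-function parameters (the ANAES guideline's actual thresholds and fuzzy sets are not reproduced here):

```python
# A fuzzy membership function softens a crisp threshold: patients near the
# cut-off get a graded strength of recommendation instead of flipping
# between contradictory messages. All numbers are illustrative.

def ramp_membership(x, low, high):
    """Degree in [0, 1]: 0 below `low`, 1 above `high`, linear in between."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def recommendation_strength(systolic_bp):
    # Hypothetical fuzzy band around a crisp 140 mmHg threshold.
    return ramp_membership(systolic_bp, low=130.0, high=150.0)

# Two patients with close characteristics no longer receive
# contradictory all-or-nothing recommendations:
print(recommendation_strength(139))  # 0.45
print(recommendation_strength(141))  # 0.55
```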
Distinguishing between evidence and its explanations in the steering of atomic clocks
NASA Astrophysics Data System (ADS)
Myers, John M.; Hadi Madjid, F.
2014-11-01
Quantum theory reflects within itself a separation of evidence from explanations. This separation leads to a known proof that: (1) no wave function can be determined uniquely by evidence, and (2) any chosen wave function requires a guess reaching beyond logic to things unforeseeable. Chosen wave functions are encoded into computer-mediated feedback essential to atomic clocks, including clocks that step computers through their phases of computation and clocks in space vehicles that supply evidence of signal propagation explained by hypotheses of spacetimes with metric tensor fields. The propagation of logical symbols from one computer to another requires a shared rhythm, like a bucket brigade. Here we show how hypothesized metric tensors, dependent on guesswork, take part in the logical synchronization by which clocks are steered in rate and position toward aiming points that satisfy phase constraints, thereby linking the physics of signal propagation with the sharing of logical symbols among computers. Recognizing the dependence of the phasing of symbol arrivals on guesses about signal propagation transports logical synchronization from the engineering of digital communications to a discipline essential to physics. Within this discipline we begin to explore questions invisible under any concept of time that fails to acknowledge unforeseeable events. In particular, variation of spacetime curvature is shown to limit the bit rate of logical communication.
Growth and reproduction respond differently to climate in three Neotropical tree species.
Alfaro-Sánchez, Raquel; Muller-Landau, Helene C; Wright, S Joseph; Camarero, J Julio
2017-06-01
The response of tropical forests to anthropogenic climate change is critically important to future global carbon budgets, yet remains highly uncertain. Here, we investigate how precipitation, temperature, solar radiation and dry- and wet-season lengths are related to annual tree growth, flower production, and fruit production in three moist tropical forest tree species using long-term datasets from tree rings and litter traps in central Panama. We also evaluated how growth, flower, and fruit production were interrelated. We found that growth was positively correlated with wet-season precipitation in all three species: Jacaranda copaia (r = 0.63), Tetragastris panamensis (r = 0.39) and Trichilia tuberculata (r = 0.39). Flowering and fruiting in Jacaranda were negatively related to current-year dry-season rainfall and positively related to prior-year dry-season rainfall. Flowering in Tetragastris was negatively related to current-year annual mean temperature while Trichilia showed no significant relationships of reproduction with climate. Growth was significantly related to reproduction only in Tetragastris, where it was positively related to previous year fruiting. Our results suggest that tree growth in moist tropical forest tree species is generally reduced by drought events such as those associated with strong El Niño events. In contrast, interannual variation in reproduction is not generally associated with growth and has distinct and species-specific climate responses, with positive effects of El Niño events in some species. Understanding these contrasting climate effects on tree growth and reproduction is critical to predicting changes in tropical forest dynamics and species composition under climate change.
NASA Astrophysics Data System (ADS)
Priest, G. R.; Goldfinger, C.; Wang, K.; Witter, R. C.; Zhang, Y.; Baptista, A.
2008-12-01
To update the tsunami hazard assessment method for Oregon, we (1) evaluate geologically reasonable variability of the earthquake rupture process on the Cascadia megathrust, (2) compare those scenarios to geological and geophysical evidence for plate locking, (3) specify 25 deterministic earthquake sources, and (4) use the resulting vertical coseismic deformations as initial conditions for simulation of Cascadia tsunami inundation at Cannon Beach, Oregon. Because of the Cannon Beach focus, the north-south extent of source scenarios is limited to Neah Bay, Washington to Florence, Oregon. We use the marine paleoseismic record to establish recurrence bins from the 10,000 year event record and select representative coseismic slips from these data. Assumed slips on the megathrust are 8.4 m (290 yrs of convergence), 15.2 m (525 years of convergence), 21.6 m (748 years of convergence), and 37.5 m (1298 years of convergence) which, if the sources were extended to the entire Cascadia margin, give Mw varying from approximately 8.3 to 9.3. Additional parameters explored by these scenarios characterize ruptures with a buried megathrust versus splay faulting, local versus regional slip patches, and seaward skewed versus symmetrical slip distribution. By assigning variable weights to the 25 source scenarios using a logic tree approach, we derived percentile inundation lines that express the confidence level (percentage) that a Cascadia tsunami will NOT exceed the line. Lines of 50, 70, 90, and 99 percent confidence correspond to maximum runup of 8.9, 10.5, 13.2, and 28.4 m (NAVD88). The tsunami source with highest logic tree weight (preferred scenario) involved rupture of a splay fault with 15.2 m slip that produced tsunami inundation near the 70 percent confidence line. 
Minimum inundation consistent with the inland extent of three Cascadia tsunami sand layers deposited east of Cannon Beach within the last 1000 years suggests a minimum of 15.2 m slip on buried megathrust ruptures. The largest tsunami runup at the 99 percent isoline was from 37.5 m slip partitioned to a splay fault. This type of extreme event is considered to be very rare, perhaps once in 10,000 years based on offshore paleoseismic evidence, but it can produce waves rivaling the 2004 Indian Ocean tsunami. Cascadia coseismic deformation most similar to that of the Indian Ocean earthquake produced generally smaller tsunamis than those in the Indian Ocean, due mostly to the 1 km shallower water depth on the Cascadia margin. Inundation from distant tsunami sources was assessed by simulation of only two Mw 9.2 earthquakes in the Gulf of Alaska, a hypothetical worst case developed by the Tsunami Pilot Study Working Group (2006) and a historical worst case, the 1964 Prince William Sound Earthquake; maximum runups were, respectively, 12.4 m and 7.5 m.
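The logic-tree weighting behind the percentile lines can be illustrated with toy numbers (the scenario weights and runups below are assumptions, not the study's values):

```python
# Logic-tree idea in miniature: each source scenario carries a weight and a
# maximum runup, and the confidence that a tsunami will NOT exceed a given
# line is the cumulative weight of scenarios at or below it.

scenarios = [  # (logic-tree weight, max runup in m) -- illustrative only
    (0.10, 5.0), (0.40, 8.9), (0.20, 10.5), (0.20, 13.2), (0.10, 28.4),
]

def confidence_not_exceeded(runup_m):
    """Cumulative weight of scenarios whose runup stays at or below runup_m."""
    return sum(w for w, r in scenarios if r <= runup_m)

assert abs(sum(w for w, _ in scenarios) - 1.0) < 1e-9  # weights normalized
print(confidence_not_exceeded(8.9))   # 0.5
print(confidence_not_exceeded(13.2))  # approx. 0.9
```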
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. 
With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
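A minimal event-tree calculation in the spirit of the ETA described above; the branch probabilities and the risk-classification thresholds are hypothetical, not Arenal's:

```python
# Event-tree basics: the probability of each end state is the product of the
# conditional branch probabilities along its path, and the raw number is
# then mapped to a linguistic risk class.

def path_probability(branches):
    """Multiply conditional branch probabilities along one path."""
    p = 1.0
    for b in branches:
        p *= b
    return p

def risk_class(p):
    """Map a raw probability to a linguistic class (hypothetical scale)."""
    if p >= 1e-2:
        return "VERY HIGH"
    if p >= 1e-4:
        return "HIGH"
    if p >= 1e-6:
        return "MODERATE"
    return "LOW"

# P(collapse) * P(ATPF | collapse) * P(flow reaches resident's zone | ATPF)
p_resident = path_probability([0.1, 0.5, 0.02])
print(risk_class(p_resident))  # HIGH
```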
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitehead, Donnie Wayne; Varnado, G. Bruce
2008-09-01
U.S. Nuclear Regulatory Commission nuclear power plant licensees and new reactor applicants are required to provide protection of their plants against radiological sabotage, including the placement of vital equipment in vital areas. This document describes a systematic process for the identification of the minimum set of areas that must be designated as vital areas in order to ensure that all radiological sabotage scenarios are prevented. Vital area identification involves the use of logic models to systematically identify all of the malicious acts or combinations of malicious acts that could lead to radiological sabotage. The models available in the plant probabilistic risk assessment and other safety analyses provide a great deal of the information and basic model structure needed for the sabotage logic model. Once the sabotage logic model is developed, the events (or malicious acts) in the model are replaced with the areas in which the events can be accomplished. This sabotage area logic model is then analyzed to identify the target sets (combinations of areas the adversary must visit to cause radiological sabotage) and the candidate vital area sets (combinations of areas that must be protected against adversary access to prevent radiological sabotage). Any one of the candidate vital area sets can be selected for protection. Appropriate selection criteria will allow the licensee or new reactor applicant to minimize the impacts of vital area protection measures on plant safety, cost, operations, or other factors of concern.
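The target-set identification step can be sketched as a small cut-set computation over a hypothetical sabotage logic model (the gates, areas, and model structure below are invented for illustration):

```python
# Sketch: take a sabotage logic model (AND/OR gates over basic malicious
# acts), substitute the area where each act can be carried out, and reduce
# to minimal "target sets" of areas.
from itertools import product

def cut_sets(node, tree):
    """Return the cut sets of a gate/event as a list of frozensets."""
    if node not in tree:                     # basic event -> its area
        return [frozenset([node])]
    op, children = tree[node]
    child_sets = [cut_sets(c, tree) for c in children]
    if op == "OR":                           # any child's cut set suffices
        return [s for sets in child_sets for s in sets]
    # AND: need one cut set from every child, unioned together
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Drop any cut set that strictly contains another."""
    unique = set(sets)
    return [s for s in unique if not any(t < s for t in unique)]

# Hypothetical model: sabotage = damage core cooling OR (disable power
# AND disable backup). Leaves are already mapped to plant areas.
tree = {
    "sabotage": ("OR", ["cooling", "power_loss"]),
    "cooling": ("OR", ["AreaA", "AreaB"]),
    "power_loss": ("AND", ["AreaC", "AreaD"]),
}
targets = minimal(cut_sets("sabotage", tree))
print(sorted(sorted(t) for t in targets))
# [['AreaA'], ['AreaB'], ['AreaC', 'AreaD']]
```

Each target set is a combination of areas the adversary must visit; protecting at least one area from every target set yields a candidate vital area set.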
NASA Astrophysics Data System (ADS)
Peng, Hao
2015-10-01
A fundamental challenge for PET block detector designs is to deploy finer crystal elements while limiting the number of readout channels. The standard Anger-logic scheme including light sharing (an 8 × 8 crystal array coupled to a 2 × 2 photodetector array with an optical diffuser, multiplexing ratio: 16:1) has been widely used to address such a challenge. Our work proposes a generalized model to study the impacts of two critical parameters on spatial resolution performance of a PET block detector: multiple interaction events and signal-to-noise ratio (SNR). The study consists of the following three parts: (1) studying light output profile and multiple interactions of 511 keV photons within crystal arrays of different crystal widths (from 4 mm down to 1 mm, constant height: 20 mm); (2) applying the Anger-logic positioning algorithm to investigate positioning/decoding uncertainties (i.e., "block effect") in terms of peak-to-valley ratio (PVR), with light sharing, multiple interactions and photodetector SNR taken into account; and (3) studying the dependency of spatial resolution on SNR in the context of modulation transfer function (MTF). The proposed model can be used to guide the development and evaluation of a standard Anger-logic based PET block detector including: (1) selecting/optimizing the configuration of crystal elements for a given photodetector SNR; and (2) predicting to what extent additional electronic multiplexing may be implemented to further reduce the number of readout channels.
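The Anger-logic positioning step can be sketched as a centroid calculation over the four photodetector signals; the signal values are illustrative, and the model's noise and multiple-interaction effects are omitted:

```python
# Anger logic for a 2 x 2 photodetector array: the interaction position is
# estimated from how the scintillation light is shared among the four
# detector signals (a, b, c, d = upper-left, upper-right, lower-left,
# lower-right). Signal values are illustrative.

def anger_position(a, b, c, d):
    total = a + b + c + d                    # energy signal
    x = ((b + d) - (a + c)) / total          # right minus left, normalized
    y = ((a + b) - (c + d)) / total          # top minus bottom, normalized
    return x, y, total

# More light in the upper detectors -> event decoded above center.
x, y, e = anger_position(30.0, 30.0, 20.0, 20.0)
print(x, y)  # 0.0 0.2
```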
2014-01-01
This study documents tree mortality in Big Bend National Park in Texas in response to the most acute one-year drought on record, which occurred following a five-day winter freeze. I estimated changes in forest stand structure and species composition due to freezing and drought in the Chisos Mountains of Big Bend National Park using permanent monitoring plot data. The drought killed over half (63%) of the sampled trees over the entire elevation gradient. Significant mortality occurred in trees up to 20 cm diameter (P < 0.05). Pinus cembroides Zucc. experienced the highest seedling and tree mortality (P < 0.0001) (55% of piñon pines died), and over five times as many standing dead pines were observed in 2012 than in 2009. Juniperus deppeana von Steudel and Quercus emoryi Liebmann also experienced significant declines in tree density (P < 0.02) (30.9% and 20.7%, respectively). Subsequent droughts under climate change will likely cause even greater damage to trees that survived this record drought, especially if such events follow freezes. The results from this study highlight the vulnerability of trees in the Southwest to climatic change and that future shifts in forest structure can have large-scale community consequences. PMID:24949231
Klepiszewski, K; Schmitt, T G
2002-01-01
While conventional rule-based, real-time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, though successfully. The intention of this study is to compare a conventional rule-based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by optimizing the utilized storage capacities of four combined sewer overflow tanks. The control systems adjust the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event with heterogeneous hydraulic load conditions and local discharge. Finally, the efficiencies of the two control systems are compared for two further storm events. The results indicate that the conventional rule-based control and the fuzzy control reach the objective of the control strategy equally well. Despite the higher expense of designing the fuzzy control system, its use provides no advantages in this case.
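The difference between the two control styles can be sketched with a single hypothetical rule pair (the study's actual rule base and tank parameters are not reproduced here):

```python
# A crisp rule and its fuzzy counterpart for setting a tank's outflow from
# its fill level: the fuzzy version blends the "low" and "high" rules
# instead of switching abruptly at the threshold. Numbers are illustrative.

def crisp_outflow(level):                     # level in [0, 1]
    return 1.0 if level > 0.5 else 0.2        # high level -> open outflow

def fuzzy_outflow(level):
    # Membership of "level is high": a ramp between 0.3 and 0.7.
    mu_high = min(max((level - 0.3) / 0.4, 0.0), 1.0)
    mu_low = 1.0 - mu_high
    # Weighted-average defuzzification of the two rule outputs.
    return (mu_low * 0.2 + mu_high * 1.0) / (mu_low + mu_high)

print(crisp_outflow(0.49), crisp_outflow(0.51))  # 0.2 1.0 (abrupt switch)
print(fuzzy_outflow(0.5))                        # approx. 0.6 (smooth blend)
```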
78 FR 66762 - Notice of Public Meeting and Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
... of request for public meeting and public comments on the planning of the National Christmas Tree... and suggestions on the planning of the 2013 National Christmas Tree Lighting and the subsequent 26-day... on the planning of the 2013 National Christmas Tree Lighting and the subsequent 26-day event, which...
76 FR 66082 - Notice of Public Meeting and Request for Comments
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-25
... for Public Meeting and Public Comments--The National Christmas Tree Lighting and the subsequent 31 day... the 2011 National Christmas Tree Lighting and the subsequent 31 day event. DATES: The meeting will be... and suggestions on the planning of the 2011 National Christmas Tree Lighting and the subsequent 31 day...
Tasting the Tree of Life: Development of a Collaborative, Cross-Campus, Science Outreach Meal Event.
Clement, Wendy L; Elliott, Kathryn T; Cordova-Hoyos, Okxana; Distefano, Isabel; Kearns, Kate; Kumar, Raagni; Leto, Ashley; Tumaliuan, Janis; Franchetti, Lauren; Kulesza, Evelyn; Tineo, Nicole; Mendes, Patrice; Roth, Karen; Osborn, Jeffrey M
2018-01-01
Communicating about science with the public can present a number of challenges, from participation to engagement to impact. In an effort to broadly communicate messages regarding biodiversity, evolution, and tree-thinking with the campus community at The College of New Jersey (TCNJ), a public, primarily undergraduate institution, we created a campus-wide, science-themed meal, "Tasting the Tree of Life: Exploring Biodiversity through Cuisine." We created nine meals that incorporated 149 species/ingredients across the Tree of Life. Each meal illustrated a scientific message communicated through interactions with undergraduate biology students, informational signs, and an interactive website. To promote tree-thinking, we reconstructed a phylogeny of all 149 ingredients. In total, 3,262 people attended the meal, and evaluations indicated that participants left with greater appreciation for the biodiversity and evolutionary relatedness of their food. A keynote lecture and a coordinated social media campaign enhanced the scientific messages, and media coverage extended the reach of this event. "Tasting the Tree of Life" highlights the potential of cuisine as a valuable science communication tool.
NASA Astrophysics Data System (ADS)
Di Vittorio, Alan V.; Negrón-Juárez, Robinson I.; Higuchi, Niro; Chambers, Jeffrey Q.
2014-03-01
Debate continues over the adequacy of existing field plots to sufficiently capture Amazon forest dynamics to estimate regional forest carbon balance. Tree mortality dynamics are particularly uncertain due to the difficulty of observing large, infrequent disturbances. A recent paper (Chambers et al 2013 Proc. Natl Acad. Sci. 110 3949-54) reported that Central Amazon plots missed 9-17% of tree mortality, and here we address ‘why’ by elucidating two distinct mortality components: (1) variation in annual landscape-scale average mortality and (2) the frequency distribution of the size of clustered mortality events. Using a stochastic-empirical tree growth model we show that a power law distribution of event size (based on merged plot and satellite data) is required to generate spatial clustering of mortality that is consistent with forest gap observations. We conclude that existing plots do not sufficiently capture losses because their placement, size, and longevity assume spatially random mortality, while mortality is actually distributed among differently sized events (clusters of dead trees) that determine the spatial structure of forest canopies.
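The model's key ingredient, a power-law distribution of mortality-event sizes, can be sketched by inverse-CDF sampling; the exponent and truncation below are illustrative assumptions, not the fitted values:

```python
# Draw the size of each mortality event (number of trees killed together)
# from a truncated power law: most events are single trees, but rare events
# kill large clusters and so dominate the spatial structure of gaps.
import random

_RNG = random.Random(0)  # fixed seed keeps the example deterministic

def sample_event_size(alpha=2.2, max_size=1000, rng=_RNG):
    """Inverse-CDF draw from a power law truncated to [1, max_size]."""
    u = rng.random()
    lo, hi = 1.0 ** (1 - alpha), float(max_size) ** (1 - alpha)
    x = (lo + u * (hi - lo)) ** (1.0 / (1 - alpha))
    return int(x)  # whole number of trees killed in the event

sizes = [sample_event_size() for _ in range(10000)]
print(min(sizes), max(sizes))                   # mostly small, rare large
print(sum(s == 1 for s in sizes) / len(sizes))  # majority single-tree events
```

This skew is exactly why small, long-lived plots undercount mortality: they are sized for the common single-tree events and rarely intersect the large clusters.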
A method for investigating relative timing information on phylogenetic trees.
Ford, Daniel; Matsen, Frederick A; Stadler, Tanja
2009-04-01
In this paper, we present a new way to describe the timing of branching events in phylogenetic trees. Our description is in terms of the relative timing of diversification events between sister clades; as such it is complementary to existing methods using lineages-through-time plots which consider diversification in aggregate. The method can be applied to look for evidence of diversification happening in lineage-specific "bursts", or the opposite, where diversification between 2 clades happens in an unusually regular fashion. In order to be able to distinguish interesting events from stochasticity, we discuss 2 classes of neutral models on trees with relative timing information and develop a statistical framework for testing these models. These model classes include both the coalescent with ancestral population size variation and global rate speciation-extinction models. We end the paper with 2 example applications: first, we show that the evolution of the hepatitis C virus deviates from the coalescent with arbitrary population size. Second, we analyze a large tree of ants, demonstrating that a period of elevated diversification rates does not appear to have occurred in a bursting manner.
Integrative relational machine-learning for understanding drug side-effect profiles
2013-01-01
Background Drug side effects represent a common reason for stopping drug development during clinical trials. Improving our ability to understand drug side effects is necessary to reduce attrition rates during drug development as well as the risk of discovering novel side effects in available drugs. Today, most investigations deal with isolated side effects and overlook possible redundancy and their frequent co-occurrence. Results In this work, drug annotations are collected from the SIDER and DrugBank databases. Terms describing individual side effects reported in SIDER are clustered with a semantic similarity measure into term clusters (TCs). Maximal frequent itemsets are extracted from the resulting drug × TC binary table, leading to the identification of what we call side-effect profiles (SEPs). A SEP is defined as the longest combination of TCs which are shared by a significant number of drugs. Frequent SEPs are explored on the basis of integrated drug and target descriptors using two machine learning methods: decision trees and inductive logic programming. Although both methods yield explicit models, the inductive logic programming method performs relational learning and is able to exploit not only drug properties but also background knowledge. Learning efficiency is evaluated by cross-validation and direct testing with new molecules. Comparison of the two machine-learning methods shows that the inductive logic programming method displays a greater sensitivity than decision trees and successfully exploits background knowledge such as functional annotations and pathways of drug targets, thereby producing rich and expressive rules. All models and theories are available on a dedicated web site. Conclusions Side-effect profiles covering a significant number of drugs have been extracted from a drug × side-effect association table.
Background knowledge concerning both chemical and biological spaces has been combined with a relational learning method for discovering rules which explicitly characterize drug-SEP associations. These rules are successfully used for predicting SEPs associated with new drugs. PMID:23802887
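The itemset-mining step can be sketched on a toy drug × TC table (the drugs, term clusters, and support threshold below are invented for illustration; a real run would use a dedicated frequent-itemset miner rather than brute-force enumeration):

```python
# Find combinations of term clusters (TCs) shared by at least `min_support`
# drugs, then keep only the maximal ones as side-effect profiles (SEPs).
from itertools import combinations

drug_tcs = {                      # drug -> set of term clusters (toy data)
    "drugA": {"nausea", "headache", "rash"},
    "drugB": {"nausea", "headache"},
    "drugC": {"nausea", "headache", "dizziness"},
    "drugD": {"rash"},
}

def frequent_itemsets(table, min_support):
    items = sorted(set().union(*table.values()))
    frequent = []
    for k in range(1, len(items) + 1):
        for combo in combinations(items, k):
            support = sum(set(combo) <= tcs for tcs in table.values())
            if support >= min_support:
                frequent.append(frozenset(combo))
    return frequent

def maximal(itemsets):
    """A SEP is a frequent itemset not contained in any larger one."""
    return [s for s in itemsets if not any(s < t for t in itemsets)]

seps = maximal(frequent_itemsets(drug_tcs, min_support=2))
print(sorted(sorted(s) for s in seps))  # [['headache', 'nausea'], ['rash']]
```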
Integrative relational machine-learning for understanding drug side-effect profiles.
Bresso, Emmanuel; Grisoni, Renaud; Marchetti, Gino; Karaboga, Arnaud Sinan; Souchet, Michel; Devignes, Marie-Dominique; Smaïl-Tabbone, Malika
2013-06-26
Drug side effects represent a common reason for stopping drug development during clinical trials. Improving our ability to understand drug side effects is necessary to reduce attrition rates during drug development as well as the risk of discovering novel side effects in available drugs. Today, most investigations deal with isolated side effects and overlook possible redundancy and their frequent co-occurrence. In this work, drug annotations are collected from the SIDER and DrugBank databases. Terms describing individual side effects reported in SIDER are clustered with a semantic similarity measure into term clusters (TCs). Maximal frequent itemsets are extracted from the resulting drug × TC binary table, leading to the identification of what we call side-effect profiles (SEPs). A SEP is defined as the longest combination of TCs which are shared by a significant number of drugs. Frequent SEPs are explored on the basis of integrated drug and target descriptors using two machine learning methods: decision trees and inductive logic programming. Although both methods yield explicit models, the inductive logic programming method performs relational learning and is able to exploit not only drug properties but also background knowledge. Learning efficiency is evaluated by cross-validation and direct testing with new molecules. Comparison of the two machine-learning methods shows that the inductive logic programming method displays a greater sensitivity than decision trees and successfully exploits background knowledge such as functional annotations and pathways of drug targets, thereby producing rich and expressive rules. All models and theories are available on a dedicated web site. Side-effect profiles covering a significant number of drugs have been extracted from a drug × side-effect association table.
Background knowledge concerning both chemical and biological spaces has been combined with a relational learning method for discovering rules which explicitly characterize drug-SEP associations. These rules are successfully used for predicting SEPs associated with new drugs.
Responses of tree species to heat waves and extreme heat events.
Teskey, Robert; Wertin, Timothy; Bauweraerts, Ingvar; Ameye, Maarten; McGuire, Mary Anne; Steppe, Kathy
2015-09-01
The number and intensity of heat waves have increased, and this trend is likely to continue throughout the 21st century. Often, heat waves are accompanied by drought conditions. It is projected that the global land area experiencing heat waves will double by 2020, and quadruple by 2040. Extreme heat events can impact a wide variety of tree functions. At the leaf level, photosynthesis is reduced, photooxidative stress increases, leaves abscise and the growth rate of remaining leaves decreases. In some species, stomatal conductance increases at high temperatures, which may be a mechanism for leaf cooling. At the whole plant level, heat stress can decrease growth and shift biomass allocation. When drought stress accompanies heat waves, the negative effects of heat stress are exacerbated and can lead to tree mortality. However, some species exhibit remarkable tolerance to thermal stress. Responses include changes that minimize stress on photosynthesis and reductions in dark respiration. Although there have been few studies to date, there is evidence of within-species genetic variation in thermal tolerance, which could be important to exploit in production forestry systems. Understanding the mechanisms of differing tree responses to extreme temperature events may be critically important for understanding how tree species will be affected by climate change. © 2014 John Wiley & Sons Ltd.
System level latchup mitigation for single event and transient radiation effects on electronics
Kimbrough, J.R.; Colella, N.J.
1997-09-30
A "blink" technique, analogous to a person blinking at a flash of bright light, is provided for mitigating the effects of single event current latchup and prompt pulse destructive radiation on a micro-electronic circuit. The system includes event detection circuitry, power dump logic circuitry, and energy limiting measures with autonomous recovery. The event detection circuitry includes ionizing radiation pulse detection means for detecting a pulse of ionizing radiation and for providing at an output terminal thereof a detection signal indicative of the detection of a pulse of ionizing radiation. The current sensing circuitry is coupled to the power bus for determining an occurrence of excess current through the power bus caused by ionizing radiation or by ion-induced destructive latchup of a semiconductor device. The power dump circuitry includes power dump logic circuitry having a first input terminal connected to the output terminal of the ionizing radiation pulse detection circuitry and having a second input terminal connected to the output terminal of the current sensing circuitry. The power dump logic circuitry provides an output signal to the input terminal of the circuitry for opening the power bus and the circuitry for shorting the power bus to a ground potential to remove power from the power bus. The energy limiting circuitry with autonomous recovery includes circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential. The circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential includes a series FET and a shunt FET. The invention provides for self-contained sensing for latchup, first removal of power to protect latched components, and autonomous recovery to enable transparent operation of other system elements. 18 figs.
System level latchup mitigation for single event and transient radiation effects on electronics
Kimbrough, Joseph Robert; Colella, Nicholas John
1997-01-01
A "blink" technique, analogous to a person blinking at a flash of bright light, is provided for mitigating the effects of single event current latchup and prompt pulse destructive radiation on a micro-electronic circuit. The system includes event detection circuitry, power dump logic circuitry, and energy limiting measures with autonomous recovery. The event detection circuitry includes ionizing radiation pulse detection means for detecting a pulse of ionizing radiation and for providing at an output terminal thereof a detection signal indicative of the detection of a pulse of ionizing radiation. The current sensing circuitry is coupled to the power bus for determining an occurrence of excess current through the power bus caused by ionizing radiation or by ion-induced destructive latchup of a semiconductor device. The power dump circuitry includes power dump logic circuitry having a first input terminal connected to the output terminal of the ionizing radiation pulse detection circuitry and having a second input terminal connected to the output terminal of the current sensing circuitry. The power dump logic circuitry provides an output signal to the input terminal of the circuitry for opening the power bus and the circuitry for shorting the power bus to a ground potential to remove power from the power bus. The energy limiting circuitry with autonomous recovery includes circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential. The circuitry for opening the power bus and circuitry for shorting the power bus to a ground potential includes a series FET and a shunt FET. The invention provides for self-contained sensing for latchup, first removal of power to protect latched components, and autonomous recovery to enable transparent operation of other system elements.
NASA Astrophysics Data System (ADS)
Lev, S. M.; Gallo, J.
2017-12-01
The international Arctic scientific community has identified the need for a sustained and integrated portfolio of pan-Arctic Earth-observing systems. In 2017, an international effort was undertaken to develop the first ever Value Tree framework for identifying common research and operational objectives that rely on Earth observation data derived from Earth-observing systems, sensors, surveys, networks, models, and databases to deliver societal benefits in the Arctic. A Value Tree Analysis is a common tool used to support decision making processes and is useful for defining concepts, identifying objectives, and creating a hierarchical framework of objectives. A multi-level societal benefit area value tree establishes the connection from societal benefits to the set of observation inputs that contribute to delivering those benefits. A Value Tree that relies on expert domain knowledge from Arctic and non-Arctic nations, international researchers, Indigenous knowledge holders, and other experts to develop a framework to serve as a logical and interdependent decision support tool will be presented. Value tree examples that map the contribution of Earth observations in the Arctic to achieving societal benefits will be presented in the context of the 2017 International Arctic Observations Assessment Framework. These case studies will highlight specific observing products and capability groups where investment is needed to contribute to the development of a sustained portfolio of Arctic observing systems.
NASA Astrophysics Data System (ADS)
Ghosh, Amal K.
2010-09-01
The parity generators and checkers are among the most important circuits in communication systems. With the development of multi-valued logic (MVL), parity generators and checkers in the modified trinary number (MTN) system are increasingly in demand and can be implemented with recently developed optoelectronic technology. The proposed system also meets the tremendous demand for speed by exploiting savart plates and spatial light modulators (SLM) in the optical tree architecture (OTA).
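The underlying parity logic, independent of the optical MTN realization (which the abstract does not detail), can be sketched with a conventional binary even-parity generator and checker:

```python
def parity_bit(bits):
    """Even-parity generator: the bit that makes the count of 1s even."""
    p = 0
    for b in bits:
        p ^= b
    return p

def parity_check(bits_with_parity):
    """Checker: True if the word (data + parity bit) has even parity."""
    p = 0
    for b in bits_with_parity:
        p ^= b
    return p == 0

data = [1, 0, 1, 1]
p = parity_bit(data)              # 1, making the total count of 1s even
assert parity_check(data + [p])
assert not parity_check(data + [1 - p])   # a flipped parity bit is caught
```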
Cooperative answers in database systems
NASA Technical Reports Server (NTRS)
Gaasterland, Terry; Godfrey, Parke; Minker, Jack; Novik, Lev
1993-01-01
A major concern of researchers who seek to improve human-computer communication involves how to move beyond literal interpretations of queries to a level of responsiveness that takes the user's misconceptions, expectations, desires, and interests into consideration. At Maryland, we are investigating how to better meet a user's needs within the framework of the cooperative answering system of Gal and Minker. We have been exploring how to use semantic information about the database to formulate coherent and informative answers. The work has two main thrusts: (1) the construction of a logic formula which embodies the content of a cooperative answer; and (2) the presentation of the logic formula to the user in a natural language form. The information that is available in a deductive database system for building cooperative answers includes integrity constraints, user constraints, the search tree for answers to the query, and false presuppositions that are present in the query. The basic cooperative answering theory of Gal and Minker forms the foundation of a cooperative answering system that integrates the new construction and presentation methods. This paper provides an overview of the cooperative answering strategies used in the CARMIN cooperative answering system, an ongoing research effort at Maryland. Section 2 gives some useful background definitions. Section 3 describes techniques for collecting cooperative logical formulae. Section 4 discusses which natural language generation techniques are useful for presenting the logic formula in natural language text. Section 5 presents a diagram of the system.
NASA Astrophysics Data System (ADS)
Griebel, A.; Maier, C.; Barton, C. V.; Metzen, D.; Renchon, A.; Boer, M. M.; Pendall, E.
2017-12-01
Mistletoe is a globally distributed group of parasitic plants that infiltrates the vascular tissue of its host trees to acquire water, carbon and nutrients, making it a leading agent of biotic disturbance. Many mistletoes occur in water-limited ecosystems, thus mistletoe infection in combination with increased climatic stress may exacerbate water stress and potentially accelerate mortality rates of infected trees during extreme events. This is an emerging problem in Australia, as mistletoe distribution is increasing and clear links between mistletoe infection and mortality have been established. However, direct observations about how mistletoes alter host physiological processes during extreme events are rare, which impedes our understanding of mechanisms underlying increased tree mortality rates. We addressed this gap by continuously monitoring stem and branch sap flow and a range of leaf traits of infected and uninfected trees of two co-occurring eucalypt species during a severe heatwave in south-eastern Australia. We demonstrate that mistletoes' leaf water potentials were maintained 30% lower than hosts' to redirect the trees' transpiration flow path towards mistletoe leaves. Eucalypt leaves reduced water loss through stomatal regulation when atmospheric dryness exceeded 2 kPa, but the magnitude of stomatal regulation in non-infected eucalypts differed by species (between 40-80%). Remarkably, when infected, sap flow rates of stems and branches of both eucalypt species remained unregulated even under extreme atmospheric dryness (>8 kPa). Our observations indicate that excessive water use of mistletoes likely increases xylem cavitation rates in hosts during prolonged droughts and supports that hydraulic failure contributes to increased mortality of infected trees. 
Hence, in order to accurately model the contribution of biotic disturbances to tree mortality under a changing climate, it will be crucial to increase our process-based understanding of the interaction between biotic and abiotic dynamics, especially to establish thresholds of critical cavitation rates of infected trees.
Quantum theory as plausible reasoning applied to data obtained by robust experiments.
De Raedt, H; Katsnelson, M I; Michielsen, K
2016-05-28
We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out yields, without introducing any concept of quantum theory, the quantum theoretical description, in terms of the Schrödinger or the Pauli equation, of the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).
User's guide to the Event Monitor: Part of Prognosis Model Version 6
Nicholas L. Crookston
1990-01-01
Describes how to use the Event Monitor to dynamically invoke management activities in the Prognosis Model for Stand Development. The program accepts statements of conditions -- expressed as logical expressions of stand-state variables -- to be met and sets of activities to be simulated when the conditions are met. The combination of a condition and a set of activities...
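The condition/activity pairing described above can be sketched as a tiny rule table; the stand-state variables, thresholds, and activity names here are purely illustrative, not taken from the Prognosis Model:

```python
# Hypothetical stand state; variable names are illustrative only.
stand = {"age": 45, "basal_area": 120.0, "trees_per_acre": 300}

# A rule pairs a condition (a logical expression over stand-state
# variables) with the activities to simulate when it is met.
rules = [
    (lambda s: s["age"] >= 40 and s["basal_area"] > 100.0, ["thin_from_below"]),
    (lambda s: s["trees_per_acre"] > 500, ["precommercial_thin"]),
]

# Collect every activity whose condition holds for the current state.
triggered = [act for cond, acts in rules if cond(stand) for act in acts]
assert triggered == ["thin_from_below"]
```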
McTavish, Emily Jane; Steel, Mike; Holder, Mark T
2015-12-01
Statistically consistent estimation of phylogenetic trees or gene trees is possible if pairwise sequence dissimilarities can be converted to a set of distances that are proportional to the true evolutionary distances. Susko et al. (2004) reported some strikingly broad results about the forms of inconsistency in tree estimation that can arise if corrected distances are not proportional to the true distances. They showed that if the corrected distance is a concave function of the true distance, then inconsistency due to long branch attraction will occur. If these functions are convex, then two "long branch repulsion" trees will be preferred over the true tree, though these two incorrect trees are expected to be tied as the preferred tree. Here we extend their results, and demonstrate the existence of a tree shape (which we refer to as a "twisted Farris-zone" tree) for which a single incorrect tree topology will be guaranteed to be preferred if the corrected distance function is convex. We also report that the standard practice of treating gaps in sequence alignments as missing data is sufficient to produce non-linear corrected distance functions if the substitution process is not independent of the insertion/deletion process. Taken together, these results imply inconsistent tree inference under mild conditions. For example, if some positions in a sequence are constrained to be free of substitutions and insertion/deletion events while the remaining sites evolve with independent substitutions and insertion/deletion events, then the distances obtained by treating gaps as missing data can support an incorrect tree topology even given an unlimited amount of data. Copyright © 2015 Elsevier Inc. All rights reserved.
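As a concrete example of a corrected distance, the standard Jukes-Cantor correction (not specific to this paper, but illustrative of how nonlinearity enters) converts an observed proportion of differing sites into an estimated number of substitutions per site:

```python
import math

def jc_corrected(p):
    """Jukes-Cantor distance correction. Exact under the JC69 model;
    when the true process differs, the corrected distance becomes a
    nonlinear (concave or convex) function of the true distance, which
    is what drives the inconsistency results discussed above."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

# Small p: the correction is nearly the identity.
assert abs(jc_corrected(0.01) - 0.01) < 1e-3
# Larger p: the correction inflates the distance, and does so
# faster than proportionally.
assert jc_corrected(0.30) > 0.30
assert jc_corrected(0.60) / 0.60 > jc_corrected(0.30) / 0.30
```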
Goel, Vinod; Dolan, Raymond J
2003-12-01
Logic is widely considered the basis of rationality. Logical choices, however, are often influenced by emotional responses, sometimes to our detriment, sometimes to our advantage. To understand the neural basis of emotionally neutral ("cold") and emotionally salient ("hot") reasoning we studied 19 volunteers using event-related fMRI, as they made logical judgments about arguments that varied in emotional saliency. Despite identical logical form and content categories across "hot" and "cold" reasoning conditions, lateral and ventral medial prefrontal cortex showed reciprocal response patterns as a function of emotional saliency of content. "Cold" reasoning trials resulted in enhanced activity in lateral/dorsal lateral prefrontal cortex (L/DLPFC) and suppression of activity in ventral medial prefrontal cortex (VMPFC). By contrast, "hot" reasoning trials resulted in enhanced activation in VMPFC and suppression of activation in L/DLPFC. This reciprocal engagement of L/DLPFC and VMPFC provides evidence for a dynamic neural system for reasoning, the configuration of which is strongly influenced by emotional saliency.
A method for feature selection of APT samples based on entropy
NASA Astrophysics Data System (ADS)
Du, Zhenyu; Li, Yihong; Hu, Jinsong
2018-05-01
By studying known APT attack events in depth, this paper proposes a feature selection method for APT samples and a logic expression generation algorithm, IOCG (Indicator of Compromise Generate). The algorithm automatically generates machine-readable IOCs (Indicators of Compromise), addressing the limitations of existing IOCs, whose logical relationships are fixed, whose number of logical items cannot change, and which are too large in scale to be generated automatically from samples. At the same time, it reduces redundant and useless processing time for APT samples, improves the sharing rate of analysis information, and supports an active response to a complex and volatile APT attack situation. The samples were divided into an experimental set and a training set, and the algorithm was used to generate logical expressions for the training set with the IOC_Aware plug-in; the generated expressions were then compared both in form and in detection results. The experimental results show that the algorithm is effective and can improve detection.
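The abstract does not give the IOCG algorithm itself, but the idea of a machine-readable IOC as a logical expression over indicator terms can be sketched as follows; the indicator strings and the tree shape are hypothetical:

```python
# A machine-readable IOC as a nested AND/OR expression over indicator
# terms. The concrete indicators here are invented for illustration.
ioc = ("and",
       ("or", "md5:44d8...", "domain:evil.example"),
       "mutex:Global\\apt_marker")

def matches(expr, observed):
    """Evaluate an ('and', ...) / ('or', ...) tree against the set of
    indicators observed in a sample."""
    if isinstance(expr, str):
        return expr in observed
    op, *args = expr
    combine = all if op == "and" else any
    return combine(matches(a, observed) for a in args)

sample = {"domain:evil.example", "mutex:Global\\apt_marker"}
assert matches(ioc, sample)
assert not matches(ioc, {"domain:evil.example"})   # mutex term missing
```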
Effects of the 2015/16 ENSO event on tropical trees in regrowing secondary forests in Central Panama
NASA Astrophysics Data System (ADS)
Bretfeld, M.; Ewers, B. E.; Hall, J. S.; Ogden, F. L.
2016-12-01
The 2015/16 El Niño-Southern Oscillation (ENSO) event ranks amongst the driest and hottest periods on record in Panama, with severe drought conditions reported for over 90% of the country. A predicted long-term transition into a drier climatic period makes this event an ideal opportunity to study the effects of drought on tropical tree species in secondary forests of central Panama. These forests are associated with desirable hydrological ecosystem services, characterized by reduced peak runoff during high precipitation events in the rainy season and increased base flow during the dry season ("sponge-effect"), making these forests invaluable for water provisioning for the Panama Canal's $2 billion business and Panama's thriving capital city. Starting in February 2015, we installed heat-ratio sap flow sensors in 76 trees (representing 42 different species) in secondary forests of three different ages (8, 25, and 80+ years) in the 15 km2 Agua Salud study area, located in the Panama Canal Watershed. Within each site, trees were selected to represent local tree size distribution. Additional sensors were installed on the roots of a subset of trees. Sap flow data were logged every 30 minutes and soil moisture was measured every 3 minutes at 10, 30, 50, and 100 cm depth. Pre-dawn, mid-day, and pre-dusk leaf water potentials were measured during the dry season (March 2016) and rainy season (July 2016). Meteorological data were taken from a nearby met-station ("Celestino"). Primary drivers of transpiration were vapor pressure deficit and solar radiation. Trees of the 25 and 80+ year old forests appeared not to be water limited during the dry season following ENSO, while reduced sap flow rates of trees in the 8 year old forest were indicative of a regulatory response to the drought. Younger understory trees in the 80+ year old forest showed no signs of a drought response. 
Throughout most of the dry season, volumetric water content at 30 and 50 cm depths was 8% lower in the 8 year old forest than in the 80+ year old forest. Our data indicate higher resilience to drought in older forest and support that hydrological properties improve as secondary forests mature in central Panama.
NASA Astrophysics Data System (ADS)
Wang, F.; Gu, L.; Guha, A.; Han, J.; Warren, J.
2017-12-01
The current projections for global climate change forecast an increase in the intensity and frequency of extreme climatic events, such as droughts and short-term heat waves. Understanding the effects of short-term heat waves on photosynthesis is critically important for predicting the global impacts of extreme weather events on vegetation. The diurnal and seasonal characteristics of solar-induced chlorophyll fluorescence (SIF) emitted from natural vegetation, e.g., forests and crops, have been studied at the ecosystem, regional, and global scales. However, the detailed response of SIF from different plant species under extreme weather events, especially short-term heat waves, has not been reported. The purpose of this study was to characterize the leaf-scale response of SIF, gas exchange, and continuous fluorescence for different temperate tree species. The short-term heatwave experiment was conducted in a plant growth chamber (CMP6050, Conviron Inc., Canada). We developed an advanced spectral fitting method to obtain plant SIF in the growth chamber, and compared SIF variation among wavelengths and chlorophyll differences among four temperate tree species. The diurnal variation of leaf-scale SIF signals differed among the temperate tree species under heat stress, as did their responses over a cycle of short-term heatwave stress. We infer that SIF can be used as a measure of heat tolerance for temperate tree species.
Assessment of Methods to Determine Tree Ring Response to Large Magnitude Mississippi River Floods
NASA Astrophysics Data System (ADS)
Therrell, M. D.; Meko, M. D.; Bialecki, M.; Remo, J. W.
2017-12-01
Riparian trees that experience prolonged inundation can record major flood events as inter-and intra-annual variability in size, shape and arrangement of vessels in the annual xylem growth increment. As part of an NSF-funded project to develop tree-ring records of past flooding, we have made collections of several oak species (e.g., Quercus lyrata, Q. macrocarpa) at six sites in the Mississippi River Basin. At each of these sites sampled trees exhibit notably anomalous anatomy of growth increments formed in years coinciding with major recorded floods. We have used these "flood rings" to develop individual site chronologies as well as a regional chronology of spring flood events in the basin for the past several hundred years. We have also analyzed earlywood vessel diameter as a proxy for flooding and find that although this variable reflects only a fraction of the annual-growth increment it strongly reflects tree response to flooding at all the sites so far examined. We compare both these chronologies with the instrumental and historical record of flooding and find that our chronologies are recording nearly all large observed Mississippi River floods in the 20th century, and provide a new record of similar events in the 18th and 19th centuries. These results suggest that tree-rings can be effectively used to develop and improve pre-instrumental flood records throughout the basin and potentially other similar systems.
Cun-Yang Niu; Frederick C. Meinzer; Guang-You Hao
2017-01-01
1. In temperate ecosystems, freeze-thaw events are an important environmental stress that can induce severe xylem embolism (i.e. clogging of conduits by air bubbles) in overwintering organs of trees. However, no comparative studies of different adaptive strategies among sympatric tree species for coping with winter embolism have examined the potential role of the...
A Multi-Encoding Approach for LTL Symbolic Satisfiability Checking
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2011-01-01
Formal behavioral specifications written early in the system-design process and communicated across all design phases have been shown to increase the efficiency, consistency, and quality of the system under development. To prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. Our focus here is on specifications expressed in linear temporal logic (LTL). We introduce a novel encoding of symbolic transition-based Büchi automata and a novel, "sloppy," transition encoding, both of which result in improved scalability. We also define novel BDD variable orders based on tree decomposition of formula parse trees. We describe and extensively test a new multi-encoding approach utilizing these novel encoding techniques to create 30 encoding variations. We show that our novel encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking.
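One simple way to read a BDD variable order off formula structure (far cruder than the tree-decomposition orders evaluated in the paper, but illustrative of the idea) is first-occurrence order in a depth-first traversal of the parse tree:

```python
def parse_tree_order(node, order=None):
    """Candidate BDD variable order: atomic propositions in order of
    first occurrence during a depth-first walk of a tuple-encoded
    parse tree. Internal nodes are (operator, subtree, ...)."""
    order = [] if order is None else order
    if isinstance(node, str):          # leaf: atomic proposition
        if node not in order:
            order.append(node)
    else:
        for child in node[1:]:
            parse_tree_order(child, order)
    return order

# G(p -> F q) & G(r -> F p), as a tuple-encoded parse tree.
formula = ("&", ("G", ("->", "p", ("F", "q"))),
                ("G", ("->", "r", ("F", "p"))))
assert parse_tree_order(formula) == ["p", "q", "r"]
```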
Strip-Bark Morphology and Radial Growth Trends in Ancient Pinus sibirica Trees From Central Mongolia
NASA Astrophysics Data System (ADS)
Leland, Caroline; Cook, Edward R.; Andreu-Hayles, Laia; Pederson, Neil; Hessl, Amy; Anchukaitis, Kevin J.; Byambasuren, Oyunsanaa; Nachin, Baatarbileg; Davi, Nicole; D'Arrigo, Rosanne; Griffin, Kevin; Bishop, Daniel A.; Rao, Mukund Palat
2018-03-01
Some of the oldest and most important trees used for dendroclimatic reconstructions develop strip-bark morphology, in which only a portion of the stem contains living tissue. Yet the ecophysiological factors initiating strip bark and the potential effect of cambial dieback on annual ring widths and tree-ring estimates of past climate remain poorly understood. Using a combination of field observations and tree-ring data, we investigate the causes and timing of cambial dieback events in Pinus sibirica strip-bark trees from central Mongolia and compare the radial growth rates and trends of strip-bark and whole-bark trees over the past 515 years. Results indicate that strip bark is more common on the southern aspect of trees, and dieback events were most prevalent in the 19th century, a cold and dry period. Further, strip-bark and whole-bark trees have differing centennial trends, with strip-bark trees exhibiting notably large increases in ring widths at the beginning of the 20th century. We find a steeper positive trend in the strip-bark chronology relative to the whole-bark chronology when standardizing with age-dependent splines. We hypothesize that localized warming on the southern side of stems due to solar irradiance results in physiological damage and dieback and leads to increasing tree-ring increment along the living portion of strip-bark trees. Because the impact of cambial dieback on ring widths likely varies depending on species and site, we suggest conducting a comparison of strip-bark and whole-bark ring widths before statistically treating ring-width data for climate reconstructions.
The use of minimal spanning trees in particle physics
Rainbolt, J. Lovelace; Schmitt, M.
2017-02-14
Minimal spanning trees (MSTs) have been used in cosmology and astronomy to distinguish distributions of points in a multi-dimensional space. They are essentially unknown in particle physics, however. We briefly define MSTs and illustrate their properties through a series of examples. We show how they might be applied to study a typical event sample from a collider experiment and conclude that MSTs may prove useful in distinguishing different classes of events.
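A minimal sketch of an MST statistic for such a point sample, using Prim's algorithm on Euclidean points; the choice of metric and of total edge length as the summary variable is an assumption for illustration, not taken from the paper:

```python
import math

def mst_length(points):
    """Total edge length of the Euclidean MST of a point set, via
    Prim's algorithm in O(n^2). In the collider setting sketched above,
    each point could represent a reconstructed object in an event, and
    the MST length serves as one shape variable for separating classes."""
    n = len(points)
    best = {i: math.dist(points[0], points[i]) for i in range(1, n)}
    total = 0.0
    while best:
        j = min(best, key=best.get)          # cheapest attachment
        total += best.pop(j)
        for k in best:                        # relax remaining points
            best[k] = min(best[k], math.dist(points[j], points[k]))
    return total

# A tight cluster yields a shorter MST than well-separated points.
cluster = [(0, 0), (0.1, 0), (0, 0.1), (0.1, 0.1)]
spread = [(0, 0), (1, 0), (0, 1), (1, 1)]
assert mst_length(cluster) < mst_length(spread)
assert abs(mst_length(spread) - 3.0) < 1e-9
```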
Temporal Association Between Nonfatal Self-Directed Violence and Tree and Grass Pollen Counts.
Jeon-Slaughter, Haekyung; Claassen, Cynthia A; Khan, David A; Mihalakos, Perry; Lee, Kevin B; Brown, E Sherwood
2016-09-01
Prior research suggests a possible association between pollen and suicide. No studies have examined the relationship between pollen and attempted suicide. This study examines the temporal association between airborne pollen counts and nonfatal suicidal and nonsuicidal self-directed violence (SDV) requiring an emergency department visit. Data on daily emergency department visits due to nonfatal SDV as identified by ICD-9 diagnosis criteria were extracted from emergency department medical records of Parkland Memorial Hospital in Dallas, Texas, between January 2000 and December 2003. Concurrent daily airborne tree, grass, and ragweed pollen data from the city of Dallas were extracted from the National Allergy Bureau online database. The data were analyzed using the time series method of generalized autoregressive conditional heteroskedasticity. There were statistically significant and positive temporal associations between tree pollen counts and the number of nonfatal SDV events among women (P = .04) and between grass pollen counts and number of nonfatal SDV events among both men (P = .03) and women (P < .0001). There was no significant temporal association found between ragweed pollen counts and number of nonfatal SDV events. The study findings suggest that an increase in nonfatal SDV is associated with changes in tree and grass pollen counts. This is the first study that has examined an association between seasonal variation in tree and grass pollen levels and nonfatal SDV event data. The study also used a narrowly defined geographic area and temporal window. The findings suggest that pollen count may be a factor influencing seasonal patterns in suicidal behavior. © Copyright 2016 Physicians Postgraduate Press, Inc.
Baneshi, Mohammad Reza; Haghdoost, Ali Akbar; Zolala, Farzaneh; Nakhaee, Nouzar; Jalali, Maryam; Tabrizi, Reza; Akbari, Maryam
2017-04-01
This study aimed to assess using tree-based models the impact of different dimensions of religion and other risk factors on suicide attempts in the Islamic Republic of Iran. Three hundred patients who attempted suicide and 300 age- and sex-matched patient attendants with other types of disease who had been referred to Kerman Afzalipour Hospital were recruited for this study following convenience sampling. Religiosity was assessed by the Duke University Religion Index. A tree-based model was constructed using the Gini Index as the homogeneity criterion. A complementary discrimination analysis was also applied. Variables contributing to the construction of the tree were stressful life events, mental disorder, family support, and religious belief. Strong religious belief was a protective factor for those with a low number of stressful life events and those with a high mental disorder score; 72% of those who formed these two groups had not attempted suicide. Moreover, 63% of those with a high number of stressful life events, strong family support, strong problem-solving skills, and a low mental disorder score were less likely to attempt suicide. The significance of four other variables, GHQ, problem-coping skills, friend support, and neuroticism, was revealed in the discrimination analysis. Religious beliefs seem to be an independent factor that can predict risk for suicidal behavior. Based on the decision tree, religious beliefs among people with a high number of stressful life events might not be a dissuading factor. Such subjects need more family support and problem-solving skills.
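The Gini Index used above as the homogeneity criterion can be sketched as follows; the binary labels are illustrative:

```python
def gini(labels):
    """Gini impurity of a set of class labels: 1 - sum of squared
    class proportions. 0 for a pure node, 0.5 for a 50/50 binary mix."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def split_gini(left, right):
    """Size-weighted Gini impurity of a binary split; tree growing
    picks the split that minimizes this."""
    n = len(left) + len(right)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# 1 = attempted suicide, 0 = control (labels are illustrative).
assert gini([0, 0, 1, 1]) == 0.5          # maximally mixed node
assert gini([1, 1, 1, 1]) == 0.0          # pure node
# A split that separates the classes perfectly drives impurity to 0.
assert split_gini([0, 0], [1, 1]) == 0.0
```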
F-15 digital electronic engine control system description
NASA Technical Reports Server (NTRS)
Myers, L. P.
1984-01-01
A digital electronic engine control (DEEC) was developed for use on the F100-PW-100 turbofan engine. This control system has full authority control, capable of moving all the controlled variables over their full ranges. The digital computational electronics and fault detection and accommodation logic maintain safe engine operation. A hydromechanical backup control (BUC) is an integral part of the fuel metering unit and provides gas generator control at a reduced performance level in the event of an electronics failure. The DEEC's features, hardware, and major logic diagrams are described.
Quantum Logic with Cavity Photons From Single Atoms.
Holleczek, Annemarie; Barter, Oliver; Rubenok, Allison; Dilley, Jerome; Nisbet-Jones, Peter B R; Langfahl-Klabes, Gunnar; Marshall, Graham D; Sparrow, Chris; O'Brien, Jeremy L; Poulios, Konstantinos; Kuhn, Axel; Matthews, Jonathan C F
2016-07-08
We demonstrate quantum logic using narrow linewidth photons that are produced with an a priori nonprobabilistic scheme from a single ^{87}Rb atom strongly coupled to a high-finesse cavity. We use a controlled-not gate integrated into a photonic chip to entangle these photons, and we observe nonclassical correlations between photon detection events separated by periods exceeding the travel time across the chip by 3 orders of magnitude. This enables quantum technology that will use the properties of both narrow-band single photon sources and integrated quantum photonics.
Historical vegetation change in Oakland and its implications for urban forest management
David J. Nowak
1993-01-01
The history of Oakland, California's urban forest was researched to determine events that could influence future urban forests. Vegetation in Oakland has changed drastically from a preurbanized area with approximately 2% tree cover to a present tree cover of 19%. Species composition of trees was previously dominated by coast live oak (Quercus agrifolia...
Andrew J. Larson; Jerry F. Franklin
2005-01-01
We investigated the effect of fire severity and environmental conditions on conifer tree regeneration 11 years after an autumn wildfire in the western Oregon Cascade Range. Conifer tree seedlings, including those of Pseudotsuga menziesii, established promptly and at high densities following fire, in contrast to long establishment periods documented...
Reliability computation using fault tree analysis
NASA Technical Reports Server (NTRS)
Chelson, P. O.
1971-01-01
A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
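For the simplest case of independent basic events, the top-event probability can be computed recursively over AND/OR gates. This is a minimal sketch; the paper's handling of standby redundancy and of shared basic events via conditional probabilities is not reproduced here:

```python
def gate_prob(node, p):
    """Top-event probability of a fault tree with independent basic
    events: AND gates multiply probabilities, OR gates combine as
    1 - prod(1 - p_i). A basic event is a string keying into p."""
    if isinstance(node, str):
        return p[node]
    op, *children = node
    probs = [gate_prob(c, p) for c in children]
    out = 1.0
    if op == "and":
        for q in probs:
            out *= q
        return out
    for q in probs:            # "or"
        out *= 1.0 - q
    return 1.0 - out

# Hypothetical tree: backup must fail AND (pump fails OR valve sticks).
p = {"pump_fails": 0.01, "valve_sticks": 0.02, "backup_fails": 0.1}
tree = ("and", ("or", "pump_fails", "valve_sticks"), "backup_fails")
assert abs(gate_prob(tree, p) - (1 - 0.99 * 0.98) * 0.1) < 1e-12
```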
A framework for adapting urban forests to climate change
Leslie Brandt; Abigail Derby Lewis; Robert Fahey; Lydia Scott; Lindsay Darling; Chris Swanston
2016-01-01
Planting urban trees and expanding urban forest canopy cover are often considered key strategies for reducing climate change impacts in urban areas. However, urban trees and forests can also be vulnerable to climate change through shifts in tree habitat suitability, changes in pests and diseases, and changes in extreme weather events. We developed a three-step...
Betinna M.J. Engelbrecht; S. Joseph Wright; Diane De Steven
2002-01-01
In tropical forests, severe droughts caused by El Nino events may strongly influence the water relations of tree seedlings and thereby increase their mortality. Data on known-aged seedlings of three common shade-tolerant canopy tree species (Trichilia tuberculata, Tetragastris panamensis and Quararibea asterolepis) in a Panamanian...
Generation of a mixture model ground-motion prediction equation for Northern Chile
NASA Astrophysics Data System (ADS)
Haendel, A.; Kuehn, N. M.; Scherbaum, F.
2012-12-01
In probabilistic seismic hazard analysis (PSHA) empirically derived ground motion prediction equations (GMPEs) are usually applied to estimate the ground motion at a site of interest as a function of source, path and site related predictor variables. Because GMPEs are derived from limited datasets they are not expected to give entirely accurate estimates or to reflect the whole range of possible future ground motion, thus giving rise to epistemic uncertainty in the hazard estimates. This is especially true for regions without an indigenous GMPE where foreign models have to be applied. The choice of appropriate GMPEs can then dominate the overall uncertainty in hazard assessments. In order to quantify this uncertainty, the set of ground motion models used in a modern PSHA has to capture (in SSHAC language) the center, body, and range of the possible ground motion at the site of interest. This was traditionally done within a logic tree framework in which existing (or only slightly modified) GMPEs occupy the branches of the tree and the branch weights describe the degree-of-belief of the analyst in their applicability. This approach invites the problem of combining GMPEs of very different quality and hence of potentially overestimating epistemic uncertainty. Some recent hazard analyses have therefore resorted to using a small number of high-quality GMPEs as backbone models, from which the full distribution of GMPEs for the logic tree (to capture the full range of possible ground motion uncertainty) was subsequently generated by scaling (in a general sense). In the present study, a new approach is proposed to determine an optimized backbone model as weighted components of a mixture model. In doing so, each GMPE is assumed to reflect the generation mechanism (e.g., in terms of stress drop, propagation properties, etc.) for at least a fraction of possible ground motions in the area of interest. 
The combination of different models into a mixture model (which is learned from observed ground motion data in the region of interest) then transfers information from other regions, in a data-driven way, to the region where the observations were produced. The backbone model is learned by comparing the model predictions to observations of the target region. For each observation and each model, the likelihood of an observation given a certain GMPE is calculated. Mixture weights can then be assigned using the expectation maximization (EM) algorithm or Bayesian inference. The new method is used to generate a backbone reference model for Northern Chile, an area for which no dedicated GMPE exists. Strong motion recordings from the target area are used to learn the backbone model from a set of 10 GMPEs developed for different subduction zones of the world. The formation of mixture models is done individually for interface and intraslab type events. The ability of the resulting backbone models to describe ground motions in Northern Chile is then compared to the predictive performance of their constituent models.
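The EM weight update for a mixture of fixed component models can be sketched as follows, assuming the per-observation likelihoods under each GMPE have already been computed; only the weights are learned, the models themselves stay fixed:

```python
def em_mixture_weights(lik, iters=200):
    """EM for mixture weights: lik[i][k] is the likelihood of
    observation i under component model k. E-step computes each
    model's responsibility for each observation; M-step averages
    the responsibilities into new weights."""
    K = len(lik[0])
    w = [1.0 / K] * K
    for _ in range(iters):
        resp_sums = [0.0] * K
        for row in lik:
            denom = sum(w[k] * row[k] for k in range(K))
            for k in range(K):
                resp_sums[k] += w[k] * row[k] / denom
        w = [s / len(lik) for s in resp_sums]
    return w

# Toy data: most observations are far better explained by model 0,
# so its mixture weight dominates.
lik = [[0.9, 0.1], [0.8, 0.2], [0.7, 0.3], [0.2, 0.8]]
w = em_mixture_weights(lik)
assert abs(sum(w) - 1.0) < 1e-9
assert w[0] > w[1]
```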
How does tree age influence damage and recovery in forests impacted by freezing rain and snow?
Zhu, LiRong; Zhou, Ting; Chen, BaoMing; Peng, ShaoLin
2015-05-01
The response and recovery mechanisms of forests to damage from freezing rain and snow events are a key topic in forest research and management. However, the relationship between the degree of damage and tree age, i.e., whether seedlings, young trees, or adult trees are most vulnerable, remains unclear and is rarely reported. We investigated the effect of tree age on the degrees of vegetation damage and subsequent recovery in three subtropical forest types (coniferous, mixed, and broad-leaved) in the Tianjing Mountains, South China, after a series of rare icy rain and freezing snow events in 2008. The results showed that damage and recovery rates were both dependent on tree age, with the proportion of damaged vegetation increasing with age (estimated by diameter at breast height, DBH) in all three forest types and gradually plateauing. Significant variation occurred among forest types. Young trees in the coniferous forest were more vulnerable than those in the broad-leaved forest. The type of damage also varied with tree age in different ways in the three forest types. The proportion of young seedlings that were uprooted (the most severe type of damage) was highest in the coniferous forest. In the mixed forest, young trees were significantly more likely to be uprooted than seedlings and adult trees, while in the broad-leaved forest, the proportion of uprooted adult trees was significantly higher than that of seedlings and young trees. There were also differences among forest types in how tree age affected damage recovery. In the coniferous forest, the recovery rate of trees with broken trunks or crowns (DBH > 2.5 cm) increased with tree age. However, in the mixed and broad-leaved forests, no obvious correlation between the recovery rate of trees with broken trunks or crowns and tree age was observed. Trees with severe root damage did not recover; they were uprooted and died. 
In these forests, vegetation damage and recovery showed tree age dependencies, which varied with tree shape, forest type, and damage type. Understanding this dependency will guide restoration after freezing rain and snow disturbances.
Diagnostic Features of Common Oral Ulcerative Lesions: An Updated Decision Tree
Safi, Yaser
2016-01-01
Diagnosis of oral ulcerative lesions might be quite challenging. This narrative review article aims to introduce an updated decision tree for diagnosing oral ulcerative lesions on the basis of their diagnostic features. Various general search engines and specialized databases including PubMed, PubMed Central, Medline Plus, EBSCO, Science Direct, Scopus, Embase, and authenticated textbooks were used to find relevant topics by means of MeSH keywords such as “oral ulcer,” “stomatitis,” and “mouth diseases.” Thereafter, English-language articles published from 1983 to 2015 in both medical and dental journals including reviews, meta-analyses, original papers, and case reports were appraised. Upon compilation of the relevant data, oral ulcerative lesions were categorized into three major groups: acute, chronic, and recurrent ulcers and into five subgroups: solitary acute, multiple acute, solitary chronic, multiple chronic, and solitary/multiple recurrent, based on the number and duration of lesions. In total, 29 entities were organized in the form of a decision tree in order to help clinicians establish a logical diagnosis by stepwise progression. PMID:27781066
NASA Astrophysics Data System (ADS)
Šilhán, Karel
2017-01-01
Dendrogeomorphic methods are frequently used in landslide analyses. Although methods of landslide dating based on tree rings are well developed, they still raise many questions. The aim of this study was to evaluate the frequently used theoretical scheme based on the event-response relationship. Seventy-four individuals of Norway spruce (Picea abies (L.) Karst.) exhibiting visible external disturbance were sampled on the Girová landslide (the largest historical flow-like landslide in the Czech Republic). This landslide reactivated in May 2010, and post-landslide tree growth responses were studied in detail. These growth responses were compared with the intensity and occurrence of visible external tree disturbance: tilted stems, damaged root systems, and decapitation. Twenty-nine trees (39.2%) died within one to four years following the 2010 landslide movement. The trees that died following the landslide movement were significantly younger and displayed significantly greater stem tilting than the live trees. Abrupt growth suppression was a more-frequent response among the dead trees, whereas growth release dominated among the live trees. Only two trees (2.7%) created no reaction wood in response to the landslide movement. Forty-four percent of the trees started to produce reaction wood structure after a delay, which generally spanned one year. Some eccentric growth was evident in the tree rings of the landslide year and was significant in the first years following the landslide movement. Missing rings were observed only on the upper sides of the stems, and no false tree rings were observed. The results confirm the general validity of the event-response relationship; nevertheless, this study points out the limitations and uncertainties of this generally accepted working scheme.
Drought-induced changes in Amazon forest structure from repeat airborne lidar
NASA Astrophysics Data System (ADS)
Morton, D. C.; Leitold, V.; Longo, M.; Keller, M.; dos-Santos, M. N.; Scaranello, M. A., Sr.
2017-12-01
Drought events in tropical forests, including the 2015-2016 El Niño, may reduce net primary productivity and increase canopy tree mortality, thereby altering the short and long-term net carbon balance of tropical forests. Given the broad extent of drought impacts, forest inventory plots or eddy flux towers may not capture regional variability in forest response to drought. Here, we analyzed repeat airborne lidar data to evaluate canopy turnover from branch and tree fall before (2013-2014) and during (2014-2016) the recent El Niño drought in the eastern and central Brazilian Amazon. Coincident field surveys for a 16-ha subset of the lidar coverage provided complementary information to classify turnover areas by mechanism (branch, multiple branch, tree fall, multiple tree fall) and estimate the total coarse woody debris volume from canopy and understory tree mortality. Annualized rates of canopy turnover increased by 50%, on average, during the drought period in both intact and fragmented forests near Santarém, Pará. Turnover increased uniformly across all size classes, and there was limited evidence that taller trees contributed a greater proportion of turnover events in any size class in 2014-2016 compared to 2013-2014. This short-term increase in canopy turnover differs from findings in multi-year rainfall exclusion experiments that large trees were more sensitive to drought impacts. Field measurements confirmed the separability of the smallest (single branch) and largest damage classes (multiple tree falls), but single tree and multiple branch fall events generated similar coarse woody debris production and lidar-derived changes in canopy volume. Large-scale sampling possible with repeat airborne lidar data also captured strong local and regional gradients in canopy turnover. Differences in slope partially explained the north-south gradient in canopy turnover dynamics near Santarém, with larger increases in turnover on flatter terrain. 
Regional variability in canopy turnover in response to drought conditions highlights the need for a mechanistic representation of branch and tree fall dynamics in ecosystem models to resolve changes in net carbon balance from the increase in coarse woody debris production and reorganization of canopy light environments during drought years.
NASA Astrophysics Data System (ADS)
Ragettli, S.; Zhou, J.; Wang, H.; Liu, C.
2017-12-01
Flash floods in small mountain catchments are one of the most frequent causes of loss of life and property from natural hazards in China. Hydrological models can be a useful tool for the anticipation of these events and the issuing of timely warnings. Since sub-daily streamflow information is unavailable for most small basins in China, one of the main challenges is finding appropriate parameter values for simulating flash floods in ungauged catchments. In this study, we use decision tree learning to explore parameter set transferability between different catchments. For this purpose, the physically-based, semi-distributed rainfall-runoff model PRMS-OMS is set up for 35 catchments in ten Chinese provinces. Hourly data from more than 800 storm runoff events are used to calibrate the model and evaluate the performance of parameter set transfers between catchments. For each catchment, 58 catchment attributes are extracted from several data sets available for the whole of China. We then use a data mining technique (decision tree learning) to identify catchment similarities that can be related to good transfer performance. Finally, we use the splitting rules of decision trees for finding suitable donor catchments for ungauged target catchments. We show that decision tree learning makes optimal use of the information content of available catchment descriptors and outperforms regionalization based on a conventional measure of physiographic-climatic similarity by 15%-20%. Similar performance can be achieved with a regionalization method based on spatial proximity, but decision trees offer flexible rules for selecting suitable donor catchments, not relying on the vicinity of gauged catchments. This flexibility makes the method particularly suitable for implementation in sparsely gauged environments. We evaluate the probability of detecting flood events exceeding a given return period, considering measured discharge and PRMS-OMS simulated flows with regionalized parameters.
Overall, the probability of detection of an event with a return period of 10 years is 62%. 44% of all 10-year flood peaks can be detected with a timing error of 2 hours or less. These results indicate that the modeling system can provide useful information about the timing and magnitude of flood events at ungauged sites.
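The donor-selection step described above can be sketched as applying decision-tree splitting rules to catchment attributes: the target catchment is dropped down the tree, and the gauged catchments landing in the same leaf become donors. The attribute names, thresholds, and catchments below are invented for illustration; the study's actual trees are learned from 58 descriptors.

```python
# Hypothetical sketch: decision-tree splitting rules select donor
# catchments for an ungauged target. Attributes/thresholds are invented.

def donor_group(attrs):
    """Walk illustrative splitting rules; return a leaf (donor-group) label."""
    if attrs["mean_annual_precip_mm"] < 800:
        return "semi-arid donors"
    if attrs["mean_slope_deg"] >= 15:
        return "steep humid donors"
    return "flat humid donors"

gauged = {
    "A": {"mean_annual_precip_mm": 650, "mean_slope_deg": 10},
    "B": {"mean_annual_precip_mm": 1200, "mean_slope_deg": 22},
    "C": {"mean_annual_precip_mm": 1100, "mean_slope_deg": 5},
}
target = {"mean_annual_precip_mm": 1150, "mean_slope_deg": 25}

# Donors: gauged catchments that fall in the same leaf as the target.
donors = [k for k, a in gauged.items() if donor_group(a) == donor_group(target)]
```

Unlike spatial-proximity regionalization, such rules can match a target to a physically similar donor anywhere in the gauged network.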
Chen, Yingyi; Zhen, Zhumi; Yu, Huihui; Xu, Jing
2017-01-14
In the Internet of Things (IoT), equipment used for aquaculture is often deployed in outdoor ponds located in remote areas. Faults occur frequently in these tough environments, and the staff generally lack professional knowledge and pay a low degree of attention in these areas. Once faults happen, expert personnel must carry out maintenance outdoors. Therefore, this study presents an intelligent method for fault diagnosis based on fault tree analysis and a fuzzy neural network. In the proposed method, first, the fault tree presents a logic structure of fault symptoms and faults. Second, rules extracted from the fault trees avoid duplication and redundancy. Third, the fuzzy neural network is applied to train the relationship mapping between fault symptoms and faults. In the aquaculture IoT, one fault can cause various fault symptoms, and one symptom can be caused by a variety of faults. Four fault relationships are obtained. Results show that one symptom-to-one fault, two symptoms-to-two faults, and two symptoms-to-one fault relationships can be rapidly diagnosed with high precision, while the one symptom-to-two faults pattern performs less well, but is still worth researching. This model implements diagnosis for most kinds of faults in the aquaculture IoT.
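The first two steps, a fault-tree structure and deduplicated symptom-to-fault rules, can be sketched as inverting a small symptom table; the fault and symptom names here are invented, and the fuzzy-neural-network training stage that would learn the final mapping is omitted.

```python
# Illustrative sketch (not the paper's model): flatten a small fault tree
# into deduplicated symptom -> fault rules. Names are hypothetical.

fault_tree = {
    "sensor_fault": ["no_reading", "constant_reading"],
    "power_fault": ["no_reading", "node_offline"],
    "network_fault": ["node_offline"],
}

# Invert to symptom -> candidate faults; sets avoid duplicate rules.
rules = {}
for fault, symptoms in fault_tree.items():
    for s in symptoms:
        rules.setdefault(s, set()).add(fault)
```

The resulting rule table shows the many-to-many structure the abstract describes: "no_reading" alone cannot separate a sensor fault from a power fault, which is exactly the ambiguous one-symptom-to-two-faults case.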
Drought frequency in central California since 101 B.C. recorded in giant sequoia tree rings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, M.K.; Brown, P.M.
1992-01-01
Well replicated tree-ring width index chronologies have been developed for giant sequoia at three sites in the Sierra Nevada, California. Extreme low-growth events in these chronologies correspond with regional drought events in the twentieth century in the San Joaquin drainage, in which the giant sequoia sites are located. This relationship is based upon comparison of tree-ring indices with August Palmer Drought Severity Indices for California Climate Division 5. Ring-width indices in the lowest decile from each site were compared. The frequency of low-growth events which occurred at all three sites in the same year is reconstructed from 101 B.C. to A.D. 1988. The inferred frequency of severe drought events changes through time, sometimes suddenly. The period from roughly 1850 to 1950 had one of the lowest frequencies of drought of any one hundred year period in the 2089 year record. The twentieth century so far has had a below-average frequency of extreme droughts. 26 refs., 6 figs., 1 tab.
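The coincidence criterion above (lowest-decile growth at all three sites in the same year) can be sketched with short invented index series; the real chronologies span more than two millennia.

```python
# Toy sketch of the low-growth coincidence rule. Index values are invented.
years = list(range(1980, 1990))
sites = {
    "site1": [0.9, 1.1, 0.4, 1.0, 1.2, 0.3, 1.1, 0.95, 1.05, 1.0],
    "site2": [1.0, 0.9, 0.5, 1.1, 1.0, 0.2, 1.2, 1.0, 0.9, 1.1],
    "site3": [1.1, 1.0, 0.45, 0.9, 1.0, 0.25, 1.0, 1.1, 1.0, 0.9],
}

def lowest_decile_years(series):
    """Years whose ring-width index falls in the series' lowest decile."""
    cutoff = sorted(series)[max(1, len(series) // 10) - 1]
    return {yr for yr, v in zip(years, series) if v <= cutoff}

# Candidate drought years: lowest-decile growth at every site, same year.
drought_years = set.intersection(*(lowest_decile_years(s) for s in sites.values()))
```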
Fault tree analysis for urban flooding.
ten Veldhuis, J A E; Clemens, F H L R; van Gelder, P H A J M
2009-01-01
Traditional methods to evaluate flood risk generally focus on heavy storm events as the principal cause of flooding. Conversely, fault tree analysis is a technique that aims at modelling all potential causes of flooding. It quantifies both overall flood probability and relative contributions of individual causes of flooding. This paper presents a fault tree model for urban flooding and an application to the case of Haarlem, a city of 147,000 inhabitants. Data from a complaint register, rainfall gauges and hydrodynamic model calculations are used to quantify probabilities of basic events in the fault tree. This results in a flood probability of 0.78/week for Haarlem. It is shown that gully pot blockages contribute to 79% of flood incidents, whereas storm events contribute only 5%. This implies that for this case more efficient gully pot cleaning is a more effective strategy to reduce flood probability than enlarging drainage system capacity. Whether this is also the most cost-effective strategy can only be decided after risk assessment has been complemented with a quantification of consequences of both types of events. To do this will be the next step in this study.
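The arithmetic behind combining basic-event probabilities into an overall flood probability can be sketched with an OR gate over independent events; the event names and weekly probabilities below are invented, not the Haarlem values.

```python
# Minimal fault-tree sketch: independent basic events under an OR gate.
# Weekly probabilities are hypothetical.
basic_events = {
    "gully_pot_blockage": 0.62,
    "sewer_overload_storm": 0.04,
    "pipe_blockage": 0.30,
}

def or_gate(probabilities):
    """P(at least one event occurs), assuming independence."""
    p_none = 1.0
    for p in probabilities:
        p_none *= 1.0 - p
    return 1.0 - p_none

p_flood = or_gate(basic_events.values())
```

An AND gate would instead multiply the probabilities directly; real fault trees nest both gate types.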
Using Boosting Decision Trees in Gravitational Wave Searches triggered by Gamma-ray Bursts
NASA Astrophysics Data System (ADS)
Zuraw, Sarah; LIGO Collaboration
2015-04-01
The search for gravitational wave bursts requires the ability to distinguish weak signals from background detector noise. Gravitational wave bursts are characterized by their transient nature, making them particularly difficult to detect as they are similar to non-Gaussian noise fluctuations in the detector. The Boosted Decision Tree method is a powerful machine learning algorithm which uses Multivariate Analysis techniques to explore high-dimensional data sets in order to distinguish between gravitational wave signal and background detector noise. It does so by training with known noise events and simulated gravitational wave events. The method is tested using waveform models and compared with the performance of the standard gravitational wave burst search pipeline for Gamma-ray Bursts. It is shown that the method is able to effectively distinguish between signal and background events under a variety of conditions and over multiple Gamma-ray Burst events. This example demonstrates the usefulness and robustness of the Boosted Decision Tree and Multivariate Analysis techniques as a detection method for gravitational wave bursts. LIGO, UMass, PREP, NEGAP.
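The core of boosting, reweighting misclassified training events so the next weak learner concentrates on them, can be sketched in one AdaBoost-style round on invented 1-D data; production searches use libraries and many rounds over high-dimensional trigger attributes.

```python
# Toy single round of AdaBoost-style boosting with a decision stump.
# "Events" are 1-D values labeled -1 (noise) or +1 (simulated signal);
# the data and the stump threshold are invented.
import math

x = [0.1, 0.2, 0.35, 0.6, 0.8, 0.9]
y = [-1, -1, -1, 1, 1, 1]
w = [1.0 / len(x)] * len(x)          # uniform starting weights

def stump(threshold):
    return lambda v: 1 if v > threshold else -1

h = stump(0.3)                        # first weak learner (misclassifies x=0.35)
err = sum(wi for xi, yi, wi in zip(x, y, w) if h(xi) != yi)
alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))  # learner weight

# Boost: up-weight misclassified events, down-weight correct ones.
w = [wi * math.exp(-alpha * yi * h(xi)) for xi, yi, wi in zip(x, y, w)]
total = sum(w)
w = [wi / total for wi in w]
```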
Vulnerability of Amazon forests to storm-driven tree mortality
NASA Astrophysics Data System (ADS)
Negrón-Juárez, Robinson I.; Holm, Jennifer A.; Magnabosco Marra, Daniel; Rifai, Sami W.; Riley, William J.; Chambers, Jeffrey Q.; Koven, Charles D.; Knox, Ryan G.; McGroddy, Megan E.; Di Vittorio, Alan V.; Urquiza-Muñoz, Jose; Tello-Espinoza, Rodil; Alegria Muñoz, Waldemar; Ribeiro, Gabriel H. P. M.; Higuchi, Niro
2018-05-01
Tree mortality is a key driver of forest community composition and carbon dynamics. Strong winds associated with severe convective storms are dominant natural drivers of tree mortality in the Amazon. Why forests vary with respect to their vulnerability to wind events and how the predicted increase in storm events might affect forest ecosystems within the Amazon are not well understood. We found that windthrows are common in the Amazon region extending from northwest (Peru, Colombia, Venezuela, and west Brazil) to central Brazil, with the highest occurrence of windthrows in the northwest Amazon. More frequent winds, produced by more frequent severe convective systems, in combination with well-known processes that limit the anchoring of trees in the soil, help to explain the higher vulnerability of the northwest Amazon forests to winds. Projected increases in the frequency and intensity of convective storms in the Amazon have the potential to increase wind-related tree mortality. A forest demographic model calibrated for the northwestern and the central Amazon showed that northwestern forests are more resilient to increased wind-related tree mortality than forests in the central Amazon. Our study emphasizes the importance of including wind-related tree mortality in model simulations for reliable predictions of the future of tropical forests and their effects on the Earth system.
Enabling Medical Device Interoperability for the Integrated Clinical Environment
2013-08-01
include the unique device identifier (UDI) as specified by the FDA, a logical timestamp as described above, and the data. Existing adverse event...failure or malfunction that led to an adverse effect during a medical procedure. User: clinical and legal experts, IT-experts, biomed experts...diagnosis, treatment, research, safety and quality improvements, equipment management, and adverse event detection and reporting. The Medical
How Much Water Trees Access and How It Determines Forest Response to Drought
NASA Astrophysics Data System (ADS)
Berdanier, A. B.; Clark, J. S.
2015-12-01
Forests are transformed by drought as water availability drops below levels where trees of different sizes and species can maintain productivity and survive. Physiological studies have provided detailed understanding of how species differences affect drought vulnerability but they offer almost no insights about the amount of water different trees can access beyond general statements about rooting depth. While canopy architecture provides strong evidence for light availability aboveground, belowground moisture availability remains essentially unknown. For example, do larger trees always have greater access to soil moisture? In temperate mixed forests, the ability to access a large soil moisture pool could minimize damage during drought events and facilitate post-drought recovery, potentially at the expense of neighboring trees. We show that the pool of accessible soil moisture can be estimated for trees with data on whole-plant transpiration and that this data can be used to predict water availability for forest stands. We estimate soil water availability with a Bayesian state-space model based on a simple water balance, where cumulative depressions in water use below potential transpiration indicate soil resource depletion. We compare trees of different sizes and species, extend these findings to the entire stand, and connect them to our recent research showing that tree survival after drought depends on post-drought growth recovery and local moisture availability. These results can be used to predict competitive abilities for soil water, understand ecohydrological variation within stands, and identify trees that are at risk of damage from future drought events.
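A toy version of the water-balance bookkeeping behind this idea (without the Bayesian state-space machinery) tracks the cumulative shortfall of actual below potential transpiration as a measure of soil-pool depletion; every quantity below is invented for illustration.

```python
# Hypothetical water-balance sketch: cumulative transpiration shortfall
# indexes depletion of a tree's accessible soil-moisture pool.
pool_mm = 50.0                        # assumed accessible soil-water pool
potential = [4.0, 4.0, 4.0, 4.0]      # potential transpiration (mm/day)
actual = [4.0, 3.0, 2.0, 1.0]         # observed whole-plant transpiration

depletion = 0.0
for pot, act in zip(potential, actual):
    depletion += pot - act            # shortfall accumulates during drought

remaining_fraction = max(0.0, 1.0 - depletion / pool_mm)
```

In the study's framing, the depletion trajectory is inferred statistically from sap-flux data rather than computed deterministically as here.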
NASA Astrophysics Data System (ADS)
Whitetree, A.; Van Stan, J. T., II; Wagner, S.; Guillemette, F.; Lewis, J.; Silva, L.; Stubbins, A.
2017-12-01
Studies on the fate and transport of dissolved organic matter (DOM) along the rainfall-to-discharge flow pathway typically begin in streams or soils, neglecting the initial enrichment of rainfall with DOM during contact with plant canopies. However, rain water can gather significant amounts of tree-derived DOM (tree-DOM) when it drains from the canopy, as throughfall, and down the stem, as stemflow. We examined the temporal variability of event-scale tree-DOM concentrations, yield, and optical (light absorbance and fluorescence) characteristics from an epiphyte-laden Quercus virginiana-Juniperus virginiana forest on Skidaway Island, Savannah, Georgia (USA). All tree-DOM fluxes were highly enriched compared to rainfall and epiphytes further increased concentrations. Stemflow DOC concentrations were greater than throughfall across study species, yet larger throughfall water yields produced greater DOC yields versus stemflow. Tree-DOM optical characteristics indicate it is aromatic-rich with FDOM dominated by humic-like fluorescence, containing 10-20% protein-like (tryptophan-like) fluorescence. Storm size was the only storm condition that strongly correlated with tree-DOM concentration and flux; however, throughfall and stemflow optical characteristics varied little across a wide range of storm conditions (from low magnitude events to intense tropical storms). Annual tree-DOM yields from the study forest (0.8-46 g-C m-2 yr-1) compared well to other yields along the rainfall-to-discharge flow pathway, exceeding DOM yields from some river watersheds.
Phylogenetic framework for coevolutionary studies: a compass for exploring jungles of tangled trees.
Martínez-Aquino, Andrés
2016-08-01
Phylogenetics is used to detect past evolutionary events, from how species originated to how their ecological interactions with other species arose, which can mirror cophylogenetic patterns. Cophylogenetic reconstructions uncover past ecological relationships between taxa through inferred coevolutionary events on trees, for example, codivergence, duplication, host-switching, and loss. These events can be detected by cophylogenetic analyses based on nodes and the length and branching pattern of the phylogenetic trees of symbiotic associations, for example, host-parasite. In the past 2 decades, algorithms have been developed for cophylogenetic analyses and implemented in different software, for example, statistical congruence index and event-based methods. Based on the combination of these approaches, it is possible to integrate temporal information into cophylogenetical inference, such as estimates of lineage divergence times between 2 taxa, for example, hosts and parasites. Additionally, the advances in phylogenetic biogeography applying methods based on parametric process models and combined Bayesian approaches, can be useful for interpreting coevolutionary histories in a scenario of biogeographical area connectivity through time. This article briefly reviews the basics of parasitology and provides an overview of software packages in cophylogenetic methods. Thus, the objective here is to present a phylogenetic framework for coevolutionary studies, with special emphasis on groups of parasitic organisms. Researchers wishing to undertake phylogeny-based coevolutionary studies can use this review as a "compass" when "walking" through jungles of tangled phylogenetic trees.
Comparing Phylogenetic Trees by Matching Nodes Using the Transfer Distance Between Partitions.
Bogdanowicz, Damian; Giaro, Krzysztof
2017-05-01
Ability to quantify dissimilarity of different phylogenetic trees describing the relationship between the same group of taxa is required in various types of phylogenetic studies. For example, such metrics are used to assess the quality of phylogeny construction methods, to define optimization criteria in supertree building algorithms, or to find horizontal gene transfer (HGT) events. Among the set of metrics described so far in the literature, the most commonly used seems to be the Robinson-Foulds distance. In this article, we define a new metric for rooted trees: the Matching Pair (MP) distance. The MP metric uses the concept of the minimum-weight perfect matching in a complete bipartite graph constructed from partitions of all pairs of leaves of the compared phylogenetic trees. We analyze the properties of the MP metric and present computational experiments showing its potential applicability in tasks related to finding the HGT events.
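At the heart of the MP distance is a minimum-weight perfect matching in a complete bipartite graph of partitions. The toy sketch below brute-forces only that matching step, on an invented cost matrix; computing the actual transfer distances between leaf-pair partitions is omitted, and practical implementations use the Hungarian algorithm rather than enumeration.

```python
# Toy minimum-weight perfect matching by brute force (small n only).
from itertools import permutations

cost = [  # cost[i][j]: invented transfer distance between partition i of
    [0, 2, 3],  # tree A and partition j of tree B
    [2, 0, 1],
    [3, 1, 0],
]

def min_weight_matching(cost):
    """Return (total weight, assignment) minimizing the matching weight."""
    n = len(cost)
    best = None
    for perm in permutations(range(n)):
        w = sum(cost[i][perm[i]] for i in range(n))
        if best is None or w < best[0]:
            best = (w, perm)
    return best

weight, matching = min_weight_matching(cost)
```

The minimum matching weight then serves as the tree-to-tree dissimilarity score: identical trees yield zero-cost diagonal matchings.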
Almeida, Fernando; Moreira, Diana
2017-01-01
Many clinical patients present to mental health clinics with depressive symptoms, anxiety, psychosomatic complaints, and sleeping problems. These symptoms may originate from marital problems, conflictual interpersonal relationships, problems in securing work, and housing issues, among many others. These issues might underlie the difficulties that patients face in maintaining faultless logical reasoning (FLR) and faultless logical functioning (FLF). FLR implies correctly assessing premises, rules, and conclusions; FLF implies assessing not only FLR but also the circumstances, life experience, personality, and events that validate a conclusion. Almost always, the symptomatology is accompanied by intense emotional changes. Clinical experience shows that a logic-based psychotherapy (LBP) approach is not practiced, and that therapists resort to psychopharmacotherapy or other types of psychotherapeutic approaches that are not focused on logical reasoning and, especially, logical functioning. Because of this, patients do not learn to overcome their reasoning and functioning errors. The aim of this work was to investigate how LBP works to improve patients' ability to think and function in a faultless logical way. For this purpose, we describe the treatment of three patients. With this psychotherapeutic approach, patients gain knowledge that can then be applied not only to the issues that led them to the consultation, but also to other problems they have experienced, thus creating a learning experience and helping to prevent such patients from becoming involved in similar problematic situations. This highlights that LBP is a way of treating symptoms that interfere on some level with daily functioning.
This psychotherapeutic approach is relevant for improving patients’ quality of life, and it fills a gap in the literature by describing original case analyses. PMID:29312088
NASA Astrophysics Data System (ADS)
Battipaglia, G.; Frank, D.; Buentgen, U.; Dobrovolný, P.; Brázdil, R.; Pfister, C.; Esper, J.
2009-09-01
In this project, three different summer temperature-sensitive tree-ring chronologies across the European Alpine region were compiled and analyzed to make a calendar of extreme warm and cold summers. We identified 100 extreme events during the past millennium from the tree ring data, and 44 extreme years during the 1550-2003 period based upon tree-ring, documentary and instrumental evidence. Comparisons with long instrumental series and documentary evidence verify the tree-ring extremes and indicate the possibility of using this dataset towards a better understanding of the characteristics prior to the instrumental period. Potential links between the occurrence of extreme events over the Alps and anomalous large-scale patterns were explored and indicate that the average pattern of the 20 warmest summers (over the 1700-2002 period) describes maximum positive anomalies over Central Europe, whereas the average pattern of the 20 coldest summers shows maximum negative anomalies over Western Europe. Challenges with the present approach included determining an appropriate classification scheme for extreme events and the development of a methodology able to identify and characterize the occurrence of extreme episodes back in time. As a future step, our approach will be extended to help verify the sparse documentary data from the beginning of the past millennium and will be used in conjunction with climate models to assess model capabilities in reproducing characteristics of temperature extremes.
NASA Astrophysics Data System (ADS)
Fang, Ouya; Alfaro, René I.; Zhang, Qi-Bin
2018-04-01
There is a growing research interest in studying forest mortality in relation to ongoing climate warming, but little is known about such events in the past. The study of past forest mortality provides valuable information for determining baselines that establish the normal parameters of functioning in forest ecosystems. Here we report a major episode of previously undocumented forest mortality in the late 18th century on the northern Tibetan Plateau, China. The event was not spatially uniform: mortality was more severe at drier sites. We used dendrochronology to compare radial growth trajectories of individual trees from 11 sites in the region, and found that many trees showed positive growth trend, or growth release, during 1796-1800 CE. Growth releases are a proxy indicator of stand thinning caused by tree mortality. The growth release was preceded by an almost two-decade long growth reduction. Long-term drought related to weakened North Atlantic Oscillation and frequent El Niño events are the likely factors causing the tree mortality in a large area of the plateau. Our findings suggest that, besides the effect of drought in the late 18th century, large-scale forest mortality may be an additional factor that further deteriorated the environment and increased the intensity of dust storms.
Updated precipitation reconstruction (AD 1482-2012) for Huashan, north-central China
NASA Astrophysics Data System (ADS)
Chen, Feng; Zhang, Ruibo; Wang, Huiqin; Qin, Li; Yuan, Yujiang
2016-02-01
We developed a tree-ring width chronology from a pine (Pinus tabulaeformis and Pinus armandii) stand near the peaks of Huashan, Shaanxi, north-central China. Growth-climate response analyses showed that the radial growth of pine trees is mainly influenced by April-June precipitation. A model to reconstruct precipitation based on tree-ring widths was constructed, accounting for 55 % of the instrumental variance during the period 1953-2012. Spatial correlation analyses between the reconstruction and observed gridded precipitation data show that the seasonal precipitation reconstruction captures regional climatic variations over north China. Compared with the historical archives and other tree-ring records in north China, many large-scale drought events, linked to the El Niño-Southern Oscillation (ENSO), were found. Many of these events have had profound impacts on the people of north China over the past several centuries. Composite maps of sea surface temperatures and 500 hPa geopotential heights for selected extremely dry and wet years in Huashan show characteristics similar to those related to the ENSO patterns, particularly with regard to ocean and atmospheric conditions in the equatorial and north Pacific. Our 531-year precipitation reconstruction for Huashan provides a long-term perspective on current and 20th century wet and dry events in north China, is useful for guiding expectations of future variability, and helps us to address climate change.
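The calibration step behind such reconstructions is an ordinary least-squares fit between the ring-width chronology and observed April-June precipitation over the instrumental period, after which the fitted line is applied to pre-instrumental ring widths. The short paired series below are invented, not the Huashan data.

```python
# Sketch of regression calibration for a tree-ring precipitation
# reconstruction. Both series are hypothetical.
ring_width = [0.8, 0.9, 1.0, 1.1, 1.2]           # chronology indices
precip_mm = [120.0, 135.0, 150.0, 165.0, 180.0]  # observed Apr-Jun totals

n = len(ring_width)
mx = sum(ring_width) / n
my = sum(precip_mm) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(ring_width, precip_mm))
         / sum((x - mx) ** 2 for x in ring_width))
intercept = my - slope * mx

def reconstruct(index):
    """Estimate precipitation for a pre-instrumental ring-width index."""
    return intercept + slope * index
```

In practice the calibration is verified on withheld years (split-sample statistics such as RE and CE) before the model is applied back in time.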
Intrathoracic airway wall detection using graph search and scanner PSF information
NASA Astrophysics Data System (ADS)
Reinhardt, Joseph M.; Park, Wonkyu; Hoffman, Eric A.; Sonka, Milan
1997-05-01
Measurements of the in vivo bronchial tree can be used to assess regional airway physiology. High-resolution CT (HRCT) provides detailed images of the lungs and has been used to evaluate bronchial airway geometry. Such measurements have been used to assess diseases affecting the airways, such as asthma and cystic fibrosis, to measure airway response to external stimuli, and to evaluate the mechanics of airway collapse in sleep apnea. To routinely use CT imaging in a clinical setting to evaluate the in vivo airway tree, there is a need for an objective, automatic technique for identifying the airway tree in the CT images and measuring airway geometry parameters. Manual or semi-automatic segmentation and measurement of the airway tree from a 3D data set may require several hours of work, and manual approaches suffer from inter-observer and intra-observer variability. This paper describes a method for automatic airway tree analysis that combines accurate airway wall location estimation with a technique for optimal airway border smoothing. A fuzzy logic, rule-based system is used to identify the branches of the 3D airway tree in thin-slice HRCT images. Raycasting is combined with a model-based parameter estimation technique to identify the approximate inner and outer airway wall borders in 2D cross-sections through the image data set. Finally, a 2D graph search is used to optimize the estimated airway wall locations and obtain accurate airway borders. We demonstrate this technique using CT images of a plexiglass tube phantom.
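Optimal border detection of this kind is commonly implemented as dynamic programming over a polar-resampled cost image: one radius is chosen per angle, with a smoothness constraint between neighbouring angles. A minimal sketch (the cost image and the one-pixel smoothness constraint are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def optimal_border(cost):
    """Minimum-cost border through a polar cost image cost[angle, radius].
    One radius is chosen per angle; between adjacent angles the radius may
    change by at most one pixel. Solved with dynamic programming."""
    n_ang, n_rad = cost.shape
    acc = cost[0].astype(float).copy()          # accumulated cost per radius
    back = np.zeros((n_ang, n_rad), dtype=int)  # backpointers for traceback
    for a in range(1, n_ang):
        new_acc = np.empty(n_rad)
        for r in range(n_rad):
            lo, hi = max(0, r - 1), min(n_rad, r + 2)
            prev = lo + int(np.argmin(acc[lo:hi]))
            back[a, r] = prev
            new_acc[r] = acc[prev] + cost[a, r]
        acc = new_acc
    path = np.empty(n_ang, dtype=int)
    path[-1] = int(np.argmin(acc))
    for a in range(n_ang - 1, 0, -1):
        path[a - 1] = back[a, path[a]]
    return path

# Synthetic cost image: a cheap ring at radius 5 (e.g., low edge cost at
# the true wall); the recovered border should sit on that ring.
cost = np.ones((8, 12))
cost[:, 5] = 0.0
path = optimal_border(cost)
```

In a real pipeline the cost would come from the raycast wall estimates and scanner PSF model, and the angular dimension would wrap around so the border closes on itself.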
A Seismic Source Model for Central Europe and Italy
NASA Astrophysics Data System (ADS)
Nyst, M.; Williams, C.; Onur, T.
2006-12-01
We present a seismic source model for Central Europe (Belgium, Germany, Switzerland, and Austria) and Italy, as part of an overall seismic risk and loss modeling project for this region. A separate presentation at this conference discusses the probabilistic seismic hazard and risk assessment (Williams et al., 2006). Where available we adopt regional consensus models and adjust them to fit our format; otherwise we develop our own model. Our seismic source model covers the whole region under consideration and consists of the following components: 1. A subduction zone environment in Calabria, SE Italy, with interface events between the Eurasian and African plates and intraslab events within the subducting slab. The subduction zone interface is parameterized as a set of dipping area sources that follow the geometry of the surface of the subducting plate, whereas intraslab events are modeled as plane sources at depth; 2. The main normal faults in the upper crust along the Apennines mountain range, in Calabria and Central Italy. Dipping faults and (sub-)vertical faults are parameterized as dipping plane and line sources, respectively; 3. The Upper and Lower Rhine Graben regime that runs from northern Italy into eastern Belgium, parameterized as a combination of dipping plane and line sources; and finally 4. Background seismicity, parameterized as area sources. The fault model is based on slip rates using characteristic recurrence. The modeling of background and subduction zone seismicity is based on a compilation of several national and regional historic seismic catalogs using a Gutenberg-Richter recurrence model. Merging the catalogs entailed removing duplicate, spurious, and very old events and applying a declustering algorithm (Reasenberg, 2000). 
The resulting catalog contains a little over 6000 events, has an average b-value of -0.9, is complete for moment magnitudes 4.5 and larger, and is used to compute a gridded a-value model (smoothed historical seismicity) for the region. The logic tree weights various completeness intervals and minimum magnitudes. Using a weighted scheme of European and global ground motion models together with a detailed site classification map for Europe based on Eurocode 8, we generate hazard maps for return periods of 200, 475, 1000, and 2500 years.
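Gutenberg-Richter recurrence parameters such as the b-value are typically estimated from the declustered catalogue; a standard choice is Aki's (1965) maximum-likelihood estimator. A sketch on a synthetic catalogue (the completeness magnitude of 4.5 matches the text; the catalogue itself is invented):

```python
import numpy as np

def aki_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965) for a
    catalogue complete above m_min: b = log10(e) / (mean(M) - m_min)."""
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= m_min]
    return np.log10(np.e) / (mags.mean() - m_min)

# Synthetic catalogue: ~6000 exponentially distributed magnitudes above
# the completeness threshold Mc = 4.5, generated with a true b of 0.9.
rng = np.random.default_rng(1)
beta = 0.9 * np.log(10.0)
mags = 4.5 + rng.exponential(1.0 / beta, 6000)
b_hat = aki_b_value(mags, 4.5)
```

Note the estimator is sensitive to the completeness magnitude, which is why the logic tree described above carries alternative completeness intervals as separate weighted branches.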
Assessing the stability of tree ranges and influence of disturbance in eastern US forests
C.W. Woodall; K. Zhu; J.A. Westfall; C.M. Oswalt; A.W. D'Amato; B.F. Walters; H.E. Lintz
2013-01-01
Shifts in tree species ranges may occur due to global climate change, which in turn may be exacerbated by natural disturbance events. Within the context of global climate change, developing techniques to monitor tree range dynamics as affected by natural disturbances may enable mitigation/adaptation of projected impacts. Using a forest inventory across the eastern U.S...
R.R. Pattison; C.M. D'Antonio; T.L. Dudley
2011-01-01
We monitored the impacts of a biological control agent, the saltcedar leaf beetle (Diorhabda carinulata), on the saltcedar tree (Tamarix spp.) at two sites (Humboldt and Walker rivers) in Nevada, USA. At the Humboldt site trees that had experienced three to four defoliation events had more negative water potentials and lower...
Vernal freeze damage and genetic variation alter tree growth, chemistry, and insect interactions.
Rubert-Nason, Kennedy F; Couture, John J; Gryzmala, Elizabeth A; Townsend, Philip A; Lindroth, Richard L
2017-11-01
Anticipated consequences of climate change in temperate regions include early spring warmup punctuated by intermittent hard freezes. Warm weather accelerates leaf flush in perennial woody species, potentially exposing vulnerable young tissues to damaging frosts. We employed a 2 × 6 randomized factorial design to examine how the interplay of vernal (springtime) freeze damage and genetic variation in a hardwood species (Populus tremuloides) influences tree growth, phytochemistry, and interactions with an insect herbivore (Chaitophorus stevensis). Acute effects of freezing included defoliation and mortality. Surviving trees exhibited reduced growth and altered biomass distribution. Reflushed leaves on these trees had lower mass per area, lower lignin concentrations, higher nitrogen concentrations, and altered chemical defence profiles, and supported faster aphid population growth. Many effects varied among plant genotypes and were correlated with herbivore performance. This study suggests that a single damaging vernal freeze event can alter tree-insect interactions through effects on plant growth and chemistry. Differential responses of various genotypes to freeze damage suggest that more frequent vernal freeze events could also influence natural selection, favouring trees with greater freeze hardiness and more resistance or tolerance to herbivores following damage. © 2017 John Wiley & Sons Ltd.
The role of hybridization in facilitating tree invasion
2017-01-01
Abstract Hybridization events can generate additional genetic diversity upon which natural selection can act and at times enhance invasiveness of the species. Invasive tree species are a growing ecological concern worldwide, and some of these invasions involve hybridization events pre- or post-introduction. There are 20 hybrid invasive tree taxa in 15 genera (11 plant families) discussed here. When reported, the abundance of hybrids comprised 10-100% of an invasion, the remainder being parental taxa. In seven hybrid taxa, researchers identified phenotypes that may make hybrids better invaders. Twelve hybrid tree taxa involved introgression, and more hybrids involved exclusively non-native parental taxa than native × non-native crosses. Three hybrid tree taxa were the result of intentional crosses, and all hybrid taxa involved intentional introduction of either one or more parental taxa or the hybrid itself. The knowledge gaps present for some hybrid tree taxa can weaken our effectiveness in predicting and controlling invasions, as hybrids can add a level of complexity to an invasion by being morphologically cryptic, causing genetic pollution of a native parental taxon, presenting novel genotypes for which there may not be coevolved biological control agents, or evolving adaptive traits through increased genetic variation. PMID:28028055
Analysis of the Space Propulsion System Problem Using RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diego Mandelli; Curtis Smith; Cristian Rabiti
This paper presents the solution of the space propulsion problem using a PRA code currently under development at Idaho National Laboratory (INL). RAVEN (Reactor Analysis and Virtual control ENvironment) is a multi-purpose Probabilistic Risk Assessment (PRA) software framework that supports several analysis capabilities. It is designed to derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures) and to perform both Monte Carlo sampling of randomly distributed events and Event Tree based analysis. To facilitate input/output handling, a Graphical User Interface (GUI) and a post-processing data-mining module are available. RAVEN can also interface with several numerical codes, such as RELAP5 and RELAP-7, and with ad-hoc system simulators. For the space propulsion system problem, an ad-hoc simulator was developed in the Python language and interfaced to RAVEN. The simulator fully models both deterministic behavior (e.g., system dynamics and interactions between system components) and stochastic behavior (i.e., failures of components/systems such as distribution lines and thrusters). Stochastic analysis is performed using random-sampling methodologies (i.e., Monte Carlo). This analysis determines the reliability of the space propulsion system and propagates the uncertainties associated with a specific set of parameters. As indicated in the scope of the benchmark problem, the results of the stochastic analysis are used to generate risk-informed insights, such as conditions under which different strategies can be followed.
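The Monte Carlo reliability estimation described above can be sketched as follows. The success logic, failure probabilities, and component counts below are invented for illustration; they are not the benchmark problem's actual model.

```python
import random

def system_works(line_a, line_b, thrusters_ok):
    """Hypothetical success criterion: at least one distribution line and
    at least 2 of 3 thrusters must survive (illustrative only)."""
    return (line_a or line_b) and sum(thrusters_ok) >= 2

def monte_carlo_reliability(n_trials, p_line_fail=0.05, p_thr_fail=0.1, seed=42):
    """Estimate system reliability by sampling component states per trial."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_trials):
        line_a = rng.random() > p_line_fail
        line_b = rng.random() > p_line_fail
        thrusters = [rng.random() > p_thr_fail for _ in range(3)]
        successes += system_works(line_a, line_b, thrusters)
    return successes / n_trials

reliability = monte_carlo_reliability(100_000)
```

Uncertainty propagation would wrap this in an outer loop that also samples the failure probabilities themselves from their distributions, which is the pattern the RAVEN framework automates.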
Billings, S.A.; Boone, A.S.; Stephen, F.M.
2016-01-01
Understanding how tree growth strategies may influence tree susceptibility to disturbance is an important goal, especially given projected increases in diverse ecological disturbances this century. We use growth responses of tree rings to climate, relationships between tree-ring stable isotopic signatures of carbon (δ13C) and oxygen (δ18O), wood nitrogen concentration [N], and contemporary leaf [N] and δ13C values to assess potential historic drivers of tree photosynthesis in dying and apparently healthy co-occurring northern red oak (Quercus rubra L. (Fagaceae)) during a region-wide oak decline event in Arkansas, USA. Bole growth of both healthy and dying trees responded negatively to drought severity (Palmer Drought Severity Index) and temperature; healthy trees exhibited a positive, but small, response to growing season precipitation. Contrary to expectations, tree-ring δ13C did not increase with drought severity. A significantly positive relationship between tree-ring δ13C and δ18O was evident in dying trees (P < 0.05) but not in healthy trees. Healthy trees’ wood exhibited lower [N] than that of dying trees throughout most of their lives (P < 0.05), and we observed a significant, positive relationship (P < 0.05) in healthy trees between contemporary leaf δ13C and leaf N (by mass), but not in dying trees. Our work provides evidence that for plants in which strong relationships between δ13C and δ18O are not evident, δ13C may be governed by plant N status. The data further imply that historic photosynthesis in healthy trees was linked to N status and, perhaps, C sink strength to a greater extent than in dying trees, in which tree-ring stable isotopes suggest that historic photosynthesis was governed primarily by stomatal regulation. This, in turn, suggests that assessing the relative dominance of photosynthetic capacity vs stomatal regulation as drivers of trees’ C accrual may be a feasible means of predicting tree responses to some disturbance events. 
Our work demonstrates that a dual isotope, tree-ring approach can be integrated with tree N status to begin to unravel a fundamental question in forest ecology: why do some trees die during a disturbance, while other conspecifics with apparently similar access to resources remain healthy? PMID:26960389
Computer network defense through radial wave functions
NASA Astrophysics Data System (ADS)
Malloy, Ian J.
The purpose of this research is to synthesize basic and fundamental findings in quantum computing, as applied to the attack and defense of conventional computer networks. The concept focuses on the use of radio waves as a shield for, and an attack against, traditional computers. A logic bomb is analogous to a landmine in a computer network, and if one were to implement it as a non-trivial mitigation, it would aid computer network defense. As has been seen in kinetic warfare, the use of landmines has been devastating to geopolitical regions in that they are severely difficult for a civilian to avoid triggering, given the unknown position of a landmine. Thus, understanding a logic bomb is relevant and has corollaries to quantum mechanics as well. The research synthesizes quantum logic phase shifts, in certain respects, using the Dynamic Data Exchange protocol in software written for this work, as well as a C-NOT gate applied to a virtual quantum circuit environment implementing a Quantum Fourier Transform. The research applies the principles of coherence and entanglement from quantum physics, the concept of expert systems in artificial intelligence, principles of prime-number-based cryptography with trapdoor functions, and modeling of radio wave propagation against an event with unknown parameters. The result is a program relying on the artificial intelligence concept of an expert system, in conjunction with trigger events, for a trapdoor function relying on infinite recursion, as well as system mechanics for elliptic curve cryptography along orbital angular momenta. Here, trapdoor denotes both the form of cipher and the implied relationship to logic bombs.
Markov logic network based complex event detection under uncertainty
NASA Astrophysics Data System (ADS)
Lu, Jingyang; Jia, Bin; Chen, Genshe; Chen, Hua-mei; Sullivan, Nichole; Pham, Khanh; Blasch, Erik
2018-05-01
In a cognitive reasoning system, the four-stage Observe-Orient-Decide-Act (OODA) reasoning loop is of interest. The OODA loop is essential for situational awareness, especially in heterogeneous data fusion. Cognitive reasoning for decision making can take advantage of different formats of information, such as symbolic observations, various real-world sensor readings, or the relationships between intelligent modalities. A Markov Logic Network (MLN) provides a mathematically sound technique for representing and fusing data at multiple levels of abstraction, and across multiple intelligent sensors, to conduct complex decision-making tasks. In this paper, a vehicle interaction scenario is investigated in which uncertainty is taken into consideration, as no systematic approach can perfectly characterize the complex event scenario. MLNs are applied to the terrestrial domain, where the dynamic features of and relationships among vehicles are captured through multiple sensors and information sources, accounting for data uncertainty.
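The core MLN idea, worlds weighted by the exponentiated sum of weights of satisfied ground formulas, can be shown on a toy two-atom example. The atoms, rule, and weight below are invented for illustration and are not from the paper's vehicle scenario:

```python
import math

# Toy grounded Markov Logic Network with two Boolean atoms and one
# weighted rule: close(A,B) => collide(A,B), weight 1.5.
# P(world) is proportional to exp(weight * [rule satisfied in world]).
def world_score(close, collide, w_rule=1.5):
    satisfied = (not close) or collide  # material implication
    return math.exp(w_rule * satisfied)

# Query P(collide | close=True) by enumerating the two worlds
# consistent with the evidence close=True.
num = world_score(True, True)
den = world_score(True, True) + world_score(True, False)
p_collide_given_close = num / den
```

The soft weight means the rule can be violated at a cost, so the conditional probability is high but below 1; a hard constraint would correspond to an infinite weight. Real MLN engines avoid full world enumeration with sampling or lifted inference.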
NASA Astrophysics Data System (ADS)
Yang, Jian; Sun, Shuaishuai; Tian, Tongfei; Li, Weihua; Du, Haiping; Alici, Gursel; Nakano, Masami
2016-03-01
Protecting civil engineering structures from uncontrollable events such as earthquakes while maintaining their structural integrity and serviceability is very important; this paper describes the performance of a stiffness softening magnetorheological elastomer (MRE) isolator in a scaled three storey building. In order to construct a closed-loop system, a scaled three storey building was designed and built according to the scaling laws, and then four MRE isolator prototypes were fabricated and utilised to isolate the building from the motion induced by a scaled El Centro earthquake. Fuzzy logic was used to output the current signals to the isolators, based on the real-time responses of the building floors, and then a simulation was used to evaluate the feasibility of this closed loop control system before carrying out an experimental test. The simulation and experimental results showed that the stiffness softening MRE isolator controlled by fuzzy logic could suppress structural vibration well.
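A minimal sketch of the kind of fuzzy controller described above: membership functions map a measured floor response to rule activations, which are defuzzified into an isolator current command. All breakpoints, rule outputs, and units are assumptions for illustration, not the paper's tuned controller.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def shoulder(x, a, b):
    """Right-shoulder membership: 0 below a, ramping to 1 at b and beyond."""
    if x <= a:
        return 0.0
    return 1.0 if x >= b else (x - a) / (b - a)

def fuzzy_current(accel):
    """Map |floor acceleration| (m/s^2) to isolator current (A) via three
    rules (small -> 0 A, medium -> 1.5 A, large -> 3 A), defuzzified as a
    weighted average of singleton outputs."""
    weights = [tri(accel, -1.0, 0.0, 1.0),   # small response
               tri(accel, 0.5, 1.5, 2.5),    # medium response
               shoulder(accel, 2.0, 3.0)]    # large response
    outputs = [0.0, 1.5, 3.0]
    total = sum(weights)
    return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0
```

For a stiffness-softening isolator the mapping from response to current would be chosen so that larger responses soften the isolation layer appropriately; the monotone mapping here is only a placeholder.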
NASA Astrophysics Data System (ADS)
Šilhán, Karel
2016-01-01
Knowledge of past landslide activity is crucial for understanding landslide behaviour and for modelling potential future landslide occurrence. Dendrogeomorphic approaches represent the most precise methods of landslide dating (trees form annual rings over timescales of up to several hundred years). Despite the advantages of these methods, many open questions remain. One of the less researched uncertainties, and the focus of this study, is the impact of two common methods of geomorphic signal extraction on the spatial and temporal results of landslide reconstruction. In total, 93 Norway spruce (Picea abies (L.) Karst.) trees were sampled at one landslide location dominated by block-type movements in the forefield of the Orlické hory Mts., Bohemian Massif. Landslide signals were examined by the classical subjective method based on reaction (compression) wood analysis and by a numerical method based on eccentric growth analysis. The chronology of landslide movements obtained by the mathematical method contained twice as many detected events as that obtained by the subjective method. This finding indicates that eccentric growth is a more accurate indicator of landslide movement than the classical analysis of reaction wood. The reconstructed spatial activity of landslide movements shows a similar distribution of recurrence intervals (Ri) for both methods. The differences (at most 30% of the total Ri range) between the results of the two methods may be caused by differences in the ability of trees to react to tilting of their stems with a specific growth response (reaction wood formation or eccentric growth). Finally, the ability of trees to record tilting events (by both growth responses) in their tree-ring series was analysed for different decades of tree life. The highest sensitivity to external tilting events occurred at tree ages from 70 to 80 years for reaction wood formation and from 80 to 90 years for eccentric growth response. 
This means that the ability of P. abies to record geomorphic signals varies not only with the type of growth response but also with tree age.
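The numerical eccentric-growth method can be illustrated with a simple eccentricity index and an abrupt-change detector. The exact index and threshold vary between published studies; the forms below are one common variant, chosen for illustration.

```python
def eccentricity_index(upslope, downslope):
    """Percentage eccentricity of one ring from its upslope- and
    downslope-side widths (positive = wider upslope growth). One common
    variant; studies differ in the exact normalization."""
    if upslope >= downslope:
        return 100.0 * (upslope - downslope) / downslope
    return -100.0 * (downslope - upslope) / upslope

def detect_tilt_events(up_series, down_series, threshold=50.0):
    """Flag ring indices where the year-to-year change in eccentricity
    exceeds a threshold -- a simple proxy for a stem-tilting event."""
    idx = [eccentricity_index(u, d) for u, d in zip(up_series, down_series)]
    return [i for i in range(1, len(idx)) if abs(idx[i] - idx[i - 1]) > threshold]

# Ring widths (mm) on the two sides of a stem; growth becomes strongly
# eccentric at ring index 2, mimicking a tilting event.
events = detect_tilt_events([1.0, 1.0, 2.0, 2.0], [1.0, 1.0, 1.0, 1.0])
```

Because the index is a ratio of measured widths, a single event chronology is usually built by requiring the change to persist over several rings rather than firing on one noisy year, a refinement omitted here.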
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general-purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Drug safety data mining with a tree-based scan statistic.
Kulldorff, Martin; Dashevsky, Inna; Avery, Taliser R; Chan, Arnold K; Davis, Robert L; Graham, David; Platt, Richard; Andrade, Susan E; Boudreau, Denise; Gunter, Margaret J; Herrinton, Lisa J; Pawloski, Pamala A; Raebel, Marsha A; Roblin, Douglas; Brown, Jeffrey S
2013-05-01
In post-marketing drug safety surveillance, data mining can potentially detect rare but serious adverse events. Assessing an entire collection of drug-event pairs is traditionally performed on a predefined level of granularity. It is unknown a priori whether a drug causes a very specific or a set of related adverse events, such as mitral valve disorders, all valve disorders, or different types of heart disease. This methodological paper evaluates the tree-based scan statistic data mining method to enhance drug safety surveillance. We use a three-million-member electronic health records database from the HMO Research Network. Using the tree-based scan statistic, we assess the safety of selected antifungal and diabetes drugs, simultaneously evaluating overlapping diagnosis groups at different granularity levels, adjusting for multiple testing. Expected and observed adverse event counts were adjusted for age, sex, and health plan, producing a log likelihood ratio test statistic. Out of 732 evaluated disease groupings, 24 were statistically significant, divided among 10 non-overlapping disease categories. Five of the 10 signals are known adverse effects, four are likely due to confounding by indication, while one may warrant further investigation. The tree-based scan statistic can be successfully applied as a data mining tool in drug safety surveillance using observational data. The total number of statistical signals was modest and does not imply a causal relationship. Rather, data mining results should be used to generate candidate drug-event pairs for rigorous epidemiological studies to evaluate the individual and comparative safety profiles of drugs. Copyright © 2013 John Wiley & Sons, Ltd.
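The per-node log likelihood ratio at the heart of the tree-based scan statistic can be sketched as below (the conditional version, with expected counts scaled to sum to the observed total). The diagnosis-tree counts are invented; real use adds the permutation-based adjustment for multiple testing across overlapping nodes.

```python
import math

def node_llr(obs, exp, total):
    """Log likelihood ratio for one node of the diagnosis tree, conditional
    on `total` events overall; only excesses (obs > exp) score."""
    if obs <= exp:
        return 0.0
    llr = obs * math.log(obs / exp)
    if total > obs:
        llr += (total - obs) * math.log((total - obs) / (total - exp))
    return llr

# Each node's count includes its descendants, so overlapping groupings at
# several granularities are evaluated simultaneously (counts are invented).
counts = {
    "heart": (30, 18.0),
    "heart/valve": (20, 6.0),
    "heart/valve/mitral": (12, 2.5),
}
total = 500
best = max(counts, key=lambda name: node_llr(*counts[name], total))
```

Note how the intermediate grouping can score highest: the specific leaf has a larger relative excess but fewer events, which is exactly the granularity trade-off the tree scan resolves automatically.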
Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)
NASA Astrophysics Data System (ADS)
Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián
2015-04-01
The Chiapas State, in southeastern Mexico, is a very active seismic region due to the interaction of three tectonic plates: North American, Cocos, and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the Chiapas state. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure to consider different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a revision of the tectonics of the region and the available geomorphological and geological maps. The SSZs were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were reviewed, and for each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain, and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and in the Strike-Slip Faults Province. 
The hazard decreases towards the northeast across the Reverse Faults Province and up to the Yucatan Platform, where the lowest values are reached. We also produced uniform hazard spectra (UHS) for the three main cities of Chiapas. Tapachula presents the highest spectral accelerations, while Tuxtla Gutierrez and San Cristobal de las Casas show similar values. We conclude that seismic hazard in Chiapas is chiefly controlled by the subduction of the Cocos plate beneath the North American and Caribbean plates, which makes the coastal areas the most hazardous. Additionally, the Motagua and Polochic Fault Zones are also important, increasing the hazard particularly in southeastern Chiapas.
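The final aggregation step, averaging hazard over logic-tree branches, reduces to a weighted mean per return period. A sketch with invented weights and PGA values (two source-zone models times two GMPE combinations, echoing the structure described, not the study's numbers):

```python
def logic_tree_mean(branches):
    """Weighted-mean hazard over logic-tree branches.
    branches: list of (weight, {return_period: pga_in_g}); weights sum to 1."""
    periods = branches[0][1].keys()
    return {rp: sum(w * curve[rp] for w, curve in branches) for rp in periods}

# Illustrative branches: SSZ model A/B crossed with GMPE combinations.
branches = [
    (0.3, {475: 0.24, 2475: 0.41}),
    (0.2, {475: 0.28, 2475: 0.47}),
    (0.3, {475: 0.21, 2475: 0.38}),
    (0.2, {475: 0.26, 2475: 0.44}),
]
mean_hazard = logic_tree_mean(branches)
```

Beyond the mean map, the spread across branches is itself informative: fractile (e.g. 16th/84th percentile) hazard maps quantify the epistemic uncertainty the logic tree was built to capture.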
Eller, Cleiton B; Burgess, Stephen S O; Oliveira, Rafael S
2015-04-01
Trees from tropical montane cloud forest (TMCF) display very dynamic patterns of water use. They are capable of downwards water transport towards the soil during leaf-wetting events, likely a consequence of foliar water uptake (FWU), as well as high rates of night-time transpiration (Enight) during drier nights. These two processes might represent important sources of water losses and gains to the plant, but little is known about the environmental factors controlling these water fluxes. We evaluated how contrasting atmospheric and soil water conditions control diurnal, nocturnal and seasonal dynamics of sap flow in Drimys brasiliensis (Miers), a common Neotropical cloud forest species. We monitored the seasonal variation of soil water content, micrometeorological conditions and sap flow of D. brasiliensis trees in the field during wet and dry seasons. We also conducted a greenhouse experiment exposing D. brasiliensis saplings under contrasting soil water conditions to deuterium-labelled fog water. We found that during the night D. brasiliensis possesses heightened stomatal sensitivity to soil drought and vapour pressure deficit, which reduces night-time water loss. Leaf-wetting events had a strong suppressive effect on tree transpiration (E). Foliar water uptake increased in magnitude with drier soil and during longer leaf-wetting events. The difference between diurnal and nocturnal stomatal behaviour in D. brasiliensis could be attributed to an optimization of carbon gain when leaves are dry, as well as minimization of nocturnal water loss. The leaf-wetting events on the other hand seem important to D. brasiliensis water balance, especially during soil droughts, both by suppressing tree transpiration (E) and as a small additional water supply through FWU. Our results suggest that decreases in leaf-wetting events in TMCF might increase D. 
brasiliensis water loss and decrease its water gains, which could compromise its ecophysiological performance and survival during dry periods. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Use of Fuzzy Logic Systems for Assessment of Primary Faults
NASA Astrophysics Data System (ADS)
Petrović, Ivica; Jozsa, Lajos; Baus, Zoran
2015-09-01
In electric power systems, grid elements are often subjected to very complex and demanding disturbances or dangerous operating conditions. Determining the initial fault or the cause of these states is a difficult task. When a fault occurs, it is often imperative to disconnect the affected grid element from the grid. This paper contains an overview of possibilities for using fuzzy logic in the assessment of primary faults in the transmission grid. The tool for this task is the SCADA system, which draws on current and voltage measurements, protection-device events, and circuit-breaker statuses in the grid. A functional model, described by membership functions and fuzzy logic systems, is presented in the paper. As input data, the diagnostic system uses protection-device trip information, circuit-breaker states, and measurements of currents and voltages before and after faults.
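A minimal sketch of the fuzzy assessment idea: membership functions turn raw SCADA measurements into degrees of evidence, and a rule combines them (AND taken as min). All shapes, breakpoints, and the breaker weighting are assumptions for illustration, not the paper's model.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fault_plausibility(current_pu, voltage_pu, breaker_tripped):
    """Fuzzy plausibility that a primary fault occurred on a grid element,
    from per-unit current and voltage plus breaker status. The rule is
    AND(overcurrent, undervoltage, breaker evidence), composed with min."""
    overcurrent = tri(current_pu, 1.2, 3.0, 20.0)   # fault currents well above 1 pu
    undervoltage = tri(voltage_pu, -0.5, 0.2, 0.9)  # voltage collapse near the fault
    breaker = 1.0 if breaker_tripped else 0.2       # trip strongly supports the rule
    return min(overcurrent, undervoltage, breaker)
```

Ranking this plausibility across all monitored elements then points the operator at the most likely primary fault, even when several protection devices tripped in cascade.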
Fire safety in transit systems fault tree analysis
DOT National Transportation Integrated Search
1981-09-01
Fire safety countermeasures applicable to transit vehicles are identified and evaluated. This document contains fault trees that illustrate the sequences of events that may lead to a transit-fire-related casualty. A description of the basis for the...
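The fault-tree logic behind such an analysis can be sketched as a Boolean top-event function plus brute-force enumeration of minimal cut sets. The gate structure below is invented for illustration; practical tools use Boolean reduction rather than enumeration.

```python
from itertools import combinations

# Toy fault tree for a transit-fire casualty (structure is illustrative):
# TOP = AND(ignition, OR(flammable_interior, blocked_egress))
BASIC = ["ignition", "flammable_interior", "blocked_egress"]

def top_event(failed):
    """True when the top event occurs for a given set of failed basic events."""
    return "ignition" in failed and (
        "flammable_interior" in failed or "blocked_egress" in failed)

def minimal_cut_sets():
    """Enumerate minimal sets of basic events that trigger the top event.
    Smaller combinations are tried first, so any superset of an already
    found cut set is skipped."""
    cuts = []
    for r in range(1, len(BASIC) + 1):
        for combo in combinations(BASIC, r):
            s = set(combo)
            if top_event(s) and not any(c <= s for c in cuts):
                cuts.append(s)
    return cuts

cut_sets = minimal_cut_sets()
```

Each minimal cut set names a smallest combination of failures sufficient for a casualty, which is exactly where countermeasures are targeted.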
Beyond the extreme: Recovery dynamics following heat and drought stress in trees
NASA Astrophysics Data System (ADS)
Ruehr, N.; Duarte, A. G.; Arneth, A.
2016-12-01
Plant recovery processes following extreme events can have profound impacts on forest carbon and water cycling. However, large knowledge gaps persist on recovery dynamics of tree physiological processes following heat and drought stress. To date, few experimental studies exist that include recovery responses in stress research. We synthesized recent research on tree recovery processes related to carbon and water exchange following heat and drought stress, and show that the intensity of stress can affect the pace of recovery with large variations among tree species and processes. Following stress release, leaf water potential recovers instantaneously upon rewatering as found in most studies. Transpiration (T), stomatal conductance (gs) and photosynthesis (A) often lag behind, with lowest recovery following severe stress. Interestingly, the patterns in heat and drought stress recovery apparently differ. While A recovers generally more quickly than gs following drought, which increases water-use-efficiency, both gs and A tend to remain reduced following heat events. The pace of recovery following heat events likely depends on water availability during stress and temperature maxima reached (photosynthetic impairment at temperatures > 40°C). Slow recovery during the initial post-stress days might result from hydraulic limitation and elevated levels of abscisic acid. The mechanisms resulting in a continued impairment of T and gs during a later stage of the recovery period (from weeks up to months) are still elusive. Feedback loops from the photosynthetic machinery, reduced mesophyll conductance or leaf morphological changes may play an important role. In summary, post-stress recovery can substantially affect tree carbon and water cycling. Thus, in order to estimate the impacts of extreme climate events on forest ecosystems in the long-term, we need a better understanding of recovery dynamics and their limitations in terms of stress timing, intensity and duration.
A Multi-stakeholder Approach to Moving Beyond Tree Mortality in the Sierra Nevada
NASA Astrophysics Data System (ADS)
Balachowski, J.; Buluc, L.; Fischer, C.; Ko, J.; Ostoja, S.
2017-12-01
The US Forest Service has estimated that 102 million trees have died in California since 2010. This die-off has been attributed to the combined effects of historical land management practices, fire suppression, insect outbreaks, and climate-related stressors. This tree mortality event represents the largest and most significant ecological disturbance in California in centuries, if not longer. Both scientists and managers recognize the need to rethink our approach to forest management in the face of a changing climate and increasingly frequent, uncharacteristically large wildfires, while budgets and staffing capacity continue to decrease. Addressing the uncertainty of managing under climate change with fewer financial resources will require multiple partners and stakeholders, including federal and state agencies, local governments, and non-governmental organizations, to work together to identify common goals and paths forward. The USDA California Climate Hub and USFS Region 5 convened a symposium on drought and tree mortality in July 2017. With nearly 170 attendees across a wide range of sectors, the event provided a meaningful opportunity for reflection, analysis, and consideration of next steps. Among the outcomes of this symposium was the identification of areas in which our capacity for individual and synergistic action is stronger, and those in which it is lacking and will thus require additional attention and effort. From this symposium, which included a series of smaller stakeholder and partner working groups, we collectively identified research and information needs, possible policy adjustments, future management actions, and funding needs and opportunities. Here, we present these findings and suggest approaches for addressing the current tree mortality event based on the shared interests of multiple, diverse stakeholder groups.
NASA Astrophysics Data System (ADS)
Magnon, Anne
2005-04-01
A non-geometric cosmology is presented, based on logic of observability, where logical categories of our perception set frontiers to comprehensibility. The Big-Bang singularity finds here a substitute (comparable to a "quantum jump"): a logical process (tied to self-referent and divisible totality) by which information emerges, focalizes on events and recycles, providing a transition from incoherence to causal coherence. This jump manufactures causal order and space-time localization, as exact solutions to Einstein's equation, where the last step of the process disentangles complex Riemann spheres into real null-cones (a geometric overturning imposed by self-reference, reminding us of our ability to project the cosmos within our mental sphere). Concepts such as antimatter and dark energy (dual entities tied to bifurcations or broken symmetries, and their compensation) are presented as hidden in the virtual potentialities, while irreversible time appears with the recycling of information and related flow. Logical bifurcations (such as the "part-totality" category, a quantum of information which owes its recycling to non-localizable logical separations, as anticipated by the instability or horizon dependence of the quantum vacuum) induce broken symmetries at the (complex or real) geometric level [e.g. the anti-self-dual complex nonlinear graviton solutions, which break duality symmetry, provide a model for (hidden) antimatter, itself compensated with dark energy, and providing, with space-time localization, the radiative gravitational energy (Bondi flux and related bifurcations of the peeling-off type), as well as the mass of isolated bodies]. These bifurcations are compensated by inertial effects (non-geometric precursors of the Coriolis forces) able to explain (on logical grounds) the cosmic expansion (a repulsion?) and the critical equilibrium of the cosmic tissue.
Space-time environment, itself, emerges through the jump, as a censor to totality, a screen to incoherence (as anticipated by black-hole event horizons, cosmic censors able to shelter causal geometry). In analogy with black-hole singularities, the Big-Bang can be viewed as a geometric hint that a transition from incoherence to (causal space-time) localization and related coherence (comprehensibility), is taking place (space-time demolition, a reverse process towards incoherence or information recycling, is expected in the vicinity of singularities, as hinted by black-holes and related "time-machines"). A theory of the emergence of perception (and life?), in connection with observability and the function of partition (able to screen totality), is on its way [interface incoherence-coherence, sleeping and awaking states of localization, horizons of perception etc, are anticipated by black-hole event horizons, beyond which a non causal, dimensionless incoherent regime or memorization process, presents itself with the loss of localization, suggesting a unifying regime (ultimate energies?) hidden in cosmic potentialities]. The decoherence process presented here, suggests an ultimate interaction, expression of the logical relation of subsystems to totality, and to be identified to the flow of information or its recycling through cosmic jump (this is anticipated by the dissipation of distance or hierarchies on null-cones, themselves recycled with information and events). The geometric projection of this unified irreversible dynamics is expressed by unified Yang-Mills field equations (coupled to Einsteinian gravity). 
An ultimate form of action ("set"-volumes of information) presents itself, whose extrema can be achieved through extremal transfer of information and related partition of cells of information (thus anticipating the mitosis of living cells, possibly triggered at the non localizable level, as imposed by the logical regime of cosmic decoherence: participating subsystems ?). The matching of the objective and subjective facets of (information and) decoherences is perceived as contact with a reality.
Graphical fault tree analysis for fatal falls in the construction industry.
Chi, Chia-Fen; Lin, Syuan-Zih; Dewi, Ratna Sari
2014-11-01
The current study applied a fault tree analysis to represent the causal relationships among events and causes that contributed to fatal falls in the construction industry. Four hundred and eleven work-related fatalities in the Taiwanese construction industry were analyzed in terms of age, gender, experience, falling site, falling height, company size, and the causes for each fatality. Given that most fatal accidents involve multiple events, the current study coded up to a maximum of three causes for each fall fatality. After the Boolean algebra and minimal cut set analyses, accident causes associated with each falling site can be presented as a fault tree to provide an overview of the basic causes, which could trigger fall fatalities in the construction industry. Graphical icons were designed for each falling site along with the associated accident causes to illustrate the fault tree in a graphical manner. A graphical fault tree can improve inter-disciplinary discussion of risk management and the communication of accident causation to first line supervisors. Copyright © 2014 Elsevier Ltd. All rights reserved.
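The Boolean-algebra step mentioned above reduces the coded cause combinations to minimal cut sets by the absorption law (A + AB = A). A minimal Python sketch, using hypothetical cause codes rather than the study's actual coding scheme:

```python
# Sketch: minimal cut sets by Boolean absorption. Each fatality is coded
# with up to three contributing causes; each coded combination is a
# candidate cut set for the top event "fatal fall".

def minimal_cut_sets(cut_sets):
    """Drop any cut set that strictly contains another (absorption law)."""
    sets = [frozenset(cs) for cs in cut_sets]
    minimal = [cs for cs in sets
               if not any(other < cs for other in sets if other != cs)]
    # De-duplicate and return in a stable order.
    return sorted(set(minimal), key=sorted)

# Hypothetical cause codes for illustration only.
observed = [
    {"no_guardrail", "no_harness"},
    {"no_guardrail", "no_harness", "fragile_roof"},  # absorbed by the set above
    {"fragile_roof"},
]
print(minimal_cut_sets(observed))
```

Any combination that strictly contains another adds no new failure pathway, so only {fragile_roof} and {no_guardrail, no_harness} survive as minimal cut sets here.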
Goal-Function Tree Modeling for Systems Engineering and Fault Management
NASA Technical Reports Server (NTRS)
Johnson, Stephen B.; Breckenridge, Jonathan T.
2013-01-01
This paper describes a new representation that enables rigorous definition and decomposition of both nominal and off-nominal system goals and functions: the Goal-Function Tree (GFT). GFTs extend the concept and process of functional decomposition, utilizing state variables as a key mechanism to ensure physical and logical consistency and completeness of the decomposition of goals (requirements) and functions, and enabling full and complete traceability to the design. The GFT also provides the means to define and represent off-nominal goals and functions that are activated when the system's nominal goals are not met. The physical accuracy of the GFT, and its ability to represent both nominal and off-nominal goals, enable the GFT to be used for various analyses of the system, including assessments of the completeness and traceability of system goals and functions, the coverage of fault management failure detections, and the definition of system failure scenarios.
Embedding global and collective in a torus network with message class map based tree path selection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Dong; Coteus, Paul W.; Eisley, Noel A.
Embodiments of the invention provide a method, system and computer program product for embedding a global barrier and global interrupt network in a parallel computer system organized as a torus network. The computer system includes a multitude of nodes. In one embodiment, the method comprises taking inputs from a set of receivers of the nodes, dividing the inputs from the receivers into a plurality of classes, combining the inputs of each of the classes to obtain a result, and sending said result to a set of senders of the nodes. Embodiments of the invention provide a method, system and computer program product for embedding a collective network in a parallel computer system organized as a torus network. In one embodiment, the method comprises adding to a torus network a central collective logic to route messages among at least a group of nodes in a tree structure.
A logic programming approach to medical errors in imaging.
Rodrigues, Susana; Brandão, Paulo; Nelas, Luís; Neves, José; Alves, Victor
2011-09-01
In 2000, the Institute of Medicine reported disturbing figures on the scope and impact of medical error in the process of health delivery. A solution to this problem may lie in the adoption of adverse event reporting and learning systems that can help to identify hazards and risks. It is crucial to apply models that identify the root causes of adverse events and enhance the sharing of knowledge and experience. Efforts to improve patient safety have progressed frustratingly slowly. Some of this lack of progress may be attributed to the absence of systems that take into account the characteristics of information about the real world. In our daily lives, we make most of our decisions based on incomplete, uncertain, and even forbidden or contradictory information. One's knowledge is based less on exact facts and more on hypotheses, perceptions, or indications. From the data collected in our adverse event treatment and learning system for medical imaging, and through the use of Extended Logic Programming for knowledge representation and reasoning, together with new problem-solving methodologies, namely those based on agent and multi-agent systems, we intend to generate reports that identify the most relevant causes of error and define improvement strategies, drawing conclusions about the impact, place of occurrence, and form or type of event recorded in healthcare institutions. The Eindhoven Classification Model was extended and adapted to the medical imaging field and used to classify the root causes of adverse events. Extended Logic Programming was used for knowledge representation with defective information, allowing for the modelling of the universe of discourse in terms of data and knowledge defaults. A systematization of the evolution of the body of knowledge about Quality of Information embedded in the Root Cause Analysis was accomplished.
An adverse event reporting and learning system was developed based on the presented approach to medical errors in imaging. This system was deployed in two Portuguese healthcare institutions, with an appealing outcome. The system enabled to verify that the majority of occurrences were concentrated in a few events that could be avoided. The developed system allowed automatic knowledge extraction, enabling report generation with strategies for the improvement of quality-of-care. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Fault-Tolerant Sequencer Using FPGA-Based Logic Designs for Space Applications
2013-12-01
A single-event effect can produce a transient pulse, called a single-event transient (SET), or even cause permanent damage to the device in the form of a burnout or gate rupture.
Program For Parallel Discrete-Event Simulation
NASA Technical Reports Server (NTRS)
Beckman, Brian C.; Blume, Leo R.; Geiselman, John S.; Presley, Matthew T.; Wedel, John J., Jr.; Bellenot, Steven F.; Diloreto, Michael; Hontalas, Philip J.; Reiher, Peter L.; Weiland, Frederick P.
1991-01-01
User does not have to add any special logic to aid in synchronization. Time Warp Operating System (TWOS) computer program is special-purpose operating system designed to support parallel discrete-event simulation. Complete implementation of Time Warp mechanism. Supports only simulations and other computations designed for virtual time. Time Warp Simulator (TWSIM) subdirectory contains sequential simulation engine interface-compatible with TWOS. TWOS and TWSIM written in, and support simulations in, C programming language.
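The event-driven core that a sequential engine like TWSIM shares with any discrete-event simulator can be illustrated with a generic sketch. This is not the TWOS/TWSIM API (and it is in Python rather than C); it only shows the standard pattern of processing events in virtual-time order from a priority queue:

```python
import heapq

class Simulator:
    """Minimal sequential discrete-event engine (generic sketch)."""

    def __init__(self):
        self._queue = []   # heap of (virtual_time, seq, handler)
        self._seq = 0      # tie-breaker so equal timestamps stay FIFO
        self.now = 0.0     # current virtual time

    def schedule(self, delay, handler):
        heapq.heappush(self._queue, (self.now + delay, self._seq, handler))
        self._seq += 1

    def run(self):
        # Pop events in timestamp order; each handler may schedule more.
        while self._queue:
            self.now, _, handler = heapq.heappop(self._queue)
            handler(self)

log = []

def ping(sim):
    log.append(("ping", sim.now))
    if sim.now < 3:
        sim.schedule(1.0, pong)

def pong(sim):
    log.append(("pong", sim.now))
    sim.schedule(1.0, ping)

sim = Simulator()
sim.schedule(0.0, ping)
sim.run()
print(log)
```

A Time Warp engine differs by running such event loops optimistically in parallel and rolling back when a message arrives in a process's virtual past; the sequential engine above never needs rollback.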
2014-06-28
constructed from inexpensive semiconductor lasers could lead to the development of novel neuro-inspired optical computing devices (threshold detectors, logic gates, signal recognition, etc.). Other topics of research included the analysis of extreme events. Extreme events are nowadays a highly active field of research; rogue waves, earthquakes of high magnitude and financial crises are all rare and
Studies Of Single-Event-Upset Models
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.
1988-01-01
Report presents latest in series of investigations of "soft" bit errors known as single-event upsets (SEU). In this investigation, the SEU response of a low-power, Schottky-diode-clamped, transistor/transistor-logic (TTL) static random-access memory (RAM) was observed during irradiation by Br and O ions in the ranges of 100 to 240 and 20 to 100 MeV, respectively. Experimental data complete the verification of the computer model used to simulate SEU in this circuit.
Using inventory data to determine the impact of drought on tree mortality
Greg C. Liknes; Christopher W. Woodall; Charles H. Perry
2012-01-01
Drought has been the subject of numerous recent studies that hint at an acceleration of tree mortality due to climate change. In particular, a recent global survey of tree mortality events implicates drought as the cause of quaking aspen mortality in Minnesota, USA in 2007. In this study, data from the Forest Inventory and Analysis program of the USDA Forest Service...
2013-05-01
The specifics of the correlation will be explored, followed by discussion of the new paradigms that result: the ordered event list (OEL) and the decision tree.
Lindsay M. Grayson; Robert A. Progar; Sharon M. Hood
2017-01-01
Fire is a driving force in the North American landscape and predicting post-fire tree mortality is vital to land management. Post-fire tree mortality can have substantial economic and social impacts, and natural resource managers need reliable predictive methods to anticipate potential mortality following fire events. Current fire mortality models are limited to a few...
Grossi, Enzo
2006-05-03
In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of nonlinear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial intelligence tools can provide a potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic are reviewed and discussed. The use of predictive algorithms to assess individual absolute risk of future cardiovascular events is currently hampered by methodological and mathematical flaws. Newer approaches linked to artificial intelligence, such as fuzzy logic and artificial neural networks, seem to better address both the challenge of the increasing complexity resulting from correlations among predisposing factors, data on the occurrence of cardiovascular events, and the prediction of future events at the individual level.
Yielding to desire: the durability of affective preferences.
Rapp, David N; Jacovina, Matthew E; Slaten, Daniel G; Krause, Elise
2014-09-01
People's expectations about the future are guided not just by the contingencies of situations but also by what they hope or wish will happen next. These preferences can inform predictions that run counter to what should or must occur based on the logic of unfolding events. Effects of this type have been regularly identified in studies of judgment and decision making, with individuals' choices often reflecting emotional rather than rational influences. Encouraging individuals to rely less on their emotional considerations has proven a challenge as affective responses are generated quickly and are seemingly informative for decisions. In 6 experiments we examined whether individuals could be encouraged to rely less on their affective preferences when making judgments about future events. Participants read stories in which contexts informed the likelihood of events in ways that might run counter to their preferential investments in particular outcomes. While being less than relevant given the logic of events, participants' affective considerations remained influential despite time allotted for predictive reflection. In contrast, instructional warnings helped attenuate the influence of affective considerations, even under conditions previously shown to encourage preferential biases. The findings are discussed with respect to factors that mediate preference effects, and highlight challenges for overcoming people's reliance on affective contributors to everyday judgments and comprehension.
NASA Astrophysics Data System (ADS)
Swann, Abigail L. S.; Laguë, Marysa M.; Garcia, Elizabeth S.; Field, Jason P.; Breshears, David D.; Moore, David J. P.; Saleska, Scott R.; Stark, Scott C.; Villegas, Juan Camilo; Law, Darin J.; Minor, David M.
2018-05-01
Regional-scale tree die-off events driven by drought and warming and associated pests and pathogens have occurred recently on all forested continents and are projected to increase in frequency and extent with future warming. Within areas where tree mortality has occurred, ecological, hydrological and meteorological consequences are increasingly being documented. However, the potential for tree die-off to impact vegetation processes and related carbon dynamics in areas remote to where die-off occurs has rarely been systematically evaluated, particularly for multiple distinct regions within a given continent. Such remote impacts can occur when climate effects of local vegetation change are propagated by atmospheric circulation—the phenomena of ‘ecoclimate teleconnections’. We simulated tree die-off events in the 13 most densely forested US regions (selected from the 20 US National Ecological Observatory Network [NEON] domains) and found that tree die-off even for smaller regions has potential to affect climate and hence Gross Primary Productivity (GPP) in disparate regions (NEON domains), either positively or negatively. Some regions exhibited strong teleconnections to several others, and some regions were relatively sensitive to tree loss regardless of what other region the tree loss occurred in. For the US as a whole, loss of trees in the Pacific Southwest—an area undergoing rapid tree die-off—had the largest negative impact on remote US GPP whereas loss of trees in the Mid-Atlantic had the largest positive impact. This research lays a foundation for hypotheses that identify how the effects of tree die-off (or other types of tree loss such as deforestation) can ricochet across regions by revealing hot-spots of forcing and response. Such modes of connectivity have direct applicability for improving models of climate change impacts and for developing more informed and coordinated carbon accounting across regions.
Logic Model Checking of Time-Periodic Real-Time Systems
NASA Technical Reports Server (NTRS)
Florian, Mihai; Gamble, Ed; Holzmann, Gerard
2012-01-01
In this paper we report on the work we performed to extend the logic model checker SPIN with built-in support for the verification of periodic, real-time embedded software systems, as commonly used in aircraft, automobiles, and spacecraft. We first extended the SPIN verification algorithms to model priority based scheduling policies. Next, we added a library to support the modeling of periodic tasks. This library was used in a recent application of the SPIN model checker to verify the engine control software of an automobile, to study the feasibility of software triggers for unintended acceleration events.
Mathematical logic as a mean of solving the problems of power supply for buildings and constructions
NASA Astrophysics Data System (ADS)
Pryadko, Igor; Nozdrina, Ekaterina; Boltaevsky, Andrey
2017-10-01
The article analyzes the application of mathematical logic to engineering design in machinery and construction. The aim of the work is to study the logical elaborations of the Russian electrical engineer V. I. Shestakov, considered in connection with the problem he solved: the analysis and synthesis of relay contact circuits of the degenerate (A) class. The article proposes to use Shestakov's elaborations for the optimization of modern high-tech buildings and structures. The second part of the article recounts the events associated with the development of mathematical logic as applied to the analysis and synthesis of electric circuits, relay and bridging. Arguments concerning the priority of authorship among Shestakov, K. Shannon (one of the founders of computer science), and the Japanese engineer A. Nakashima are discussed. The disagreement between Shestakov and representatives of M. A. Gavrilov's school is also touched on.
CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS
NASA Technical Reports Server (NTRS)
Iverson, D. L.
1994-01-01
Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure-space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between nodes, including loops in the graph. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single-point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node.
The digraph cut set code uses the same techniques as the fault tree cut set code, except it includes all upstream digraph nodes in the cut sets for a given node and checks for cycles in the digraph during the solution process. CUTSETS solves for specified nodes and will not automatically solve for all upstream digraph nodes. The cut sets will be output as a text file. CUTSETS includes a utility program that will convert the popular COD format digraph model description files into text input files suitable for use with the CUTSETS programs. FEAT (MSC-21873) and FIRM (MSC-21860), available from COSMIC, are examples of programs that produce COD format digraph model description files that may be converted for use with the CUTSETS programs. CUTSETS is written in C to be machine independent. It has been successfully implemented on a Sun running SunOS, a DECstation running ULTRIX, a Macintosh running System 7, and a DEC VAX running VMS. The RAM requirement varies with the size of the models. CUTSETS is available in UNIX tar format on a .25 inch streaming magnetic tape cartridge (standard distribution) or on a 3.5 inch diskette. It is also available on a 3.5 inch Macintosh format diskette or on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format. Sample input and sample output are provided on the distribution medium. An electronic copy of the documentation in Macintosh Microsoft Word format is included on the distribution medium. Sun and SunOS are trademarks of Sun Microsystems, Inc. DEC, DECstation, ULTRIX, VAX, and VMS are trademarks of Digital Equipment Corporation. UNIX is a registered trademark of AT&T Bell Laboratories. Macintosh is a registered trademark of Apple Computer, Inc.
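The recursive top-down parse described in the CUTSETS entry, combining child cut sets at each gate, can be sketched as follows. The dictionary encoding and node names here are hypothetical illustrations, not CUTSETS's actual text input format:

```python
# Sketch: recursive top-down minimal cut set expansion for an AND/OR fault
# tree given as {name: ("AND"|"OR", [children])}; leaves are basic events.

def cut_sets(node, tree, max_size=None):
    """Return the minimal cut sets for `node`, optionally capped in size."""
    if node not in tree:                        # basic event (leaf)
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(c, tree, max_size) for c in children]
    if gate == "OR":                            # OR: union of children's cut sets
        combined = [cs for sets in child_sets for cs in sets]
    else:                                       # AND: cross-product union
        combined = [frozenset()]
        for sets in child_sets:
            combined = [a | b for a in combined for b in sets]
    if max_size is not None:
        combined = [cs for cs in combined if len(cs) <= max_size]
    # Absorption: keep only the minimal sets.
    return sorted({cs for cs in combined
                   if not any(other < cs for other in combined)}, key=sorted)

tree = {
    "TOP": ("OR", ["G1", "C"]),   # top event fails if gate G1 or event C fails
    "G1":  ("AND", ["A", "B"]),   # G1 fails only if both A and B fail
}
print(cut_sets("TOP", tree))
```

For this toy tree the minimal cut sets are {A, B} and {C}: either the single failure C or the pair A and B triggers the top event, which is how single-point failures (size-1 cut sets) are identified.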
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Raedt, Hans; Katsnelson, Mikhail I.; Donker, Hylke C.
It is shown that the Pauli equation and the concept of spin naturally emerge from logical inference applied to experiments on a charged particle under the conditions that (i) space is homogeneous, (ii) the observed events are logically independent, and (iii) the observed frequency distributions are robust with respect to small changes in the conditions under which the experiment is carried out. The derivation does not take recourse to concepts of quantum theory and is based on the same principles which have already been shown to lead to e.g. the Schrödinger equation and the probability distributions of pairs of particles in the singlet or triplet state. Application to Stern–Gerlach experiments with chargeless, magnetic particles provides additional support for the thesis that quantum theory follows from logical inference applied to a well-defined class of experiments. Highlights: • The Pauli equation is obtained through logical inference applied to robust experiments on a charged particle. • The concept of spin appears as an inference resulting from the treatment of two-valued data. • The same reasoning yields the quantum theoretical description of neutral magnetic particles. • Logical inference provides a framework to establish a bridge between objective knowledge gathered through experiments and their description in terms of concepts.
Review: Evaluation of Foot-and-Mouth Disease Control Using Fault Tree Analysis.
Isoda, N; Kadohira, M; Sekiguchi, S; Schuppers, M; Stärk, K D C
2015-06-01
An outbreak of foot-and-mouth disease (FMD) causes huge economic losses and animal welfare problems. Although much can be learnt from past FMD outbreaks, several countries are not satisfied with their degree of contingency planning and are aiming at more assurance that their control measures will be effective. The purpose of the present article was to develop a generic fault tree framework for the control of an FMD outbreak as a basis for systematic improvement and refinement of control activities and general preparedness. Fault trees are typically used in engineering to document pathways that can lead to an undesired event, in this case ineffective FMD control. The fault tree method allows risk managers to identify immature parts of the control system and to analyse the events or steps that would most probably delay rapid and effective disease control during a real outbreak. The fault tree developed here is generic and can be tailored to fit the specific needs of countries. For instance, the specific fault tree for the 2001 FMD outbreak in the UK was refined based on control weaknesses discussed in peer-reviewed articles. Furthermore, the specific fault tree based on the 2001 outbreak was applied to the subsequent FMD outbreak in 2007 to assess the refinement of control measures following the earlier, major outbreak. The FMD fault tree can assist risk managers in developing more refined and adequate control activities against FMD outbreaks and in finding optimum strategies for rapid control. Further application of the current tree can serve as one of the basic measures for FMD control worldwide. © 2013 Blackwell Verlag GmbH.
NASA Astrophysics Data System (ADS)
Rebenack, C.; Anderson, W. T.; Cherubini, P.
2012-12-01
The South Florida coastal ecosystem is among the world's subtropical coastlines which are threatened by the potential effects of climate change. A well-developed localized paleohistory is essential in the understanding of the role climate variability/change has on both hydrological dynamics and disturbance event frequency and intensity; this understanding can then aid in the development of better predictive models. High resolution paleoclimate proxies, such as those developed from tree-ring archives, may be useful tools for extrapolating actual climate trends over time from the overlapping long-term and short-term climate cycles, such as the Atlantic Multidecadal Oscillation (AMO) and the El Niño-Southern Oscillation (ENSO). In South Florida, both the AMO and ENSO strongly influence seasonal precipitation, and a more complete grasp of how these cycles have affected the region in the past could be applied to future freshwater management practices. Dendrochronology records for the terrestrial subtropics, including South Florida, are sparse because seasonality for this region is precipitation-driven; this is in contrast to the drastic temperature changes experienced in the temperate latitudes. Subtropical seasonality may lead to the complete lack of visible rings or to the formation of ring structures that may or may not represent annual growth. Fortunately, it has recently been demonstrated that Pinus elliottii trees in South Florida produce distinct annual growth rings; however ring width was not found to significantly correlate with either the AMO or ENSO. Dendrochronology studies may be taken a step beyond the physical tree-ring proxies by using the carbon isotope ratios to infer information about physiological controls and environmental factors that affect the distribution of isotopes within the plant. 
It has been well established that the stable isotope composition of cellulose can be related to precipitation, drought, large-scale ocean/atmospheric oscillations, and disturbance events, such as tropical cyclone impacts. Because slash pine growth is dependent on water availability, a chronology developed using carbon isotopes may provide greater insight into plant stress over time and ultimately may lead to better correlations with climate oscillations. The work presented here is the result of a carbon-isotope study of four slash pine trees located across a freshwater gradient on Big Pine Key, Florida. A site chronology has been developed by cross-dating the δ13C records for each of the trees. The tree located on the distal edge of the freshwater gradient shows an overall enriched isotopic signature over time compared to the trees growing over a deeper part of the local freshwater lens, indicating that these trees are sensitive to water stress. In addition, the carbon isotope data show seasonal stomatal activity in the trees and indicate the timing of two disturbance events.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches used to model rare events, such as fault tree and event tree analysis, suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
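One way to picture how a hierarchical treatment handles source-to-source variability is an empirical-Bayes simplification: pool the sources into a shared Beta prior, then update each source separately so that sparse sources shrink toward the pooled rate. This is only an illustrative sketch with made-up counts, not the model proposed in the study:

```python
# Sketch: empirical-Bayes pooling of rare-event rates across data sources
# (a simplification of the hierarchical idea; counts are hypothetical).

def beta_posterior(events, trials, alpha0, beta0):
    """Conjugate Beta-Binomial update: returns posterior Beta parameters."""
    return alpha0 + events, beta0 + trials - events

# Hypothetical (events, trials) counts from three data sources.
sources = [(1, 1000), (0, 500), (3, 2000)]

# Stage 1: a crude shared prior estimated from the pooled data.
pooled_events = sum(e for e, _ in sources)
pooled_trials = sum(n for _, n in sources)
alpha0 = 1 + pooled_events               # weakly informative, data-driven
beta0 = 1 + pooled_trials - pooled_events

# Stage 2: per-source posteriors shrink toward the pooled rate, so the
# source with zero observed events still gets a plausible nonzero estimate.
for i, (e, n) in enumerate(sources):
    a, b = beta_posterior(e, n, alpha0, beta0)
    print(f"source {i}: posterior mean {a / (a + b):.5f}")
```

A full hierarchical model would instead place a hyperprior on the Beta parameters and infer them jointly (typically by MCMC), which also propagates the uncertainty in the shared prior itself.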
Yu, Yun; Degnan, James H.; Nakhleh, Luay
2012-01-01
Gene tree topologies have proven a powerful data source for various tasks, including species tree inference and species delimitation. Consequently, methods for computing probabilities of gene trees within species trees have been developed and widely used in probabilistic inference frameworks. All these methods assume an underlying multispecies coalescent model. However, when reticulate evolutionary events such as hybridization occur, these methods are inadequate, as they do not account for such events. Methods that account for both hybridization and deep coalescence in computing the probability of a gene tree topology currently exist for very limited cases. However, no such methods exist for general cases, owing primarily to the fact that it is currently unknown how to compute the probability of a gene tree topology within the branches of a phylogenetic network. Here we present a novel method for computing the probability of gene tree topologies on phylogenetic networks and demonstrate its application to the inference of hybridization in the presence of incomplete lineage sorting. We reanalyze a Saccharomyces species data set for which multiple analyses had converged on a species tree candidate. Using our method, though, we show that an evolutionary hypothesis involving hybridization in this group has better support than one of strict divergence. A similar reanalysis on a group of three Drosophila species shows that the data is consistent with hybridization. Further, using extensive simulation studies, we demonstrate the power of gene tree topologies at obtaining accurate estimates of branch lengths and hybridization probabilities of a given phylogenetic network. Finally, we discuss identifiability issues with detecting hybridization, particularly in cases that involve extinction or incomplete sampling of taxa. PMID:22536161
NASA Astrophysics Data System (ADS)
van Meerveld, Ilja; Spencer, Sheena
2017-04-01
Most studies on stemflow have focused on the amount of stemflow in different forests or for different rainfall events. So far, few studies have looked at how stemflow intensity varies during rainfall events and how peak stemflow intensities compare to peak rainfall intensities. High stemflow intensities at the base of the tree, where roots and other preferential flow pathways are prevalent, may lead to faster and deeper infiltration of stemflow than rainfall and thus affect soil moisture dynamics and potentially also subsurface stormflow generation. We measured stemflow intensities for three Western hemlock, two Western red cedar, two Douglas-fir and one Birch tree in a mature coniferous forest in coastal British Columbia to determine how stemflow intensities were related to rainfall intensity. We sprayed a blue dye tracer on two Western hemlock trees (29 and 52 cm diameter at breast height (DBH)) to determine how stemflow water flows through the soil and to what depth it infiltrates. We also applied the blue dye tracer to an area between the trees to compare infiltration of stemflow with rainfall. Stemflow increased linearly with event total precipitation for all trees. The larger trees almost exclusively had funneling ratios (i.e. the volume of stemflow per unit basal area divided by the rainfall) smaller than one, regardless of species and event size. The funneling ratios for the small trees were generally larger for larger events (up to a funneling ratio of 20) but there was considerable scatter in this relation. Trees with a DBH <35 cm, which represent 24% of the total basal area of the study site, contributed 72% of the estimated total stemflow amount. Stemflow intensities (volume of stemflow per unit basal area per hour) often increased in a stepwise manner. When there were two precipitation bursts, stemflow intensity was usually highest during the second precipitation burst. 
However, when there were several hours of very low rainfall intensity between consecutive precipitation bursts, stemflow intensity was lower during the first burst after the break in rainfall. Peak stemflow intensities were higher than the maximum precipitation intensity. The blue dye that was applied to the tree stems was found more frequently at depth than near the soil surface. Stemflow flowed primarily through the 10 cm organic-rich upper layer of the soil around the tree before flowing between or along live and dead roots, inside dead roots, around rocks and boulders deeper into the soil. Lateral flow was observed above a dense clay layer but where roots were able to penetrate the clay layer, the infiltrating water flowed deeper into the soil and (almost) reached the soil-bedrock interface. Stemflow appeared to infiltrate deeper (122 cm) than rainfall (85 cm) but this difference was in part due to variations in maximum soil depth. These results suggest that even though stemflow is only a minor component of the water balance, the double funneling of stemflow may significantly affect soil moisture, recharge and runoff generation.
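The funneling ratio defined in the abstract (stemflow volume per unit basal area, expressed as a depth, divided by event rainfall depth) is simple arithmetic and can be sketched directly. The tree dimensions and volumes below are illustrative, not measurements from the study.

```python
import math

def funneling_ratio(stemflow_l, dbh_cm, rainfall_mm):
    """Funneling ratio: stemflow volume per unit basal area (as a depth
    equivalent in mm) divided by event rainfall depth in mm."""
    basal_area_m2 = math.pi * (dbh_cm / 200.0) ** 2   # DBH in cm -> radius in m
    stemflow_depth_mm = (stemflow_l / 1000.0) / basal_area_m2 * 1000.0
    return stemflow_depth_mm / rainfall_mm

# A small stem concentrating water: 5 L stemflow, 15 cm DBH, 20 mm event
ratio = funneling_ratio(5.0, 15.0, 20.0)  # well above 1: strong funneling
```

A ratio above one means the stem delivers more water per unit basal area than rain alone would, which is why small trees with high ratios dominated the site's stemflow despite their modest basal area.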
Phenology of Pacific Northwest tree species
Connie Harrington; Kevin Ford; Brad St. Clair
2016-01-01
Phenology is the study of the timing of recurring biological events. For foresters, the most commonly observed phenological events are budburst, flowering, and leaf fall, but other harder to observe events, such as diameter-growth initiation, are also important. Most events that occur in the spring are influenced by past exposure to cool (chilling) temperatures and...
Extreme Drought Events Revealed in Amazon Tree Ring Records
NASA Astrophysics Data System (ADS)
Jenkins, H. S.; Baker, P. A.; Guilderson, T. P.
2010-12-01
The Amazon basin is a center of deep atmospheric convection and thus acts as a major engine for global hydrologic circulation. Yet despite its significance, a full understanding of Amazon rainfall variability remains elusive due to a poor historical record of climate. Temperate tree rings have been used extensively to reconstruct climate over the last thousand years; however, less attention has been given to the application of dendrochronology in tropical regions, in large part due to a lower frequency of tree species known to produce annual rings. Here we present a tree ring record of drought extremes from the Madre de Dios region of southeastern Peru over the last 190 years. We confirm that tree ring growth in species Cedrela odorata is annual and show it to be well correlated with wet season precipitation. This correlation is used to identify extreme dry (and wet) events that have occurred in the past. We focus on drought events identified in the record as drought frequency is expected to increase over the Amazon in a warming climate. The Cedrela chronology records historic Amazon droughts of the 20th century previously identified in the literature and extends the record of drought for this region to the year 1816. Our analysis shows that there has been an increase in the frequency of extreme drought (mean recurrence interval = 5-6 years) since the turn of the 20th century and both Atlantic and Pacific sea surface temperature (SST) forcing mechanisms are implicated.
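The mean recurrence interval quoted above is simply the average gap between successive dated drought years. A minimal sketch, using invented drought years rather than the study's actual chronology:

```python
def mean_recurrence_interval(event_years):
    """Average gap in years between successive dated events (sorted input)."""
    gaps = [b - a for a, b in zip(event_years, event_years[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical drought years, chosen only to illustrate the calculation
droughts = [1901, 1907, 1912, 1916, 1923, 1926]
interval = mean_recurrence_interval(droughts)  # 5.0 years
```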
Modeling time-to-event (survival) data using classification tree analysis.
Linden, Ariel; Yarnold, Paul R
2017-12-01
Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
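Model performance above is assessed with integrated Brier scores. As a rough sketch of that metric: the Brier score at one time horizon is the mean squared error between each subject's predicted survival probability and the observed 0/1 survival status, and the integrated version averages it over follow-up time. A proper survival Brier score reweights for censoring (inverse-probability-of-censoring weights); the simplified version below ignores censoring and is for illustration only.

```python
def brier_score(pred_survival, still_alive):
    """Mean squared error between predicted survival probabilities and
    observed 0/1 outcomes at one horizon (censoring ignored; a full
    survival Brier score would apply IPCW weights)."""
    n = len(pred_survival)
    return sum((p - o) ** 2 for p, o in zip(pred_survival, still_alive)) / n

def integrated_brier(scores_by_time, times):
    """Trapezoidal average of per-horizon Brier scores over follow-up."""
    area = sum((scores_by_time[i] + scores_by_time[i + 1]) / 2.0
               * (times[i + 1] - times[i]) for i in range(len(times) - 1))
    return area / (times[-1] - times[0])
```

Lower values are better; a model that predicts each outcome perfectly scores zero at every horizon.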
NASA Astrophysics Data System (ADS)
Reed, A. S.; Stephen, F. M.; Billings, S. A.
2011-12-01
A major oak decline event in recent decades in Northwest Arkansas permits insight into disturbance impacts on forests, which is important for understanding global carbon, nutrient and climate cycles given projections of increasing disturbance event frequency in the future. The decline event, associated with an increase in population of a native, wood-boring insect, followed a cycle of droughts and resulted in a mosaic of apparently healthy red oaks (Quercus rubra) neighboring severely declining trees of the same species. Tree-ring evidence suggests decreased growth rates following increases in the insect's population decades prior to visible external decline symptoms (i.e. decreased crown coverage, mortality), but only in trees destined to die during the insect outbreak. Reasons why some trees experienced mortality and some remained healthy are unclear. Through analysis of stable isotopes of carbon (δ13C) and oxygen (δ18O) in wood and leaf δ13C and nitrogen among co-occurring trees, we can infer differential responses of red oaks to disturbance and associated resilience to mortality. Tree-ring a-cellulose δ13C varied from -27.3to -23.0%, and δ18O values varied from 27.5 to 31.8%. Neither δ13C nor δ18O exhibited signficant differences between healthy and declining trees. However, declining trees exhibited a significant, positive relationship between δ13C and δ18O (p <0.05, r2=0.15) prior to peak insect infestation. In contrast, apparently healthy individuals did not exhibit a significant relationship between these parameters, but exhibited significant, positive relationships between current year leaf δ13C and nitrogen content (p<0.05, r2=0.77). These results suggest that healthy and declining trees had different strategies for coping with insect infestation. 
Correlation between tree-ring δ13C and δ18O in dying trees suggests that trees destined to die during the infestation regulated their δ13C values primarily via stomatal conductance, a mechanism that influences both δ13C and δ18O. In contrast, δ13C values in apparently healthy trees did not vary with δ18O, indicating that stomatal conductance was not an important regulator of δ13C. The linkage between δ13C and nitrogen availability in these trees suggests that carbon sink strength, typically associated with plant nutrient status, may have played a more important role than carbon source strength (i.e. stomatal conductance) in governing tree-ring δ13C. These results suggest that 1) responses to disturbance of co-occurring trees of the same species can diverge in ways discernable decades later via stable isotopic analysis, and 2) the primary driver of wood δ13C values, whether carbon source (stomatal conductance) or sink (fixation capacity) strength, is linked to its fate.
Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram
2015-08-01
In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. The first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
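The cumulant features named above can be computed from a signal window using the standard moment-to-cumulant formulas. The sketch below shows only this feature-extraction step; the windowing, axis handling, and SVM stages of the paper's pipeline are omitted.

```python
def raw_moments(x, upto=5):
    """Raw moments m1..m_upto of a sample."""
    n = len(x)
    return [sum(v ** k for v in x) / n for k in range(1, upto + 1)]

def cumulants(x):
    """Second- to fifth-order cumulants via the standard conversion from
    raw moments; such values serve as classifier features."""
    m1, m2, m3, m4, m5 = raw_moments(x)
    k2 = m2 - m1 ** 2
    k3 = m3 - 3 * m2 * m1 + 2 * m1 ** 3
    k4 = m4 - 4 * m3 * m1 - 3 * m2 ** 2 + 12 * m2 * m1 ** 2 - 6 * m1 ** 4
    k5 = (m5 - 5 * m4 * m1 - 10 * m3 * m2 + 20 * m3 * m1 ** 2
          + 30 * m2 ** 2 * m1 - 60 * m2 * m1 ** 3 + 24 * m1 ** 5)
    return k2, k3, k4, k5
```

The second-order cumulant is the (population) variance, and odd-order cumulants vanish for symmetric signals, which is what lets higher orders discriminate the asymmetric acceleration spikes of falls.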
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Qian, Yu
2016-02-15
Haze weather has become a serious environmental pollution problem in many Chinese cities. One of the most critical factors in the formation of haze weather is the exhaust of coal combustion, so it is important to establish the causation mechanism linking urban haze and coal combustion exhausts. With this motivation, the fault tree analysis (FTA) approach was employed, for the first time, to study the causation mechanism of urban haze in Beijing by considering the risk events related to the exhausts of coal combustion. Using this approach, the fault tree of the urban haze causation system connected with coal combustion exhausts was first established; the risk events were then discussed and identified; next, the minimal cut sets were determined using Boolean algebra; finally, the structure, probability, and critical importance degree analysis of the risk events were completed for qualitative and quantitative assessment. The results showed that FTA is an effective and simple tool for the causation mechanism analysis and risk management of urban haze in China. Copyright © 2015 Elsevier B.V. All rights reserved.
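The Boolean reduction to minimal cut sets mentioned above (and used by classic fault-tree codes) can be sketched as a top-down MOCUS-style expansion followed by removal of subsumed (superset) cut sets. The gate structure in the example is hypothetical, not the paper's haze fault tree.

```python
def minimal_cut_sets(top, gates):
    """Expand a fault tree (gates: name -> ("AND"|"OR", children); any name
    not in `gates` is a basic event) into its minimal cut sets."""
    rows = [frozenset([top])]
    changed = True
    while changed:
        changed = False
        new_rows = []
        for row in rows:
            gate = next((g for g in row if g in gates), None)
            if gate is None:
                new_rows.append(row)       # row contains only basic events
                continue
            changed = True
            op, kids = gates[gate]
            rest = row - {gate}
            if op == "AND":
                new_rows.append(rest | frozenset(kids))  # AND joins events
            else:
                for k in kids:                           # OR splits rows
                    new_rows.append(rest | {k})
        rows = new_rows
    # Boolean absorption: drop any cut set that contains another cut set
    minimal = [r for r in rows if not any(o < r for o in rows)]
    return sorted(set(minimal), key=sorted)
```

For TOP = OR(G1, A) with G1 = AND(B, C), the minimal cut sets are {A} and {B, C}; if A also appears under G1, the superset {A, B} is absorbed by {A}, which is the subsumption step the fault-tree literature describes.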
Fire-mediated dieback and compositional cascade in an Amazonian forest.
Barlow, Jos; Peres, Carlos A
2008-05-27
The only fully coupled land-atmosphere global climate model predicts a widespread dieback of Amazonian forest cover through reduced precipitation. Although these predictions are controversial, the structural and compositional resilience of Amazonian forests may also have been overestimated, as current vegetation models fail to consider the potential role of fire in the degradation of forest ecosystems. We examine forest structure and composition in the Arapiuns River basin in the central Brazilian Amazon, evaluating post-fire forest recovery and the consequences of recurrent fires for the patterns of dominance of tree species. We surveyed tree plots in unburned and once-burned forests examined 1, 3 and 9 years after an unprecedented fire event, in twice-burned forests examined 3 and 9 years after fire and in thrice-burned forests examined 5 years after the most recent fire event. The number of trees recorded in unburned primary forest control plots was stable over time. However, in both once- and twice-burned forest plots, there was a marked recruitment into the 10-20 cm diameter at breast height tree size classes between 3 and 9 years post-fire. Considering tree assemblage composition 9 years after the first fire contact, we observed (i) a clear pattern of community turnover among small trees and the most abundant shrubs and saplings, and (ii) that species that were common in any of the four burn treatments (unburned, once-, twice- and thrice-burned) were often rare or entirely absent in other burn treatments. We conclude that episodic wildfires can lead to drastic changes in forest structure and composition, with cascading shifts in forest composition following each additional fire event. Finally, we use these results to evaluate the validity of the savannization paradigm.
Garrity, Steven R.; Allen, Craig D.; Brumby, Steven P.; Gangodagamage, Chandana; McDowell, Nate G.; Cai, D. Michael
2013-01-01
Widespread tree mortality events have recently been observed in several biomes. To effectively quantify the severity and extent of these events, tools that allow for rapid assessment at the landscape scale are required. Past studies using high spatial resolution satellite imagery have primarily focused on detecting green, red, and gray tree canopies during and shortly after tree damage or mortality has occurred. However, detecting trees in various stages of death is not always possible due to limited availability of archived satellite imagery. Here we assess the capability of high spatial resolution satellite imagery for tree mortality detection in a southwestern U.S. mixed species woodland using archived satellite images acquired prior to mortality and well after dead trees had dropped their leaves. We developed a multistep classification approach that uses: supervised masking of non-tree image elements; bi-temporal (pre- and post-mortality) differencing of normalized difference vegetation index (NDVI) and red:green ratio (RGI); and unsupervised multivariate clustering of pixels into live and dead tree classes using a Gaussian mixture model. Classification accuracies were improved in a final step by tuning the rules of pixel classification using the posterior probabilities of class membership obtained from the Gaussian mixture model. Classifications were produced for two images acquired post-mortality with overall accuracies of 97.9% and 98.5%, respectively. Classified images were combined with land cover data to characterize the spatiotemporal characteristics of tree mortality across areas with differences in tree species composition. We found that 38% of tree crown area was lost during the drought period between 2002 and 2006. The majority of tree mortality during this period was concentrated in piñon-juniper (Pinus edulis-Juniperus monosperma) woodlands. 
An additional 20% of the tree canopy died or was removed between 2006 and 2011, primarily in areas experiencing wildfire and management activity. Our results demonstrate that unsupervised clustering of bi-temporal NDVI and RGI differences can be used to detect tree mortality resulting from numerous causes and in several forest cover types.
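The bi-temporal differencing step described above is straightforward per pixel: compute NDVI and the red:green ratio (RGI) before and after mortality and take their differences. The sketch below uses toy reflectance triples and omits the masking and Gaussian mixture clustering stages of the full classification.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def rgi(red, green):
    """Red:green ratio index."""
    return red / green

def bitemporal_diff(pre, post):
    """Per-pixel (dNDVI, dRGI) features from pre- and post-mortality
    (nir, red, green) reflectance triples; a dying canopy typically loses
    NIR reflectance and shifts from green toward red."""
    feats = []
    for (n0, r0, g0), (n1, r1, g1) in zip(pre, post):
        feats.append((ndvi(n1, r1) - ndvi(n0, r0), rgi(r1, g1) - rgi(r0, g0)))
    return feats
```

Pixels with strongly negative dNDVI and positive dRGI are the candidates the clustering step would assign to the dead-tree class.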
Fault tree analysis of the causes of waterborne outbreaks.
Risebro, Helen L; Doria, Miguel F; Andersson, Yvonne; Medema, Gertjan; Osborn, Keith; Schlosser, Olivier; Hunter, Paul R
2007-01-01
Prevention and containment of outbreaks requires examination of the contribution and interrelation of outbreak causative events. An outbreak fault tree was developed and applied to 61 enteric outbreaks related to public drinking water supplies in the EU. A mean of 3.25 causative events per outbreak were identified; each event was assigned a score based on percentage contribution per outbreak. Source and treatment system causative events often occurred concurrently (in 34 outbreaks). Distribution system causative events occurred less frequently (19 outbreaks) but were often solitary events contributing heavily towards the outbreak (a mean % score of 87.42). Livestock and rainfall in the catchment with no/inadequate filtration of water sources contributed concurrently to 11 of 31 Cryptosporidium outbreaks. Of the 23 protozoan outbreaks experiencing at least one treatment causative event, 90% of these events were filtration deficiencies; by contrast, for bacterial, viral, gastroenteritis and mixed pathogen outbreaks, 75% of treatment events were disinfection deficiencies. Roughly equal numbers of groundwater and surface water outbreaks experienced at least one treatment causative event (18 and 17 outbreaks, respectively). Retrospective analysis of multiple outbreaks of enteric disease can be used to inform outbreak investigations, facilitate corrective measures, and further develop multi-barrier approaches.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
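One concrete way to carry imprecise event likelihoods through FTA/ETA gates is interval arithmetic, which corresponds to propagating one α-cut of a fuzzy number. The sketch below covers only the independent case; the paper's dependency-coefficient treatment of interdependent events is not reproduced here.

```python
def and_gate(intervals):
    """AND gate over independent events with interval probabilities
    [lo, hi]: bounds multiply."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return lo, hi

def or_gate(intervals):
    """OR gate: 1 minus the product of complements, applied to each bound
    (the expression is monotone, so bounds map to bounds)."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= 1 - a
        hi *= 1 - b
    return 1 - lo, 1 - hi
```

For example, two events with likelihoods in [0.1, 0.2] and [0.3, 0.4] give an AND-gate output of [0.03, 0.08] and an OR-gate output of [0.37, 0.52], so the output interval widths carry the input imprecision to the top event.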
Macalady, Alison K; Bugmann, Harald
2014-01-01
The processes leading to drought-associated tree mortality are poorly understood, particularly long-term predisposing factors, memory effects, and variability in mortality processes and thresholds in space and time. We use tree rings from four sites to investigate Pinus edulis mortality during two drought periods in the southwestern USA. We draw on recent sampling and archived collections to (1) analyze P. edulis growth patterns and mortality during the 1950s and 2000s droughts; (2) determine the influence of climate and competition on growth in trees that died and survived; and (3) derive regression models of growth-mortality risk and evaluate their performance across space and time. Recent growth was 53% higher in surviving vs. dying trees, with some sites exhibiting decades-long growth divergences associated with previous drought. Differential growth response to climate partly explained growth differences between live and dead trees, with responses to wet/cool conditions most influencing eventual tree status. Competition constrained tree growth, and reduced trees' ability to respond to favorable climate. The best predictors in growth-mortality models included long-term (15-30 year) average growth rate combined with a metric of growth variability and the number of abrupt growth increases over 15 and 10 years, respectively. The most parsimonious models had high discriminatory power (ROC > 0.84) and correctly classified ∼70% of trees, suggesting that aspects of tree growth, especially over decades, can be powerful predictors of widespread drought-associated die-off. However, model discrimination varied across sites and drought events. 
Weaker growth-mortality relationships and higher growth at lower survival probabilities for some sites during the 2000s event suggest a shift in mortality processes from longer-term growth-related constraints to shorter-term processes, such as rapid metabolic decline even in vigorous trees due to acute drought stress, and/or increases in the attack rate of both chronically stressed and more vigorous trees by bark beetles.
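The discriminatory power reported above (ROC > 0.84) is the area under the ROC curve, which has a simple rank-based form: the probability that a randomly chosen tree that died received a higher modeled mortality-risk score than a randomly chosen survivor. The scores in the example are invented; in the study they would come from the growth-based regression models.

```python
def roc_auc(scores_dead, scores_alive):
    """Rank-based AUC: fraction of (died, survived) pairs where the tree
    that died has the higher risk score; ties count one half."""
    wins = ties = 0
    for d in scores_dead:
        for a in scores_alive:
            if d > a:
                wins += 1
            elif d == a:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_dead) * len(scores_alive))
```

An AUC of 1.0 means perfect separation of dying from surviving trees, 0.5 means the score is uninformative.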
Limmer, Matt A; Holmes, Amanda J; Burken, Joel G
2014-09-16
Long-term monitoring (LTM) of groundwater remedial projects is costly and time-consuming, particularly when using phytoremediation, a long-term remedial approach. The use of trees as sensors of groundwater contamination (i.e., phytoscreening) has been widely described, although the use of trees to provide long-term monitoring of such plumes (phytomonitoring) has been more limited due to unexplained variability of contaminant concentrations in trees. To assess this variability, we developed an in planta sampling method to obtain high-frequency measurements of chlorinated ethenes in oak (Quercus rubra) and baldcypress (Taxodium distichum) trees growing above a contaminated plume during a 4-year trial. The data set revealed that contaminant concentrations increased rapidly with transpiration in the spring and decreased in the fall, resulting in perchloroethene (PCE) and trichloroethene (TCE) sapwood concentrations an order of magnitude higher in late summer as compared to winter. Heartwood PCE and TCE concentrations were more buffered against seasonal effects. Rainfall events caused negligible dilution of contaminant concentrations in trees after precipitation events. Modeling evapotranspiration potential from meteorological data and comparing the modeled uptake and transport with the 4 years of high frequency data provides a foundation to advance the implementation of phytomonitoring and improved understanding of plant contaminant interactions.
Drivers and mechanisms of tree mortality in moist tropical forests.
McDowell, Nate; Allen, Craig D; Anderson-Teixeira, Kristina; Brando, Paulo; Brienen, Roel; Chambers, Jeff; Christoffersen, Brad; Davies, Stuart; Doughty, Chris; Duque, Alvaro; Espirito-Santo, Fernando; Fisher, Rosie; Fontes, Clarissa G; Galbraith, David; Goodsman, Devin; Grossiord, Charlotte; Hartmann, Henrik; Holm, Jennifer; Johnson, Daniel J; Kassim, Abd Rahman; Keller, Michael; Koven, Charlie; Kueppers, Lara; Kumagai, Tomo'omi; Malhi, Yadvinder; McMahon, Sean M; Mencuccini, Maurizio; Meir, Patrick; Moorcroft, Paul; Muller-Landau, Helene C; Phillips, Oliver L; Powell, Thomas; Sierra, Carlos A; Sperry, John; Warren, Jeff; Xu, Chonggang; Xu, Xiangtao
2018-02-16
Tree mortality rates appear to be increasing in moist tropical forests (MTFs) with significant carbon cycle consequences. Here, we review the state of knowledge regarding MTF tree mortality, create a conceptual framework with testable hypotheses regarding the drivers, mechanisms and interactions that may underlie increasing MTF mortality rates, and identify the next steps for improved understanding and reduced prediction uncertainty. Increasing mortality rates are associated with rising temperature and vapor pressure deficit, liana abundance, drought, wind events, fire and, possibly, CO2 fertilization-induced increases in stand thinning or acceleration of trees reaching larger, more vulnerable heights. The majority of these mortality drivers may kill trees in part through carbon starvation and hydraulic failure. The relative importance of each driver is unknown. High species diversity may buffer MTFs against large-scale mortality events, but recent and expected trends in mortality drivers give reason for concern regarding increasing mortality within MTFs. Models of tropical tree mortality are advancing the representation of hydraulics, carbon and demography, but require more empirical knowledge regarding the most common drivers and their subsequent mechanisms. We outline critical datasets and model developments required to test hypotheses regarding the underlying causes of increasing MTF mortality rates, and improve prediction of future mortality under climate change. No claim to original US government works New Phytologist © 2018 New Phytologist Trust.
Carnicer, Jofre; Coll, Marta; Ninyerola, Miquel; Pons, Xavier; Sánchez, Gerardo; Peñuelas, Josep
2011-01-01
Climate change is progressively increasing severe drought events in the Northern Hemisphere, causing regional tree die-off events and contributing to the global reduction of the carbon sink efficiency of forests. There is a critical lack of integrated community-wide assessments of drought-induced responses in forests at the macroecological scale, including defoliation, mortality, and food web responses. Here we report a generalized increase in crown defoliation in southern European forests occurring during 1987–2007. Forest tree species have consistently and significantly altered their crown leaf structures, with increased percentages of defoliation in the drier parts of their distributions in response to increased water deficit. We assessed the demographic responses of trees associated with increased defoliation in southern European forests, specifically in the Iberian Peninsula region. We found that defoliation trends are paralleled by significant increases in tree mortality rates in drier areas that are related to tree density and temperature effects. Furthermore, we show that severe drought impacts are associated with sudden changes in insect and fungal defoliation dynamics, creating long-term disruptive effects of drought on food webs. Our results reveal a complex geographical mosaic of species-specific responses to climate change–driven drought pressures on the Iberian Peninsula, with an overwhelmingly predominant trend toward increased drought damage. PMID:21220333
Panyushkina, Irina P.; Leavitt, Steven W.; Thompson, Todd A.; Schneider, Allan F.; Lange, Todd
2008-01-01
Until now, availability of wood from the Younger Dryas abrupt cooling event (YDE) in N. America ca. 12.9 to 11.6 ka has been insufficient to develop high-resolution chronologies for refining our understanding of YDE conditions. Here we present a multi-proxy tree-ring chronology (ring widths, “events” evidenced by microanatomy and macro features, stable isotopes) from a buried black spruce forest in the Great Lakes area (Liverpool East site), spanning 116 yr at ca. 12,000 cal yr BP. During this largely cold and wet period, the proxies convey a coherent and precise forest history including frost events, tilting, drowning and burial in estuarine sands as the Laurentide Ice Sheet deteriorated. In the middle of the period, a short mild interval appears to have launched the final and largest episode of tree recruitment. Ultimately the tops of the trees were sheared off after death, perhaps by wind-driven ice floes, culminating an interval of rising water and sediment deposition around the base of the trees. Although relative influences of the continental ice sheet and local effects from ancestral Lake Michigan are indeterminate, the tree-ring proxies provide important insight into environment and ecology of a N. American YDE boreal forest stand.
Bi, Sai; Chen, Min; Jia, Xiaoqiang; Dong, Ying; Wang, Zonghua
2015-07-06
A hyper-branched hybridization chain reaction (HB-HCR) is presented herein, which consists of only six species that can metastably coexist until the introduction of an initiator DNA to trigger a cascade of hybridization events, leading to the self-sustained assembly of hyper-branched and nicked double-stranded DNA structures. The system can readily achieve ultrasensitive detection of target DNA. Moreover, the HB-HCR principle is successfully applied to construct three-input concatenated logic circuits with excellent specificity and extended to design a security-mimicking keypad lock system. Significantly, the HB-HCR-based keypad lock can alarm immediately if the "password" is incorrect. Overall, the proposed HB-HCR with high amplification efficiency is simple, homogeneous, fast, robust, and low-cost, and holds great promise in the development of biosensing, in the programmable assembly of DNA architectures, and in molecular logic operations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Risk Assessment in Underground Coalmines Using Fuzzy Logic in the Presence of Uncertainty
NASA Astrophysics Data System (ADS)
Tripathy, Debi Prasad; Ala, Charan Kumar
2018-04-01
Fatal accidents occur every year in the Indian coal mining industry. To improve safety conditions, it has become essential to perform risk assessments of the various operations in mines. However, because accident data are uncertain, it is hard to conduct a risk assessment in mines. The objective of this study is to present a method to assess safety risks in underground coalmines. The assessment of safety risks is based on the fuzzy reasoning approach: a Mamdani fuzzy logic model is developed in the fuzzy logic toolbox of MATLAB. A case study is used to demonstrate the applicability of the developed model. The risk evaluation for the case study mine indicated that mine fire has the highest risk level among all the hazard factors. This study could help mine management prepare safety measures based on the risk rankings obtained.
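The Mamdani inference described above can be sketched in a few lines of Python; the membership functions, the two inputs (likelihood and severity), and the rule base below are illustrative assumptions, not the model from the paper (which was built in MATLAB's fuzzy logic toolbox):

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_level(likelihood, severity, steps=101):
    """Mamdani min/max inference with centroid defuzzification on [0, 10]."""
    # Fuzzify the crisp inputs (hypothetical term definitions on a 0-10 scale)
    like = {"low": tri(likelihood, -1, 0, 5), "high": tri(likelihood, 5, 10, 11)}
    sev = {"low": tri(severity, -1, 0, 5), "high": tri(severity, 5, 10, 11)}
    # Rule base: firing strength is the min of the antecedent degrees
    rules = [
        (min(like["low"], sev["low"]), "low"),
        (min(like["low"], sev["high"]), "medium"),
        (min(like["high"], sev["low"]), "medium"),
        (min(like["high"], sev["high"]), "high"),
    ]
    out = {"low": (-5, 0, 5), "medium": (0, 5, 10), "high": (5, 10, 15)}
    xs = [10 * i / (steps - 1) for i in range(steps)]
    # Aggregate: max over rules of the clipped output membership
    mu = [max(min(s, tri(x, *out[t])) for s, t in rules) for x in xs]
    total = sum(mu)
    # Centroid defuzzification; fall back to the scale midpoint if no rule fires
    return sum(x * m for x, m in zip(xs, mu)) / total if total else 5.0
```

High likelihood and severity should defuzzify to a higher crisp risk than low values of both, which is the essential behavior a Mamdani risk model encodes.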
Assessment of Seismic Damage on The Exist Buildings Using Fuzzy Logic
NASA Astrophysics Data System (ADS)
Pınar, USTA; Nihat, MOROVA; EVCİ, Ahmet; ERGÜN, Serap
2018-01-01
Earthquakes, as natural disasters, can damage the lives of many people and buildings all over the world. The seismic vulnerability of buildings therefore needs to be evaluated. Accurate evaluation of the damage sustained by buildings during natural disaster events is critical to determine the buildings' safety and their suitability for future occupancy. The earthquake is one of the disasters that structures face the most; there is thus a need to evaluate the seismic damage and vulnerability of buildings in order to protect them. Fuzzy systems are now widely used in different fields of science because of their simplicity and efficiency, and fuzzy logic provides a suitable framework for reasoning, deduction, and decision making under fuzzy conditions. In this paper, studies in the literature on the earthquake hazard evaluation of buildings using fuzzy logic modeling concepts are investigated and evaluated as a whole.
Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
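The gate arithmetic such a fuzzy fault tree relies on can be sketched as follows; the event names and trapezoidal numbers are invented for illustration, and the intuitionistic (non-membership) component and expert weighting used in the paper are omitted for brevity:

```python
# Basic-event probabilities as trapezoidal fuzzy numbers (a, b, c, d),
# combined componentwise (a standard approximation for fuzzy arithmetic).

def fuzzy_and(p, q):
    """AND gate: both basic events needed; approximate product P(A)P(B)."""
    return tuple(x * y for x, y in zip(p, q))

def fuzzy_or(p, q):
    """OR gate: either basic event suffices; 1 - (1 - x)(1 - y), componentwise."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(p, q))

def defuzzify(p):
    """Crisp estimate via a simple average of the four trapezoid parameters."""
    return sum(p) / 4

# Hypothetical basic events for a coal dust explosion top event
dust_cloud = (0.10, 0.15, 0.20, 0.25)
ignition = (0.02, 0.04, 0.06, 0.08)
explosion = fuzzy_and(dust_cloud, ignition)  # top event requires both
```

Both combinators are monotone increasing in each argument, so applying them componentwise preserves the ordering a <= b <= c <= d of a valid trapezoidal number.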
Drought timing influences the legacy of tree growth recovery.
Huang, Mengtian; Wang, Xuhui; Keenan, Trevor F; Piao, Shilong
2018-05-04
Whether and how the timing of extreme events affects the direction and magnitude of legacy effects on tree growth is poorly understood. In this study, we use a global database of Ring-Width Index (RWI) from 2,500 sites to examine the impact and legacy effects (the departure of observed RWI from expected RWI) of extreme drought events during 1948-2008, with a particular focus on the influence of drought timing. We assessed the recovery of stem radial growth in the years following severe drought events with separate groupings designed to characterize the timing of the drought. We found that legacies from extreme droughts during the dry season (DS droughts) lasted longer and had larger impacts in each of the 3 years post drought than those from extreme droughts during the wet season (WS droughts). At the global scale, the average integrated legacy from DS droughts (0.18) was about nine times that from WS droughts (0.02). Site-level comparisons also suggest stronger negative impacts or weaker positive impacts of DS droughts on tree growth than WS droughts. Our results, therefore, highlight that the timing of drought is a crucial factor determining drought impacts on tree recovery. Further increases in baseline aridity could therefore exacerbate the impact of punctuated droughts on terrestrial ecosystems. © 2018 John Wiley & Sons Ltd.
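The legacy-effect definition quoted above (the departure of observed RWI from expected RWI) amounts to a per-year subtraction over the post-drought window; the numbers and the sign convention for the integrated legacy below are illustrative assumptions, not values from the study:

```python
# Hypothetical ring-width indices for the 3 years after a drought
expected_rwi = [1.00, 1.00, 1.00]   # counterfactual growth without drought
observed_rwi = [0.85, 0.92, 0.97]   # observed, recovering growth

# Legacy effect per year: observed minus expected RWI
legacy = [o - e for o, e in zip(observed_rwi, expected_rwi)]

# One possible "integrated legacy": total growth suppression, made
# positive when growth is below expectation (sign convention assumed)
integrated_legacy = -sum(legacy)
```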
Plaut, Jennifer A; Wadsworth, W Duncan; Pangle, Robert; Yepez, Enrico A; McDowell, Nate G; Pockman, William T
2013-10-01
Global climate change is predicted to alter the intensity and duration of droughts, but the effects of changing precipitation patterns on vegetation mortality are difficult to predict. Our objective was to determine whether prolonged drought or above-average precipitation altered the capacity to respond to the individual precipitation pulses that drive productivity and survival. We analyzed 5 yr of data from a rainfall manipulation experiment in piñon-juniper (Pinus edulis-Juniperus monosperma) woodland using mixed effects models of transpiration response to event size, antecedent soil moisture, and post-event vapor pressure deficit. Replicated treatments included irrigation, drought, ambient control and infrastructure control. Mortality was highest under drought, and the reduced post-pulse transpiration in the droughted trees that died was attributable to treatment effects beyond drier antecedent conditions and reduced event size. In particular, trees that died were nearly unresponsive to antecedent shallow soil moisture, suggesting reduced shallow absorbing root area. Irrigated trees showed an enhanced response to precipitation pulses. Prolonged drought initiates a downward spiral whereby trees are increasingly unable to utilize pulsed soil moisture. Thus, the additive effects of future, more frequent droughts may increase drought-related mortality. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.
Minicomputer Hardware Monitor Design.
1980-06-01
detected signals. Both the COMTEN and TESDATA systems rely on a "plugboard" arrangement where sensor inputs may be combined by means of standard gate logic...systems. A further use of the plugboard "patch panels" is to direct the measured "event" to collection and/or distribution circuitry, where the event...are plugboard and sensor hookup configurations. The available T-PACs are: Basic System Profile, Regional Mapping, and Advanced System Management.
36 CFR 292.46 - Timber harvesting activities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... hazard trees; or to respond to natural events such as wildfire, flood, earthquake, volcanic eruption, high winds, and disease or insect infestation. (2) Where authorized, trees may be harvested by... landscape to the extent practicable. (b) Wild and Scenic Rivers. The following standards and guidelines...
36 CFR 292.46 - Timber harvesting activities.
Code of Federal Regulations, 2011 CFR
2011-07-01
... hazard trees; or to respond to natural events such as wildfire, flood, earthquake, volcanic eruption, high winds, and disease or insect infestation. (2) Where authorized, trees may be harvested by... landscape to the extent practicable. (b) Wild and Scenic Rivers. The following standards and guidelines...
36 CFR 292.46 - Timber harvesting activities.
Code of Federal Regulations, 2013 CFR
2013-07-01
... hazard trees; or to respond to natural events such as wildfire, flood, earthquake, volcanic eruption, high winds, and disease or insect infestation. (2) Where authorized, trees may be harvested by... landscape to the extent practicable. (b) Wild and Scenic Rivers. The following standards and guidelines...
36 CFR 292.46 - Timber harvesting activities.
Code of Federal Regulations, 2012 CFR
2012-07-01
... hazard trees; or to respond to natural events such as wildfire, flood, earthquake, volcanic eruption, high winds, and disease or insect infestation. (2) Where authorized, trees may be harvested by... landscape to the extent practicable. (b) Wild and Scenic Rivers. The following standards and guidelines...
[Fuzzy logic in urology. How to reason in inaccurate terms].
Vírseda Chamorro, Miguel; Salinas Casado, Jesus; Vázquez Alba, David
2004-05-01
Western thinking is basically binary, based on opposites, and classical logic is a systematization of this thinking. The methods of the pure sciences, such as physics, rest on systematic measurement, analysis, and synthesis; in this way, nature is described by deterministic differential equations. Medical knowledge does not fit the deterministic equations of physics well, so probabilistic methods are employed instead. However, this approach is not free of problems, both theoretical and practical; often it is not even possible to know with certainty the probabilities of most events. On the other hand, the application of binary logic to medicine in general, and to urology in particular, runs into serious difficulties, such as the imprecise definitions of most diseases and the uncertainty associated with most medical acts. As a result, many medical recommendations are written in literary language that is inaccurate, inconsistent, and incoherent. Fuzzy logic is a way of reasoning coherently with inaccurate concepts. Proposed by Lotfi Zadeh in 1965, it rests on two principles: the theory of fuzzy sets and the use of fuzzy rules. A fuzzy set is one whose elements have a degree of membership between 0 and 1, and each fuzzy set is associated with an imprecise property or linguistic variable. Fuzzy rules apply the principles of classical logic to fuzzy sets, taking the degree of membership of each element in the reference fuzzy set as its truth value. Fuzzy logic makes it possible to formulate coherent urologic recommendations (e.g., in which patients is PSA testing indicated? what should be done when PSA is elevated?) and to perform diagnoses adapted to the uncertainty of diagnostic tests (e.g., data obtained from pressure-flow studies in females).
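The two principles named above can be made concrete in a few lines; the PSA cutoffs below are illustrative only, not clinical guidance:

```python
def elevated_psa(value):
    """Degree of membership in the fuzzy set 'elevated PSA' (ng/mL), 0 to 1.

    Cutoffs are hypothetical: fully non-elevated at <= 2, fully elevated
    at >= 10, with a linear ramp in between.
    """
    if value <= 2.0:
        return 0.0
    if value >= 10.0:
        return 1.0
    return (value - 2.0) / 8.0

# A fuzzy rule reuses the membership degree as its truth value, e.g.
# "IF PSA is elevated THEN further workup is indicated" fires at degree
# elevated_psa(4.0) = 0.25 rather than as a crisp yes/no at one cutoff.
```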
NASA Astrophysics Data System (ADS)
Reinemann, Deborah Jean
2000-10-01
Measures of time are essential to human life, especially in the Western world. Human understanding of time develops from the preschool use of "before" and "after" to an adult understanding and appreciation of time. Previous researchers (for example, Piaget and Friedman) have investigated and described stages of time development. Time, as investigated here, can be classified as conventional, logical, or experiential. Conventional time is the ordered representation of time: the days of the week, the months of the year, or clock time in seconds and hours. Logical time is the deduction of duration based on regular events, for example, calculating the passage of time between two separate events. Experiential time involves the duration of events and the estimation of intervals. With the recent publication of the National Science Education Standards (NSES), many schools are aligning their science curricula with the NSES. Time appears both implicitly and explicitly in the NSES. Do middle school students possess the understanding of time necessary to meet the recommendations of the NSES? An interview protocol of four sessions was developed to investigate middle school students' understanding of time. The four sessions included: building and testing water clocks; an interview about water clocks and time intervals; a laserdisc presentation about relative time spans; and a mind-mapping session. Students were also given the GALT Test of Logical Thinking. Eleven eighth-grade students and thirteen sixth-grade students were interviewed. The data were transcribed and coded, and a rubric was developed to evaluate students based on their responses to the four sessions. The Time Analysis Rubric is a grid of the types of time (conventional, logical, and experiential) versus the degree of understanding of time. Student results were assigned to levels of understanding based on the Time Analysis Rubric.
There was a relationship (although not significant) between the students' GALT score and the Time Analysis Rubric results. There was no difference in Time Analysis levels between sixth and eighth grade students. On the basis of this study, Middle School students' level of understanding of time appears to be sufficient to master the requirements of the NSES.
NASA Technical Reports Server (NTRS)
Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.
2010-01-01
New foundational ideas are used to define a novel approach to generic visual pattern recognition. These ideas proceed from the starting point of the intrinsic equivalence of noise reduction and pattern recognition when noise reduction is taken to its theoretical limit of explicit matched filtering. This led us to think of the logical extension of sparse coding using basis function transforms for both de-noising and pattern recognition to the full pattern specificity of a lexicon of matched filter pattern templates. A key hypothesis is that such a lexicon can be constructed and is, in fact, a generic visual alphabet of spatial vision. Hence it provides a tractable solution for the design of a generic pattern recognition engine. Here we present the key scientific ideas, the basic design principles which emerge from these ideas, and a preliminary design of the Spatial Vision Tree (SVT). The latter is based upon a cryptographic approach whereby we measure a large aggregate estimate of the frequency of occurrence (FOO) for each pattern. These distributions are employed together with Hamming distance criteria to design a two-tier tree. Then using information theory, these same FOO distributions are used to define a precise method for pattern representation. Finally the experimental performance of the preliminary SVT on computer generated test images and complex natural images is assessed.
Scalable Cloning on Large-Scale GPU Platforms with Application to Time-Stepped Simulations on Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoginath, Srikanth B.; Perumalla, Kalyan S.
Cloning is a technique to efficiently simulate a tree of multiple what-if scenarios that are unraveled during the course of a base simulation. However, cloned execution is highly challenging to realize on large, distributed memory computing platforms, due to the dynamic nature of the computational load across clones, and due to the complex dependencies spanning the clone tree. In this paper, we present the conceptual simulation framework, algorithmic foundations, and runtime interface of CloneX, a new system we designed for scalable simulation cloning. It efficiently and dynamically creates whole logical copies of a dynamic tree of simulations across a large parallel system without full physical duplication of computation and memory. The performance of a prototype implementation executed on up to 1,024 graphical processing units of a supercomputing system has been evaluated with three benchmarks—heat diffusion, forest fire, and disease propagation models—delivering a speed up of over two orders of magnitude compared to replicated runs. Finally, the results demonstrate a significantly faster and scalable way to execute many what-if scenario ensembles of large simulations via cloning using the CloneX interface.
The XSD-Builder Specification Language—Toward a Semantic View of XML Schema Definition
NASA Astrophysics Data System (ADS)
Fong, Joseph; Cheung, San Kuen
In the present database market, the XML database model is a main structure for forthcoming database systems in the Internet environment. As a conceptual schema of an XML database, the XML model has limitations in presenting its data semantics, and system analysts have no toolset for modeling and analyzing an XML system. We apply the XML Tree Model (shown in Figure 2) as a conceptual schema of an XML database to model and analyze the structure of an XML database. It is important not only for visualizing, specifying, and documenting structural models, but also for constructing executable systems. The tree model represents the inter-relationships among elements inside different logical schemas such as XML Schema Definition (XSD), DTD, Schematron, XDR, SOX, and DSD (shown in Figure 1; an explanation of the terms in the figure is given in Table 1). The XSD-Builder consists of the XML Tree Model, a source language, a translator, and XSD. The source language, called XSD-Source, is mainly intended to provide a user-friendly environment for writing an XSD. The source language is then translated by the XSD-Translator, whose output is an XSD, our target, called the object language.
A practical guidance for Cramer class determination.
Roberts, David W; Aptula, Aynur; Schultz, Terry W; Shen, Jie; Api, Anne Marie; Bhatia, Sneha; Kromidas, Lambros
2015-12-01
Expanded use of the Threshold of Toxicological Concern (TTC) methodology has brought into discussion the intent of the original questions used in the Cramer scheme, or Cramer decision tree. We have analysed, both manually and with the Toxtree software, a large dataset of fragrance ingredients and identified several issues with the original Cramer questions. Some relate to the definitions and wording of the questions; others relate to in silico interpretation of the questions. We have endeavoured to address all of these inconsistencies and misinterpretations without changing the basic structure and principles of the original decision tree. Based on the analysis of a large data set of over 2500 fragrance ingredients, we found that most of the 33 questions in the original Cramer scheme are straightforward. Through repeated examination of each of the 33 questions, we found 14 where the logic underlying the development of the rule is unclear. These questions are well served by minor wording changes and/or further explanation designed to capture what we perceive to be the intent of the original decision tree. The findings reported here could be used as guidance for conducting Cramer classification and provide advice for the improvement of in silico tools. Copyright © 2015 Elsevier Inc. All rights reserved.
Peripheral Exophytic Oral Lesions: A Clinical Decision Tree
Safi, Yaser; Jafari, Soudeh
2017-01-01
Diagnosis of peripheral oral exophytic lesions might be quite challenging. This review article aimed to introduce a decision tree for oral exophytic lesions according to their clinical features. General search engines and specialized databases including PubMed, PubMed Central, Medline Plus, EBSCO, Science Direct, Scopus, Embase, and authenticated textbooks were used to find relevant topics by means of keywords such as “oral soft tissue lesion,” “oral tumor like lesion,” “oral mucosal enlargement,” and “oral exophytic lesion.” Related English-language articles published from 1988 to 2016 in both medical and dental journals were appraised. Upon compilation of data, peripheral oral exophytic lesions were categorized into two major groups according to their surface texture: smooth (mesenchymal or nonsquamous epithelium-originated) and rough (squamous epithelium-originated). Lesions with smooth surface were also categorized into three subgroups according to their general frequency: reactive hyperplastic lesions/inflammatory hyperplasia, salivary gland lesions (nonneoplastic and neoplastic), and mesenchymal lesions (benign and malignant neoplasms). In addition, lesions with rough surface were summarized in six more common lesions. In total, 29 entities were organized in the form of a decision tree in order to help clinicians establish a logical diagnosis by a stepwise progression method. PMID:28757870
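The stepwise progression described in the abstract can be encoded as a small decision tree; the top-level categories follow the abstract, while the dictionary encoding and the question labels are our own simplification:

```python
# Minimal decision-tree sketch: branch first on surface texture, then
# (for smooth lesions) on subgroup. Leaf labels follow the abstract's
# categories; the structure is an illustrative encoding, not the full
# 29-entity tree from the review.

DECISION_TREE = {
    "smooth": {
        "question": "lesion subgroup",
        "branches": {
            "reactive": "reactive hyperplastic lesion / inflammatory hyperplasia",
            "salivary": "salivary gland lesion",
            "mesenchymal": "mesenchymal neoplasm",
        },
    },
    "rough": "squamous epithelium-originated lesion",
}

def classify(surface, subgroup=None):
    """Walk the tree: a string node is a leaf diagnosis category."""
    node = DECISION_TREE[surface]
    if isinstance(node, str):
        return node
    return node["branches"][subgroup]
```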
A fuzzy decision tree for fault classification.
Zio, Enrico; Baraldi, Piero; Popescu, Irina C
2008-02-01
In plant accident management, the control room operators are required to identify the causes of the accident, based on the different patterns of evolution of the monitored process variables thereby developing. This task is often quite challenging, given the large number of process parameters monitored and the intense emotional states under which it is performed. To aid the operators, various techniques of fault classification have been engineered. An important requirement for their practical application is the physical interpretability of the relationships among the process variables underpinning the fault classification. In this view, the present work propounds a fuzzy approach to fault classification, which relies on fuzzy if-then rules inferred from the clustering of available preclassified signal data, which are then organized in a logical and transparent decision tree structure. The advantages offered by the proposed approach are precisely that a transparent fault classification model is mined out of the signal data and that the underlying physical relationships among the process variables are easily interpretable as linguistic if-then rules that can be explicitly visualized in the decision tree structure. The approach is applied to a case study regarding the classification of simulated faults in the feedwater system of a boiling water reactor.
Scalable Cloning on Large-Scale GPU Platforms with Application to Time-Stepped Simulations on Grids
Yoginath, Srikanth B.; Perumalla, Kalyan S.
2018-01-31
Cloning is a technique to efficiently simulate a tree of multiple what-if scenarios that are unraveled during the course of a base simulation. However, cloned execution is highly challenging to realize on large, distributed memory computing platforms, due to the dynamic nature of the computational load across clones, and due to the complex dependencies spanning the clone tree. In this paper, we present the conceptual simulation framework, algorithmic foundations, and runtime interface of CloneX, a new system we designed for scalable simulation cloning. It efficiently and dynamically creates whole logical copies of a dynamic tree of simulations across a large parallel system without full physical duplication of computation and memory. The performance of a prototype implementation executed on up to 1,024 graphical processing units of a supercomputing system has been evaluated with three benchmarks—heat diffusion, forest fire, and disease propagation models—delivering a speed up of over two orders of magnitude compared to replicated runs. Finally, the results demonstrate a significantly faster and scalable way to execute many what-if scenario ensembles of large simulations via cloning using the CloneX interface.
Policy tree optimization for adaptive management of water resources systems
NASA Astrophysics Data System (ADS)
Herman, Jonathan; Giuliani, Matteo
2017-04-01
Water resources systems must cope with irreducible uncertainty in supply and demand, requiring policy alternatives capable of adapting to a range of possible future scenarios. Recent studies have developed adaptive policies based on "signposts" or "tipping points" that signal the need to update the policy. However, there remains a need for a general method to optimize the choice of the signposts to be used and their threshold values. This work contributes a general framework and computational algorithm to design adaptive policies as a tree structure (i.e., a hierarchical set of logical rules) using a simulation-optimization approach based on genetic programming. Given a set of feature variables (e.g., reservoir level, inflow observations, inflow forecasts), the resulting policy defines both the optimal reservoir operations and the conditions under which those operations should be triggered. We demonstrate the approach using Folsom Reservoir (California) as a case study, in which operating policies must balance the risks of both floods and droughts. Numerical results show that the tree-based policies outperform those designed via dynamic programming. In addition, they display good adaptive capacity to a changing climate, successfully adapting the reservoir operations across a large set of uncertain climate scenarios.
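A tree-structured policy of the kind described is, concretely, a nest of threshold rules over feature variables; the features, thresholds, and actions below are invented for illustration and are not the optimized Folsom policy:

```python
# Sketch of one candidate policy tree. Genetic programming, as in the
# abstract, would search over which features appear at each node and
# over the threshold values, scoring each candidate tree in simulation.

def release_policy(storage_frac, inflow_forecast):
    """Return a release action from nested signpost rules.

    storage_frac: reservoir storage as a fraction of capacity (0-1).
    inflow_forecast: forecast inflow relative to the long-term mean.
    """
    if storage_frac > 0.85:  # flood-control branch
        return "high_release" if inflow_forecast > 1.2 else "moderate_release"
    if storage_frac < 0.30:  # drought branch: hedge supply
        return "hedged_release"
    return "normal_release"
```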
The chordate proteome history database.
Levasseur, Anthony; Paganini, Julien; Dainat, Jacques; Thompson, Julie D; Poch, Olivier; Pontarotti, Pierre; Gouret, Philippe
2012-01-01
The chordate proteome history database (http://ioda.univ-provence.fr) comprises some 20,000 evolutionary analyses of proteins from chordate species. Our main objective was to characterize and study the evolutionary histories of the chordate proteome, and in particular to detect genomic events and to perform automatic functional searches. Firstly, phylogenetic analyses based on high quality multiple sequence alignments and a robust phylogenetic pipeline were performed for the whole protein and for each individual domain. Novel approaches were developed to identify orthologs/paralogs, and predict gene duplication/gain/loss events and the occurrence of new protein architectures (domain gains, losses and shuffling). These important genetic events were localized on the phylogenetic trees and on the genomic sequence. Secondly, the phylogenetic trees were enhanced by the creation of phylogroups, whereby groups of orthologous sequences created using OrthoMCL were corrected based on the phylogenetic trees; gene family size and gene gain/loss in a given lineage could be deduced from the phylogroups. For each ortholog group obtained from the phylogenetic or the phylogroup analysis, functional information and expression data can be retrieved. Database searches can be performed easily using biological objects: protein identifier, keyword or domain, but can also be based on events, e.g., domain exchange events can be retrieved. To our knowledge, this is the first database that links group clustering, phylogeny and automatic functional searches along with the detection of important events occurring during genome evolution, such as the appearance of a new domain architecture.
Biogeochemical hotspots following a simulated tree mortality event of southern pine beetle
NASA Astrophysics Data System (ADS)
Siegert, C. M.; Renninger, H. J.; Karunarathna, S.; Hornslein, N.; Riggins, J. J.; Clay, N. A.; Tang, J. D.; Chaney, B.; Drotar, N.
2017-12-01
Disturbances in forest ecosystems can alter functions like productivity, respiration, and nutrient cycling through the creation of biogeochemical hotspots. These events occur sporadically across the landscape, leading to uncertainty in terrestrial biosphere carbon models, which have yet to capture the full complexity of biotic and abiotic factors driving ecological processes in the terrestrial environment. Given the widespread impact of southern pine beetle on forest ecosystems throughout the southeastern United States, it is critical to management and planning activities to understand the role of these disturbances. As such, we hypothesize that bark beetle-killed trees create biogeochemical hotspots in the soils surrounding their trunk as they undergo mortality due to (1) increased soil moisture from reductions in plant water uptake and increased stemflow production, (2) enhanced canopy-derived inputs of carbon and nitrogen, and (3) increased microbial activity and root mortality. In 2015, a field experiment to mimic a southern pine beetle attack was established by girdling loblolly pine trees. Subsequent measurements of throughfall and stemflow for water quantity and quality, transpiration, stem respiration, soil respiration, and soil chemistry were used to quantify the extent of spatial and temporal impacts of tree mortality on carbon budgets. Compared to control trees, girdled trees exhibited reduced water uptake within the first 6 months of the study and succumbed to mortality within 18 months. Over two years, the girdled trees generated 33% more stemflow than control trees (7836 vs. 5882 L m-2). Preliminary analysis of carbon and nitrogen concentrations and dissolved organic matter quality is still pending.
In the surrounding soils, C:N ratios were greater under control trees (12.8) than under girdled trees (12.1), which was driven by an increase in carbon around control trees (+0.13 mg C mg-1 soil) rather than a decrease around girdled trees (-0.01 mg C mg-1 soil), with no observed differences in N concentrations. Although data from the remainder of the 2017 growing season are still pending, we have thus far demonstrated how tree mortality from southern pine beetle changes single-tree hydrologic and biogeochemical cycles.
Mousavi, S. Mostafa; Beroza, Gregory C.; Hoover, Susan M.
2018-01-01
Probabilistic seismic hazard analysis (PSHA) characterizes ground-motion hazard from earthquakes. Typically, the time horizon of a PSHA forecast is long, but in response to induced seismicity related to hydrocarbon development, the USGS developed one-year PSHA models. In this paper, we present a display of the variability in USGS hazard curves due to epistemic uncertainty in its informed submodel using a simple bootstrapping approach. We find that variability is highest in low-seismicity areas. On the other hand, areas of high seismic hazard, such as the New Madrid seismic zone or Oklahoma, exhibit relatively lower variability simply because of more available data and a better understanding of the seismicity. Comparing areas of high hazard, New Madrid, which has a history of large naturally occurring earthquakes, has lower forecast variability than Oklahoma, where the hazard is driven mainly by suspected induced earthquakes since 2009. Overall, the mean hazard obtained from bootstrapping is close to the published model, and variability increased in the 2017 one-year model relative to the 2016 model. Comparing the relative variations caused by individual logic-tree branches, we find that the highest hazard variation (as measured by the 95% confidence interval of bootstrapping samples) in the final model is associated with different ground-motion models and maximum magnitudes used in the logic tree, while the variability due to the smoothing distance is minimal. It should be pointed out that this study is not looking at the uncertainty in the hazard in general, but only as it is represented in the USGS one-year models.
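The "simple bootstrapping approach" mentioned above resamples with replacement and reads an interval off the sorted resampled statistics; this sketch bootstraps a mean over synthetic values, not actual USGS hazard-curve data:

```python
import random

def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap interval (default 95%) for the mean of `data`.

    Each of n_boot replicates resamples len(data) values with replacement;
    the interval endpoints are percentiles of the sorted replicate means.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    n = len(data)
    means = sorted(
        sum(rng.choice(data) for _ in range(n)) / n for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

With more data the replicate means cluster tightly and the interval narrows, which mirrors the abstract's observation that well-recorded areas such as New Madrid show lower forecast variability than data-poor ones.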
Selmeryd, Jonas; Henriksen, Egil; Leppert, Jerzy; Hedberg, Pär
2016-01-01
Aims The aim of this article is to examine how the European Association of Cardiovascular Imaging (EACVI) and the American Society of Echocardiography (ASE) recommendations on the classification of diastolic dysfunction (DDF) are interpreted in the scientific community and to explore how variations in the DDF definition affect the reported prevalence. Methods and results A systematic review of studies citing the EACVI/ASE consensus document ‘Recommendations for the evaluation of left ventricular diastolic function by echocardiography’ was performed. The definition of DDF used in each study was recorded. Subsequently, several possible interpretations of the EACVI/ASE classification scheme were used to obtain DDF prevalence in a community-based sample (n = 714). In the systematic review, 60 studies were included. In 13 studies, no specification of DDF definition was presented, a one-level classification tree was used in 13, a two-level classification tree in 18, and in the remaining 16 studies, a DDF definition was presented but no grading of DDF was performed. In 17 studies, the DDF definition relied solely on early diastolic tissue velocity and/or left atrial size. In eight of these studies, a single parameter was used, in two studies the logical operator AND was used to combine two or more parameters, and the remaining seven studies used the logical operator OR. The resulting prevalence of DDF in the community-based sample varied from 12 to 84%, depending on the DDF definition used. Conclusion A substantial heterogeneity of definitions of DDF was evident among the studies reviewed, and the different definitions had a substantial impact on the reported prevalence of DDF. PMID:26374880
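The sensitivity of reported prevalence to the logical operator used to combine criteria can be illustrated with a toy sketch; the cohort values and cut-offs below are hypothetical, not those of the reviewed studies or the EACVI/ASE recommendations:

```python
def prevalence(subjects, criteria, combine="or"):
    """Fraction of subjects classified as having diastolic dysfunction.

    Each criterion is a predicate on a subject dict; `combine` chooses
    whether any single abnormal finding suffices ("or") or all are
    required ("and")."""
    op = any if combine == "or" else all
    flagged = sum(1 for s in subjects if op(c(s) for c in criteria))
    return flagged / len(subjects)

# Invented echo measurements: e' tissue velocity (cm/s) and indexed LA volume (mL/m2).
cohort = [
    {"e_prime": 9.5, "lavi": 30}, {"e_prime": 7.2, "lavi": 36},
    {"e_prime": 6.8, "lavi": 28}, {"e_prime": 10.1, "lavi": 40},
    {"e_prime": 7.9, "lavi": 25}, {"e_prime": 11.0, "lavi": 22},
]
low_e = lambda s: s["e_prime"] < 8    # reduced early diastolic tissue velocity
big_la = lambda s: s["lavi"] >= 34    # enlarged left atrium

print(prevalence(cohort, [low_e, big_la], combine="or"))   # any abnormal finding
print(prevalence(cohort, [low_e, big_la], combine="and"))  # both required
```

Even in this six-subject toy cohort the OR rule flags four times as many subjects as the AND rule, mirroring the 12-84% prevalence range reported.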
ERIC Educational Resources Information Center
Smith, Rebekah E.; Bayen, Ute J.
2006-01-01
Event-based prospective memory involves remembering to perform an action in response to a particular future event. Normal younger and older adults performed event-based prospective memory tasks in 2 experiments. The authors applied a formal multinomial processing tree model of prospective memory (Smith & Bayen, 2004) to disentangle age differences…
Improving the water use efficiency of olive trees growing in water harvesting systems
NASA Astrophysics Data System (ADS)
Berliner, Pedro; Leake, Salomon; Carmi, Gennady; Agam, Nurit
2017-04-01
Water is a primary limiting factor for agricultural development in many arid and semi-arid regions in which runoff generation is a rather frequent event. If conveyed to dyke-surrounded plots and ponded, runoff water can thereafter be used for tree production. One of the most promising runoff collection configurations is that of micro-catchments, in which water is collected close to the area in which runoff was generated and stored in adjacent shallow pits. The objective of this work was to assess the effect of the geometry of the runoff water collection area (shallow pit or trench) on direct evaporative water losses and on the water use efficiency of olive trees grown in them. The study was conducted during the summers of 2013 and 2014. Regular micro-catchments with basins of 9 m2 (3 x 3 m) and 0.1 m deep were compared with trenches one meter deep and one meter wide. Each configuration was replicated three times. One tree was planted in each shallow basin, and the distance between trees in the 12 m long trench was four meters. Access tubes for neutron probes were installed in the micro-catchments and trenches (four and seven, respectively) to depths of 2.5 m. Soil water content in the soil profile was monitored periodically throughout drying periods between simulated runoff events. Transpiration of the trees was estimated from half-hourly sap flow measurements using a Granier system. Total transpiration fluxes were computed for time intervals corresponding to consecutive soil water measurements. During the first year, a large runoff event was simulated by applying four cubic meters once to each plot; in the second year, the same volume of water was split into four applications, simulating a series of small runoff events. In both geometries, trees received the same amount of water per tree.
Evaporation from trenches and micro-catchments was estimated as the difference between evapotranspiration (obtained by computing the change in total soil water content between two consecutive measurements) and transpiration for the same interval (estimated from sap flow measurements). In both years the evaporation from micro-catchments was significantly larger than that from trenches. In the second year, for example, the fraction of the total applied water lost to evaporation was 53% for micro-catchments and 22% for trenches. This indicates that the trench geometry reduces the amount of water lost to direct evaporation from the soil and is thus more efficient in utilizing harvested runoff water.
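The water-balance arithmetic described above, evaporation as the residual of storage change minus transpiration, can be sketched as follows; the numbers are invented for illustration, not measurements from the study:

```python
def evaporation_mm(swc_start_mm, swc_end_mm, transpiration_mm):
    """Direct soil evaporation over an interval from a simple water balance.

    Evapotranspiration is taken as the drop in total profile soil water
    storage between two consecutive neutron-probe readings (drainage and
    runoff neglected); subtracting sap-flow transpiration leaves evaporation.
    """
    evapotranspiration = swc_start_mm - swc_end_mm
    return evapotranspiration - transpiration_mm

# Invented interval: storage falls from 420 to 395 mm while the tree
# transpires 15 mm, leaving 10 mm lost to direct soil evaporation.
print(evaporation_mm(420.0, 395.0, 15.0))
```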
NASA Astrophysics Data System (ADS)
Van Stan, John T.; Wagner, Sasha; Guillemette, François; Whitetree, Ansley; Lewis, Julius; Silva, Leticia; Stubbins, Aron
2017-11-01
Studies on the fate and transport of dissolved organic matter (DOM) along the rainfall-to-discharge flow pathway typically begin in streams or soils, neglecting the initial enrichment of rainfall with DOM during contact with plant canopies. However, rain water can gather significant amounts of tree-derived DOM (tree-DOM) when it drains from the canopy, as throughfall, and down the stem, as stemflow. We examined the temporal variability of event-scale tree-DOM concentrations, yield, and optical (light absorbance and fluorescence) characteristics from an epiphyte-laden Quercus virginiana-Juniperus virginiana forest on Skidaway Island, Savannah, Georgia (USA). All tree-DOM fluxes were highly enriched in dissolved organic carbon (DOC) compared to rainfall, and epiphytes further increased concentrations. Stemflow DOC concentrations were greater than throughfall across study species, yet larger throughfall water yields produced greater DOC yields versus stemflow. Tree-DOM optical characteristics indicate it is aromatic-rich with fluorescent DOM dominated by humic-like fluorescence, containing 10-20% protein-like (tryptophan-like) fluorescence. Storm size was the only storm condition that strongly correlated with tree-DOM concentration and flux; however, throughfall and stemflow optical characteristics varied little across a wide range of storm conditions (from low magnitude events to intense tropical storms). Annual tree-DOM yields from the study forest (0.8-46 g C m-2 yr-1) were similar to other yields from discrete down-gradient fluxes (litter leachates, soil leachates, and stream discharge) along the rainfall-to-discharge flow path.
Macalady, Alison K.; Bugmann, Harald
2014-01-01
The processes leading to drought-associated tree mortality are poorly understood, particularly long-term predisposing factors, memory effects, and variability in mortality processes and thresholds in space and time. We use tree rings from four sites to investigate Pinus edulis mortality during two drought periods in the southwestern USA. We draw on recent sampling and archived collections to (1) analyze P. edulis growth patterns and mortality during the 1950s and 2000s droughts; (2) determine the influence of climate and competition on growth in trees that died and survived; and (3) derive regression models of growth-mortality risk and evaluate their performance across space and time. Recent growth was 53% higher in surviving vs. dying trees, with some sites exhibiting decades-long growth divergences associated with previous drought. Differential growth response to climate partly explained growth differences between live and dead trees, with responses to wet/cool conditions most influencing eventual tree status. Competition constrained tree growth, and reduced trees’ ability to respond to favorable climate. The best predictors in growth-mortality models included long-term (15–30 year) average growth rate combined with a metric of growth variability and the number of abrupt growth increases over 15 and 10 years, respectively. The most parsimonious models had high discriminatory power (ROC > 0.84) and correctly classified ∼70% of trees, suggesting that aspects of tree growth, especially over decades, can be powerful predictors of widespread drought-associated die-off. However, model discrimination varied across sites and drought events.
Weaker growth-mortality relationships and higher growth at lower survival probabilities for some sites during the 2000s event suggest a shift in mortality processes from longer-term growth-related constraints to shorter-term processes, such as rapid metabolic decline even in vigorous trees due to acute drought stress, and/or increases in the attack rate of both chronically stressed and more vigorous trees by bark beetles. PMID:24786646
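A growth-mortality model of the kind described, a logistic curve over growth predictors scored by ROC, can be sketched as below; the coefficients and ring-width values are hypothetical, not the fitted models from this study:

```python
import math

def survival_probability(mean_growth_mm, coef=3.0, intercept=-2.0):
    """Logistic growth-mortality curve: P(survival) as a function of
    long-term mean ring width. Coefficients are illustrative only."""
    return 1.0 / (1.0 + math.exp(-(intercept + coef * mean_growth_mm)))

def auc(scores_dead, scores_alive):
    """ROC area via the Mann-Whitney statistic: the probability that a
    surviving tree scores higher than a dying one (ties count half)."""
    pairs = [(d, a) for d in scores_dead for a in scores_alive]
    wins = sum(1.0 if a > d else 0.5 if a == d else 0.0 for d, a in pairs)
    return wins / len(pairs)

# Hypothetical 15-year mean ring widths (mm) for dying vs surviving trees.
dead = [0.4, 0.5, 0.7, 0.6, 0.3]
alive = [0.9, 1.1, 0.8, 1.3, 0.7]
p_dead = [survival_probability(g) for g in dead]
p_alive = [survival_probability(g) for g in alive]
print(f"AUC = {auc(p_dead, p_alive):.2f}")
```

Discrimination (ROC > 0.84 in the paper's most parsimonious models) is exactly this kind of ranking statistic between dying and surviving trees.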
NASA Astrophysics Data System (ADS)
Wilson, A.; Jackson, R. B.; Tumber-Davila, S. J.
2017-12-01
An increase in the frequency and severity of droughts has been associated with the changing climate. These events have the potential to alter the composition and biogeography of forests, as well as increase tree mortality related to climate-induced stress. Already, an increase in tree mortality has been observed throughout the US. The recent drought in California led to the death of millions of trees in the southern Sierra Nevada alone. In order to assess the potential impacts of these events on forest systems, it is imperative to understand what factors contribute to tree mortality. As plants become water-stressed, they may invest carbon more heavily belowground to reach a bigger pool of water, but their ability to adapt may be limited by the characteristics of the soil. In the Southern Sierra Critical Zone Observatory, a high tree mortality zone, we have selected both dead and living trees to examine the factors that contribute to root zone variability and belowground biomass investment by individual plants. A series of 15 cores surrounding each tree was taken to collect root and soil samples. These were then used to compare belowground rooting distributions with soil characteristics (texture, water holding capacity, pH, electric conductivity). Abies concolor is heavily affected by drought-induced mortality; therefore, the rooting systems of dead Abies concolor trees were examined to determine the relationship between their rooting systems and environmental conditions. Examining the relationship between soil characteristics and rooting systems of trees may shed light on the plasticity of rooting systems and how trees adapt to the characteristics of their environment. A better understanding of the factors that contribute to tree mortality can improve our ability to predict how forest systems may be impacted by climate-induced stress. Key words: Root systems, soil characteristics, drought, adaptation, terrestrial carbon, forest ecology
An adiabatic quantum flux parametron as an ultra-low-power logic device
NASA Astrophysics Data System (ADS)
Takeuchi, Naoki; Ozawa, Dan; Yamanashi, Yuki; Yoshikawa, Nobuyuki
2013-03-01
Ultra-low-power adiabatic quantum flux parametron (QFP) logic is investigated since it has the potential to reduce the bit energy per operation to the order of the thermal energy. In this approach, nonhysteretic QFPs are operated slowly to prevent nonadiabatic energy dissipation occurring during switching events. The designed adiabatic QFP gate is estimated to have a dynamic energy dissipation of 12% of IcΦ0 for a rise/fall time of 1000 ps. It can be further reduced by reducing circuit inductances. Three stages of adiabatic QFP NOT gates were fabricated using a Nb Josephson integrated circuit process and their correct operation was confirmed.
Globalization, Education, and Citizenship: Solidarity versus Markets?
ERIC Educational Resources Information Center
Torres, Carlos Alberto
2002-01-01
Suggests that globalization places limits on state autonomy and national sovereignty, affecting education in various ways. Educational policy and its contributions to citizenship, democracy, and multiculturalism will face unprecedented challenges if the logic of fear, exacerbated by the events of September 11, 2001 prevails. (Author/SLD)
Conceptualizing Organizational Climates. Research Report No. 7.
ERIC Educational Resources Information Center
Schneider, Benjamin
Part 1 of this paper presents some logical and conceptual distinctions between job satisfaction and organizational climate, the former being viewed as micro, evaluative, individual perceptions of personal events and experiences, the latter as macro, relatively descriptive, organizational-level perceptions that are abstractions of organizational…
NASA Technical Reports Server (NTRS)
Butler, Douglas J.; Kerstman, Eric
2010-01-01
This slide presentation reviews the goals and approach for the Integrated Medical Model (IMM). The IMM is a software decision support tool that forecasts medical events during spaceflight and optimizes medical systems during simulations. It includes information on the software capabilities, program stakeholders, use history, and the software logic.
NASA Astrophysics Data System (ADS)
Zaginaev, V.; Ballesteros-Cánovas, J. A.; Erokhin, S.; Matov, E.; Petrakov, D.; Stoffel, M.
2016-09-01
Glacier lake outburst floods (GLOFs) and related debris flows are among the most significant natural threats in the Tien Shan Mountains of Kyrgyzstan and have even caused loss of life and damage to infrastructure in its capital Bishkek. An improved understanding of the occurrence of this process is essential for the design of reliable disaster risk reduction strategies, even more so in view of ongoing climate change and scenarios of its future evolution. Here, we apply a dendrogeomorphic approach to reconstruct past debris-flow activity on the Aksay cone (Ala-Archa valley, Kyrgyz range), where outbursting glacier lakes and intense rainfalls have triggered huge debris flows over the past decades. A total of 96 Picea abies (L.) Karst. trees growing on the cone and along the main channel were selected based on evidence of past debris-flow damage in their trunks; these trees were then sampled using increment borers. The dating of past events was based on the assessment of growth disturbances (GD) in the tree-ring records and included the detection of injuries, tangential rows of traumatic resin ducts, reaction wood, and abrupt growth changes. In total, 320 GD were identified in the tree-ring samples. In combination with aerial imagery and geomorphic recognition in the field, reactions in trees and their position on the cone allowed reconstruction of the main spatial patterns of past events on the Aksay cone. Our findings suggest that at least 27 debris flows have occurred on the site between 1877 and 2015 and point to the occurrence of at least 17 events that were not documented prior to this study. We also observe high process activity during the 1950s and 1960s, with major events on the cone in 1950, 1966, and 1968, coinciding with phases of slight glacier advance.
The spatial analyses of events also point to two different spatial patterns, suggesting that quite dissimilar magnitudes probably occurred during glacier lake outburst floods and rainfall-induced debris-flow events. The results presented here represent the longest, annually resolved GLOF series in the region, which in turn has key implications on risk assessment, not just in the Ala-Archa valley, but also in the entire Kyrgyz range (northern Tien Shan).
NASA Astrophysics Data System (ADS)
Millar, C. I.; Westfall, R. D.; Delany, D. L.
2010-12-01
Widespread forest mortality in high-elevation forests has been increasing across western North American mountains in recent years, with climate, insects, and disease the primary causes. Subalpine forests in the eastern Sierra Nevada, by contrast, have experienced far less mortality than other ranges, and mortality events have been patchy and episodic. This situation, and the lack of significant effect of non-native white-pine blister rust, enables investigation of fine-scale responses of two subalpine Sierran species, whitebark pine (Pinus albicaulis, PiAl) and limber pine (P. flexilis, PiFl), to climate variability. We report similarities and differences between the two major mortality events in these pines in the last 150 years: 1988-1992 for PiFl and 2006-ongoing for PiAl. In both species, the events occurred within monotypic, closed-canopy, relatively young stands (< 200 yrs PiAl, < 300 yrs PiFl); were localized to the central-eastern Sierra Nevada; and occurred at 2740-2840 m along the eastern edge of the escarpment on north/northeast aspects with slopes > 40%. Mortality patches averaged 40-80 ha in both species, with mean stand mortality of trees > 10 cm in diameter of 91% in PiAl and 60% in PiFl. The ultimate cause of tree death was mountain pine beetle (Dendroctonus ponderosae) in both species, with increasing 20th/21st C minimum temperatures combined with drought as the pre-conditioning factors. Overall growth in the past 150 years suggests that PiFl is more drought hardy than PiAl but responds sensitively to the combined effects of drought and increasing warmth. After the 1988-1992 drought, surviving PiFl recovered growth. PiAl trees grew very poorly during that drought, and growth remained poor in the years until 2006, when the mortality event occurred in PiAl. A significant species effect is the apparent difference in levels of within-stand genetic diversity for climate factors.
Differential growth between the 19th C (cool, wet) and the 20th/21st C (warming, drying) in PiFl trees that died versus survivors indicates that considerable within-stand genetic diversity for climate existed in PiFl. For PiFl, the late 20th C mortality event acted as strong natural selection to improve within-stand fitness for warmer and drier conditions. PiFl trees that survived the 1988-1992 drought remained healthy through subsequent droughts, including the drought that is currently causing PiAl mortality. By contrast, the PiAl stands do not appear to have contained adaptive genetic diversity for drought and warmth, and the growth behavior of PiAl trees over the past 150 years was similar in pattern to that of the PiFl trees that died. As a result, the mortality event in PiAl is creating forest openings with unknown future stand conditions, rather than the rapid within-species adaptation that occurred in PiFl.
Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact
NASA Technical Reports Server (NTRS)
Frank, Jeremy
2004-01-01
We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques, called the Flow Balance Constraint, to tightly bound the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds, but at increased computational cost.
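The underlying problem, bounding resource availability over a partial order of constant-impact events, can be illustrated with a deliberately loose optimistic bound; this is a sketch of the generic envelope that the techniques named above (Laborie's, Muscettola's, the Flow Balance Constraint) tighten, not an implementation of any of them:

```python
def optimistic_bound(init, impacts, before, event):
    """Loose upper bound on resource level just after `event`.

    `impacts` maps each event to its constant resource impact; `before`
    maps each event to the set of events the partial order forces to
    precede it. Events forced earlier contribute their impact for sure;
    any unordered producer (positive impact) is optimistically assumed
    to fire first.
    """
    level = init + impacts[event]
    for e, delta in impacts.items():
        if e == event:
            continue
        if e in before[event]:           # forced earlier: certain contribution
            level += delta
        elif event not in before[e] and delta > 0:
            level += delta               # unordered producer, scheduled first
    return level

# Hypothetical events on a single reservoir starting at level 5:
# a consumes 3, b produces 2, c consumes 1 and must follow a.
impacts = {"a": -3, "b": 2, "c": -1}
before = {"a": set(), "b": set(), "c": {"a"}}
print(optimistic_bound(5, impacts, before, "c"))
```

If the optimistic bound at any event falls below zero, no linearization of the partial order is feasible, which is how such bounds prune search trees under chronological search.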
Genetic diversity-seeing the forest through the trees
M. Thompson Conkle
1992-01-01
Forest trees, populations, races, species, and taxonomic groups above the species level display rich variation in biochemical markers. The variation stems from inherited modifications that trace back in time, through converging ancestries, towards common progenitors. Past movements of continents, mountain building events, and climate changes isolated forest populations...
30 CFR 250.618 - Tubing and wellhead equipment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... following requirements during well-workover operations with the tree removed: (a) No tubing string shall be... pressure integrity and is otherwise suitable for its intended use. (b) In the event of prolonged operations... Manager. (c) When reinstalling the tree, you must: (1) Equip wells to monitor for casing pressure...
2013-06-01
collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low-volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms
Discrete-Event Simulation Unmasks the Quantum Cheshire Cat
NASA Astrophysics Data System (ADS)
Michielsen, Kristel; Lippert, Thomas; Raedt, Hans De
2017-05-01
It is shown that discrete-event simulation accurately reproduces the experimental data of a single-neutron interferometry experiment [T. Denkmayr et al., Nat. Commun. 5, 4492 (2014)] and provides a logically consistent, paradox-free, cause-and-effect explanation of the quantum Cheshire cat effect without invoking the notion that the neutron and its magnetic moment separate. Describing the experimental neutron data using weak-measurement theory is shown to be useless for unravelling the quantum Cheshire cat effect.
A Unified Approach to Abductive Inference
2014-09-30
learning in “Big data” domains. COMBINING MARKOV LOGIC AND SUPPORT VECTOR MACHINES FOR EVENT EXTRACTION. Event extraction is the task of...and achieves state-of-the-art performance. This makes it an ideal candidate for learning in “Big data”...
Grossi, Enzo
2006-01-01
Background In recent years a number of algorithms for cardiovascular risk assessment have been proposed to the medical community. These algorithms consider a number of variables and express their results as the percentage risk of developing a major fatal or non-fatal cardiovascular event in the following 10 to 20 years. Discussion The author has identified three major pitfalls of these algorithms, linked to the limitations of the classical statistical approach in dealing with this kind of nonlinear and complex information. The pitfalls are the inability to capture the disease complexity, the inability to capture process dynamics, and the wide confidence interval of individual risk assessment. Artificial intelligence tools can provide a potential advantage in trying to overcome these limitations. The theoretical background and some application examples related to artificial neural networks and fuzzy logic have been reviewed and discussed. Summary The use of predictive algorithms to assess individual absolute risk of future cardiovascular events is currently hampered by methodological and mathematical flaws. Newer approaches linked to artificial intelligence, such as fuzzy logic and artificial neural networks, seem better able to address both the complexity arising from correlations among predisposing factors and data on the occurrence of cardiovascular events, and the prediction of future events at the individual level. PMID:16672045
Tichavský, Radek; Šilhán, Karel; Tolasz, Radim
2017-02-01
Hydro-geomorphic processes have significantly influenced the recent development of valley floors, river banks and depositional forms in mountain environments, have caused considerable damage to man-made developments and have disrupted forest management. Trees growing along streams are affected by the transported debris mass and provide valuable records of debris flow/flood histories in their tree-ring series. Dendrogeomorphic approaches are currently the most accurate methods for creating a chronology of debris flow/flood events in forested catchments without field monitoring or a stream-gauging station. Comprehensive studies focusing on the detailed chronology of hydro-geomorphic events and analysis of meteorological triggers and weather circulation patterns are still lacking for the studied area. We provide a spatio-temporal reconstruction of hydro-geomorphic events in four catchments of the Hrubý Jeseník Mountains, Czech Republic, with an analysis of their triggering factors using meteorological data from four nearby rain gauges. Increment cores from 794 coniferous trees (Picea abies [L.] Karst.) allowed the identification of 40 hydro-geomorphic events during the period 1889-2013. Most of the events can be explained by extreme daily rainfalls (≥50 mm) occurring in at least one rain gauge. However, in several cases, there was no record of extreme precipitation at the rain gauges during the debris flow/flood event year, suggesting extremely localised rainstorms at the mountain summits. We concluded that the localisation, intensity and duration of rainstorms; antecedent moisture conditions; and the amount of available sediments all influenced the initiation, spatial distribution and characteristics of hydro-geomorphic events. The most frequent synoptic situations responsible for the extreme rainfalls (1946-2015) were related to the meridional atmospheric circulation pattern.
Our results enhance current knowledge of the occurrences and triggers of debris flows/floods in the Central European mountains in transition between temperate oceanic and continental climatic conditions and may prompt further research of these phenomena in the Eastern Sudetes in general. Copyright © 2016 Elsevier B.V. All rights reserved.
USGS National Seismic Hazard Maps
Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.
2000-01-01
The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations.
We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and distances.
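The logic-tree combination used in such hazard models amounts, at its simplest, to a weighted mean over branches; a minimal sketch with invented weights and exceedance rates, not values from the USGS maps:

```python
def logic_tree_mean(branches):
    """Weighted mean hazard over logic-tree branches.

    Each branch pairs a weight with an annual exceedance rate obtained
    from one combination of seismicity model, recurrence model, and
    ground-motion relation; the weights must sum to 1.
    """
    total_w = sum(w for w, _ in branches)
    assert abs(total_w - 1.0) < 1e-9, "branch weights must sum to 1"
    return sum(w * rate for w, rate in branches)

# Hypothetical branches: (weight, annual exceedance rate at some ground-motion level)
branches = [(0.5, 0.010), (0.3, 0.014), (0.2, 0.006)]
print(logic_tree_mean(branches))
```

The same weighting applies at every ground-motion level, which is how alternative models are blended into a single hazard curve.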
Earthquake Rupture Forecast of M>= 6 for the Corinth Rift System
NASA Astrophysics Data System (ADS)
Scotti, O.; Boiselet, A.; Lyon-Caen, H.; Albini, P.; Bernard, P.; Briole, P.; Ford, M.; Lambotte, S.; Matrullo, E.; Rovida, A.; Satriano, C.
2014-12-01
Fourteen years of multidisciplinary observations and data collection in the Western Corinth Rift (WCR) near-fault observatory have been recently synthesized (Boiselet, Ph.D. 2014) for the purpose of providing earthquake rupture forecasts (ERF) of M>=6 in WCR. The main contribution of this work consisted in paving the way towards the development of a "community-based" fault model reflecting the level of knowledge gathered thus far by the WCR working group. The most relevant available data used for this exercise are: - onshore/offshore fault traces, based on geological and high-resolution seismics, revealing a complex network of E-W striking, ~10 km long fault segments; microseismicity recorded by a dense network ( > 60000 events; 1.5
Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.
2010-04-20
An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
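The content-based, template-driven matching described for the Event Heap resembles tuplespace retrieval; the sketch below illustrates that idea with hypothetical event fields, and is not the patented system's actual API:

```python
WILDCARD = object()  # matches any value, like a "formal" field in a tuplespace

def matches(event, template):
    """Content-based match: every field the template names must be
    present in the event with an equal value (or be a wildcard)."""
    return all(
        name in event and (value is WILDCARD or event[name] == value)
        for name, value in template.items()
    )

def retrieve(heap, template):
    """Return all events on the heap matching the template, oldest first."""
    return [e for e in heap if matches(e, template)]

# Hypothetical events posted by applications in an interactive workspace.
heap = [
    {"type": "ButtonPress", "source": "touchpanel-1", "button": "lights"},
    {"type": "SlideChange", "source": "laptop-3", "slide": 7},
    {"type": "ButtonPress", "source": "touchpanel-2", "button": "projector"},
]
hits = retrieve(heap, {"type": "ButtonPress", "button": WILDCARD})
print([e["source"] for e in hits])
```

Routing on named fields rather than fixed tuple positions is what lets heterogeneous applications coordinate without knowing each other's schemas in advance.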
The ethnobotany of Christ's Thorn Jujube (Ziziphus spina-christi) in Israel.
Dafni, Amots; Levy, Shay; Lev, Efraim
2005-09-28
This article surveys the ethnobotany of Ziziphus spina-christi (L.) Desf. in the Middle East from various aspects: historical, religious, philological, literary, and linguistic, as well as pharmacological, among Muslims, Jews, and Christians. It is suggested that this is the only tree species considered "holy" by Muslims (all individuals of the species are sanctified by religion), in addition to its status as a "sacred tree" (particular trees that are venerated due to historical or magical events related to them, regardless of their botanical identity) in the Middle East. It also has a special status as a "blessed tree" among the Druze.