Bayes factors and multimodel inference
Link, W.A.; Barker, R.J.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.
2009-01-01
Multimodel inference has two main themes: model selection, and model averaging. Model averaging is a means of making inference conditional on a model set, rather than on a selected model, allowing formal recognition of the uncertainty associated with model choice. The Bayesian paradigm provides a natural framework for model averaging, and provides a context for evaluation of the commonly used AIC weights. We review Bayesian multimodel inference, noting the importance of Bayes factors. Noting the sensitivity of Bayes factors to the choice of priors on parameters, we define and propose nonpreferential priors as offering a reasonable standard for objective multimodel inference.
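The weighting schemes this abstract compares can be sketched numerically. The log marginal likelihoods and AIC values below are hypothetical; under equal prior model probabilities, normalized marginal likelihoods are posterior model probabilities, and their ratios are pairwise Bayes factors:

```python
import math

# Hypothetical log marginal likelihoods for three candidate models.
log_marginals = {"M1": -104.2, "M2": -102.9, "M3": -106.5}

def model_probabilities(log_m):
    """Posterior model probabilities under equal prior model
    probabilities; ratios of these are pairwise Bayes factors."""
    mx = max(log_m.values())                      # stabilize the exponentials
    unnorm = {k: math.exp(v - mx) for k, v in log_m.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

def aic_weights(aic):
    """Akaike weights: exp(-delta_i / 2), normalized across models."""
    mn = min(aic.values())
    unnorm = {k: math.exp(-(v - mn) / 2) for k, v in aic.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

probs = model_probabilities(log_marginals)
aw = aic_weights({"M1": 210.0, "M2": 208.1, "M3": 214.7})  # hypothetical AICs
```

Both sets of weights sum to one over the model set; the abstract's point is that the Bayesian weights depend on the priors placed on parameters, which the AIC weights sidestep.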
A Bayes linear Bayes method for estimation of correlated event rates.
Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim
2013-12-01
Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
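For a single uncorrelated rate, the homogeneous-Poisson-with-gamma-prior setup reduces to a standard conjugate update, sketched below with a method-of-moments empirical prior. The paper's multivariate gamma coupling between correlated rates is not reproduced here, and all numbers are hypothetical:

```python
# Conjugate gamma-Poisson updating for a single event rate
# (univariate sketch; the full model couples rates via a
# multivariate gamma prior, which is not reproduced here).
from statistics import mean, pvariance

def posterior_rate(prior_shape, prior_rate, event_count, exposure):
    """Gamma(a, b) prior + Poisson counts -> Gamma(a + n, b + t) posterior."""
    return prior_shape + event_count, prior_rate + exposure

def moments_prior(rates):
    """Method-of-moments gamma fit to historical rates:
    shape = mean^2 / var, rate = mean / var."""
    m, v = mean(rates), pvariance(rates)
    return m * m / v, m / v

# Hypothetical historical rates used to build an empirical prior.
a0, b0 = moments_prior([0.8, 1.2, 1.0, 1.4, 0.6])
# Observe 7 events over 5 units of exposure.
a1, b1 = posterior_rate(a0, b0, event_count=7, exposure=5.0)
posterior_mean = a1 / b1  # events per unit exposure
```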
Machine Learning for Information Extraction in Informal Domains
1998-11-01
Compares several learners for information extraction in informal text: a naive Bayes algorithm (BayesIDF), a hybrid of BayesIDF and the grammatical inference algorithm Alergia (BayesGI), and a relational learner (SRV); covers state-merging methods for inferring transducers, and reports precision/recall results for Alergia and BayesGI on the speaker field, with the alphabet transducer produced using m-estimates, at various…
Sandoval-Castellanos, Edson; Palkopoulou, Eleftheria; Dalén, Love
2014-01-01
Inference of population demographic history has vastly improved in recent years due to a number of technological and theoretical advances including the use of ancient DNA. Approximate Bayesian computation (ABC) stands among the most promising methods due to its simple theoretical fundament and exceptional flexibility. However, limited availability of user-friendly programs that perform ABC analysis renders it difficult to implement, and hence programming skills are frequently required. In addition, there is limited availability of programs able to deal with heterochronous data. Here we present the software BaySICS: Bayesian Statistical Inference of Coalescent Simulations. BaySICS provides an integrated and user-friendly platform that performs ABC analyses by means of coalescent simulations from DNA sequence data. It estimates historical demographic population parameters and performs hypothesis testing by means of Bayes factors obtained from model comparisons. Although providing specific features that improve inference from datasets with heterochronous data, BaySICS also has several capabilities making it a suitable tool for analysing contemporary genetic datasets. Those capabilities include joint analysis of independent tables, a graphical interface and the implementation of Markov-chain Monte Carlo without likelihoods.
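The core ABC rejection loop that tools like BaySICS automate can be sketched on a toy problem, inferring a normal mean from its sample mean. This is an illustration of rejection ABC, not of BaySICS's coalescent machinery:

```python
import random, statistics

def abc_rejection(observed_mean, n, prior_draw, simulate, eps,
                  n_sims=20000, seed=1):
    """Keep prior draws whose simulated summary statistic falls
    within eps of the observed one; the kept draws approximate
    the posterior."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw(rng)
        sim_mean = statistics.fmean(simulate(rng, theta, n))
        if abs(sim_mean - observed_mean) < eps:
            accepted.append(theta)
    return accepted

# Toy problem: data are Normal(theta, 1); uniform prior on theta.
post = abc_rejection(
    observed_mean=2.0, n=50,
    prior_draw=lambda rng: rng.uniform(-5, 5),
    simulate=lambda rng, th, n: [rng.gauss(th, 1) for _ in range(n)],
    eps=0.2,
)
```

The accepted sample concentrates around the true mean; shrinking `eps` sharpens the approximation at the cost of more rejections.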
A Test by Any Other Name: P Values, Bayes Factors, and Statistical Inference.
Stern, Hal S
2016-01-01
Procedures used for statistical inference are receiving increased scrutiny as the scientific community studies the factors associated with ensuring reproducible research. This note addresses recent negative attention directed at p values, the relationship of confidence intervals and tests, and the role of Bayesian inference and Bayes factors, with an eye toward better understanding these different strategies for statistical inference. We argue that researchers and data analysts too often resort to binary decisions (e.g., whether to reject or accept the null hypothesis) in settings where this may not be required.
Order-Constrained Bayes Inference for Dichotomous Models of Unidimensional Nonparametric IRT
ERIC Educational Resources Information Center
Karabatsos, George; Sheu, Ching-Fan
2004-01-01
This study introduces an order-constrained Bayes inference framework useful for analyzing data containing dichotomous scored item responses, under the assumptions of either the monotone homogeneity model or the double monotonicity model of nonparametric item response theory (NIRT). The framework involves the implementation of Gibbs sampling to…
Nonadditive entropy maximization is inconsistent with Bayesian updating
NASA Astrophysics Data System (ADS)
Pressé, Steve
2014-11-01
The maximum entropy method—used to infer probabilistic models from data—is a special case of Bayes's model inference prescription which, in turn, is grounded in basic propositional logic. By contrast to the maximum entropy method, the compatibility of nonadditive entropy maximization with Bayes's model inference prescription has never been established. Here we demonstrate that nonadditive entropy maximization is incompatible with Bayesian updating and discuss the immediate implications of this finding. We focus our attention on special cases as illustrations.
Kneissler, Jan; Drugowitsch, Jan; Friston, Karl; Butz, Martin V
2015-01-01
Predictive coding appears to be one of the fundamental working principles of brain processing. Amongst other aspects, brains often predict the sensory consequences of their own actions. Predictive coding resembles Kalman filtering, where incoming sensory information is filtered to produce prediction errors for subsequent adaptation and learning. However, to generate prediction errors given motor commands, a suitable temporal forward model is required to generate predictions. While in engineering applications, it is usually assumed that this forward model is known, the brain has to learn it. When filtering sensory input and learning from the residual signal in parallel, a fundamental problem arises: the system can enter a delusional loop when filtering the sensory information using an overly trusted forward model. In this case, learning stalls before accurate convergence because uncertainty about the forward model is not properly accommodated. We present a Bayes-optimal solution to this generic and pernicious problem for the case of linear forward models, which we call Predictive Inference and Adaptive Filtering (PIAF). PIAF filters incoming sensory information and learns the forward model simultaneously. We show that PIAF is formally related to Kalman filtering and to the Recursive Least Squares linear approximation method, but combines these procedures in a Bayes optimal fashion. Numerical evaluations confirm that the delusional loop is precluded and that the learning of the forward model is more than 10-times faster when compared to a naive combination of Kalman filtering and Recursive Least Squares.
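The filtering step that PIAF builds on can be sketched as a scalar Kalman filter. The sketch below assumes the forward-model gain `a` is known, which is exactly the assumption PIAF relaxes by learning the forward model in parallel; all parameter values are illustrative:

```python
# Minimal scalar Kalman filter (the filtering half of the problem;
# PIAF additionally learns the forward model, not reproduced here).
import random

def kalman_step(mean, var, a, q, obs, r):
    """One predict/update cycle for x' = a*x + noise(q), y = x' + noise(r)."""
    # Predict through the (assumed known) forward model.
    pred_mean, pred_var = a * mean, a * a * var + q
    # Update with the new observation.
    k = pred_var / (pred_var + r)           # Kalman gain
    new_mean = pred_mean + k * (obs - pred_mean)
    new_var = (1 - k) * pred_var
    return new_mean, new_var

rng = random.Random(0)
x, mean, var = 1.0, 0.0, 10.0
for _ in range(200):
    x = 0.9 * x + rng.gauss(0, 0.1)         # true latent dynamics
    y = x + rng.gauss(0, 0.5)               # noisy observation
    mean, var = kalman_step(mean, var, a=0.9, q=0.01, obs=y, r=0.25)
```

When `a` is instead a trusted but wrong estimate, the filtered prediction errors used for learning are biased, which is the "delusional loop" the abstract describes.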
Höhna, Sebastian; Landis, Michael J.; Heath, Tracy A.; Boussau, Bastien; Lartillot, Nicolas; Moore, Brian R.; Huelsenbeck, John P.; Ronquist, Fredrik
2016-01-01
Programs for Bayesian inference of phylogeny currently implement a unique and fixed suite of models. Consequently, users of these software packages are simultaneously forced to use a number of programs for a given study, while also lacking the freedom to explore models that have not been implemented by the developers of those programs. We developed a new open-source software package, RevBayes, to address these problems. RevBayes is entirely based on probabilistic graphical models, a powerful generic framework for specifying and analyzing statistical models. Phylogenetic-graphical models can be specified interactively in RevBayes, piece by piece, using a new succinct and intuitive language called Rev. Rev is similar to the R language and the BUGS model-specification language, and should be easy to learn for most users. The strength of RevBayes is the simplicity with which one can design, specify, and implement new and complex models. Fortunately, this tremendous flexibility does not come at the cost of slower computation; as we demonstrate, RevBayes outperforms competing software for several standard analyses. Compared with other programs, RevBayes has fewer black-box elements. Users need to explicitly specify each part of the model and analysis. Although this explicitness may initially be unfamiliar, we are convinced that this transparency will improve understanding of phylogenetic models in our field. Moreover, it will motivate the search for improvements to existing methods by brazenly exposing the model choices that we make to critical scrutiny. RevBayes is freely available at http://www.RevBayes.com. [Bayesian inference; Graphical models; MCMC; statistical phylogenetics.] PMID:27235697
Bayes factors for the linear ballistic accumulator model of decision-making.
Evans, Nathan J; Brown, Scott D
2018-04-01
Evidence accumulation models of decision-making have led to advances in several different areas of psychology. These models provide a way to integrate response time and accuracy data, and to describe performance in terms of latent cognitive processes. Testing important psychological hypotheses using cognitive models requires a method to make inferences about different versions of the models which assume different parameters to cause observed effects. The task of model-based inference using noisy data is difficult, and has proven especially problematic with current model selection methods based on parameter estimation. We provide a method for computing Bayes factors through Monte-Carlo integration for the linear ballistic accumulator (LBA; Brown and Heathcote, 2008), a widely used evidence accumulation model. Bayes factors are used frequently for inference with simpler statistical models, and they do not require parameter estimation. In order to overcome the computational burden of estimating Bayes factors via brute force integration, we exploit general purpose graphical processing units; we provide free code for this. This approach allows estimation of Bayes factors via Monte-Carlo integration within a practical time frame. We demonstrate the method using both simulated and real data. We investigate the stability of the Monte-Carlo approximation, and the LBA's inferential properties, in simulation studies.
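The brute-force Monte-Carlo marginal-likelihood estimate at the heart of this method can be sketched on a model far simpler than the LBA: a binomial likelihood with a uniform prior against a point null. The data are hypothetical and the GPU machinery is omitted; the estimator is simply the prior-weighted average of the likelihood:

```python
import math, random

def log_marginal_mc(loglik, prior_draw, n_samples=50000, seed=2):
    """Monte-Carlo estimate of the marginal likelihood
    Z = E_prior[L(theta)], averaged on the log scale for stability."""
    rng = random.Random(seed)
    logs = [loglik(prior_draw(rng)) for _ in range(n_samples)]
    mx = max(logs)
    return mx + math.log(sum(math.exp(v - mx) for v in logs) / len(logs))

# Toy data: 7 successes in 10 trials.
k, n = 7, 10
loglik = lambda p: k * math.log(p) + (n - k) * math.log(1 - p)

# M1: p ~ Uniform(0, 1); M0: point null p = 0.5.
log_z1 = log_marginal_mc(loglik, lambda rng: rng.uniform(1e-9, 1 - 1e-9))
log_z0 = loglik(0.5)
log_bf10 = log_z1 - log_z0
```

For this toy model the Bayes factor is available analytically (BF10 = 1024/1320 ≈ 0.78), which makes it a convenient check on the Monte-Carlo approximation's stability.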
Default Bayes Factors for Model Selection in Regression
ERIC Educational Resources Information Center
Rouder, Jeffrey N.; Morey, Richard D.
2012-01-01
In this article, we present a Bayes factor solution for inference in multiple regression. Bayes factors are principled measures of the relative evidence from data for various models or positions, including models that embed null hypotheses. In this regard, they may be used to state positive evidence for a lack of an effect, which is not possible…
Application of Transformations in Parametric Inference
ERIC Educational Resources Information Center
Brownstein, Naomi; Pensky, Marianna
2008-01-01
The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…
Quantum cognition based on an ambiguous representation derived from a rough set approximation.
Gunji, Yukio-Pegio; Sonoda, Kohei; Basios, Vasileios
2016-03-01
Over the last years, in a series of papers by Arecchi and others, a model for the cognitive processes involved in decision making has been proposed and investigated. The key element of this model is the expression of apprehension and judgment, basic cognitive processes of decision making, as an inverse Bayes inference classifying the information content of neuron spike trains. It has been shown that for successive plural stimuli this inference, equipped with basic non-algorithmic jumps, is affected by quantum-like characteristics. We show here that such a decision-making process is related consistently with an ambiguous representation by an observer within a universe of discourse. In our work the ambiguous representation of an object or a stimulus is defined as a pair of maps from objects of a set to their representations, where these two maps are interrelated in a particular structure. The a priori and a posteriori hypotheses in Bayes inference are replaced by the upper and lower approximations, correspondingly, for the initial data sets that are derived with respect to each map. Upper and lower approximations herein are defined in the context of "rough set" analysis. The inverse Bayes inference is implemented by the lower approximation with respect to one map and by the upper approximation with respect to the other map for a given data set. We show further that, due to the particular structural relation between the two maps, the logical structure of such combined approximations can only be expressed as an orthomodular lattice and therefore can be represented by a quantum rather than a Boolean logic. To our knowledge, this is the first investigation aiming to reveal the concrete logic structure of inverse Bayes inference in cognitive processes. Copyright © 2016. Published by Elsevier Ireland Ltd.
The anatomy of choice: dopamine and decision-making
Friston, Karl; Schwartenbeck, Philipp; FitzGerald, Thomas; Moutoussis, Michael; Behrens, Timothy; Dolan, Raymond J.
2014-01-01
This paper considers goal-directed decision-making in terms of embodied or active inference. We associate bounded rationality with approximate Bayesian inference that optimizes a free energy bound on model evidence. Several constructs such as expected utility, exploration or novelty bonuses, softmax choice rules and optimism bias emerge as natural consequences of free energy minimization. Previous accounts of active inference have focused on predictive coding. In this paper, we consider variational Bayes as a scheme that the brain might use for approximate Bayesian inference. This scheme provides formal constraints on the computational anatomy of inference and action, which appear to be remarkably consistent with neuroanatomy. Active inference contextualizes optimal decision theory within embodied inference, where goals become prior beliefs. For example, expected utility theory emerges as a special case of free energy minimization, where the sensitivity or inverse temperature (associated with softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution. Crucially, this sensitivity corresponds to the precision of beliefs about behaviour. The changes in precision during variational updates are remarkably reminiscent of empirical dopaminergic responses—and they may provide a new perspective on the role of dopamine in assimilating reward prediction errors to optimize decision-making. PMID:25267823
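The softmax choice rule with an inverse-temperature (precision) parameter, one of the constructs this abstract mentions, can be sketched directly. The utilities are hypothetical; the point is that raising precision makes choices more deterministic:

```python
import math

def softmax_choice_probs(expected_utilities, precision):
    """Softmax choice rule: P(a) proportional to exp(precision * U(a)).
    Higher precision (inverse temperature) -> more exploitative choices."""
    mx = max(expected_utilities)                    # stabilize the exponentials
    w = [math.exp(precision * (u - mx)) for u in expected_utilities]
    z = sum(w)
    return [v / z for v in w]

utilities = [1.0, 1.2, 0.8]                         # hypothetical expected utilities
low = softmax_choice_probs(utilities, precision=0.5)
high = softmax_choice_probs(utilities, precision=8.0)
```

In the paper's framing, this precision is itself a quantity with a Bayes-optimal value, updated during inference, rather than a free temperature parameter.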
Liao, J. G.; Mcmurry, Timothy; Berg, Arthur
2014-01-01
Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072
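A minimal sketch of the standard empirical Bayes shrinkage this paper takes as its baseline: the normal means model, with the prior mean and variance estimated from the data themselves by the method of moments. This illustrates the working-prior sensitivity that motivates the rank-conditioned alternative; the data are hypothetical:

```python
from statistics import fmean, pvariance

def eb_shrink(estimates, sampling_var):
    """Empirical Bayes shrinkage for the normal means model:
    theta_i ~ N(mu, tau2), x_i | theta_i ~ N(theta_i, sampling_var),
    with mu and tau2 estimated from the x_i themselves."""
    mu = fmean(estimates)
    tau2 = max(pvariance(estimates) - sampling_var, 0.0)
    b = tau2 / (tau2 + sampling_var)    # weight on the data vs the prior mean
    return [mu + b * (x - mu) for x in estimates]

# Hypothetical per-gene effect estimates, each with sampling variance 1.
xs = [2.5, -1.0, 0.3, 1.8, -0.6, 0.9]
shrunk = eb_shrink(xs, sampling_var=1.0)
```

Every estimate moves toward the grand mean; when the assumed normal working prior is far from the truth, this pooling is exactly where the standard approach can go wrong.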
Bayesian inference for psychology, part IV: parameter estimation and Bayes factors.
Rouder, Jeffrey N; Haaf, Julia M; Vandekerckhove, Joachim
2018-02-01
In the psychological literature, there are two seemingly different approaches to inference: that from estimation of posterior intervals and that from Bayes factors. We provide an overview of each method and show that a salient difference is the choice of models. The two approaches as commonly practiced can be unified with a certain model specification, now popular in the statistics literature, called spike-and-slab priors. A spike-and-slab prior is a mixture of a null model, the spike, with an effect model, the slab. The estimate of the effect size here is a function of the Bayes factor, showing that estimation and model comparison can be unified. The salient difference is that common Bayes factor approaches provide for privileged consideration of theoretically useful parameter values, such as the value corresponding to the null hypothesis, while estimation approaches do not. Both approaches, either privileging the null or not, are useful depending on the goals of the analyst.
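The unification this abstract describes can be sketched arithmetically: under a spike-and-slab mixture, the Bayes factor converts prior odds into a posterior inclusion probability, which in turn scales a model-averaged effect estimate. The numbers are hypothetical:

```python
def posterior_inclusion(bf10, prior_odds=1.0):
    """Posterior probability of the slab (effect) model:
    posterior odds = BF10 * prior odds."""
    post_odds = bf10 * prior_odds
    return post_odds / (1 + post_odds)

def averaged_effect(bf10, slab_estimate, prior_odds=1.0):
    """Model-averaged effect under a spike-and-slab mixture:
    the spike contributes exactly 0, weighted by its probability."""
    p_slab = posterior_inclusion(bf10, prior_odds)
    return p_slab * slab_estimate

p = posterior_inclusion(bf10=3.0)                 # 3:1 evidence for an effect
est = averaged_effect(bf10=3.0, slab_estimate=0.4)
```

With BF10 = 3 and even prior odds, the inclusion probability is 0.75 and a slab estimate of 0.4 shrinks to 0.3, making concrete the paper's point that the estimate is a function of the Bayes factor.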
An engine awaits processing in the new engine shop at KSC
NASA Technical Reports Server (NTRS)
1998-01-01
A new Block 2A engine awaits processing in the low bay of the Space Shuttle Main Engine Processing Facility (SSMEPF). Officially opened on July 6, the new facility replaces the Shuttle Main Engine Shop. The SSMEPF is an addition to the existing Orbiter Processing Facility Bay 3. The engine is scheduled to fly on the Space Shuttle Endeavour during the STS-88 mission in December 1998.
Why environmental scientists are becoming Bayesians
James S. Clark
2005-01-01
Advances in computational statistics provide a general framework for the high dimensional models typically needed for ecological inference and prediction. Hierarchical Bayes (HB) represents a modelling structure with capacity to exploit diverse sources of information, to accommodate influences that are unknown (or unknowable), and to draw inference on large numbers of...
Naive Probability: A Mental Model Theory of Extensional Reasoning.
ERIC Educational Resources Information Center
Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul
1999-01-01
Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…
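The extensional strategy, inferring a posterior by counting equally likely models rather than applying Bayes's theorem, can be sketched with a two-coin example (an illustration of the idea, not the authors' experimental materials):

```python
from itertools import product
from fractions import Fraction

# Enumerate equally likely "mental models": two fair coin flips.
outcomes = list(product("HT", repeat=2))

def prob(event, given=lambda o: True):
    """Probability by counting models: favourable over possible."""
    possible = [o for o in outcomes if given(o)]
    favourable = [o for o in possible if event(o)]
    return Fraction(len(favourable), len(possible))

# P(both heads | at least one head), by counting, not by Bayes's formula.
p = prob(lambda o: o == ("H", "H"), given=lambda o: "H" in o)
```

Conditioning here is just restriction of the model set, which is exactly the sense in which a naive reasoner can reach a posterior without the probability calculus.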
The Scientific Method, Diagnostic Bayes, and How to Detect Epistemic Errors
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2015-12-01
In the past decades, Bayesian methods have found widespread application and use in environmental systems modeling. Bayes' theorem states that the posterior probability P(H|D̂) of a hypothesis H is proportional to the product of the prior probability P(H) of this hypothesis and the likelihood L(H|D̂) of the same hypothesis given the new/incoming observations D̂. In science and engineering, H often constitutes some numerical simulation model, D = F(x,·), which summarizes, using algebraic, empirical, and differential equations, state variables and fluxes, all our theoretical and/or practical knowledge of the system of interest, and x are the d unknown parameters which are subject to inference using some data D̂ of the observed system response. The Bayesian approach is intimately related to the scientific method and uses an iterative cycle of hypothesis formulation (model), experimentation and data collection, and theory/hypothesis refinement to elucidate the rules that govern the natural world. Unfortunately, model refinement has proven to be very difficult, in large part because of the poor diagnostic power of residual-based likelihood functions (Gupta et al., 2008). This has inspired Vrugt and Sadegh (2013) to advocate the use of 'likelihood-free' inference using approximate Bayesian computation (ABC). This approach uses one or more summary statistics S(D̂) of the original data D̂, designed ideally to be sensitive only to one particular process in the model. Any mismatch between the observed and simulated summary metrics is then easily linked to a specific model component. A recurrent issue with the application of ABC is self-sufficiency of the summary statistics. In theory, S(·) should contain as much information as the original data itself, yet complex systems rarely admit sufficient statistics.
In this article, we propose to combine the ideas of ABC and regular Bayesian inference to guarantee that no information is lost in diagnostic model evaluation. This hybrid approach, coined diagnostic Bayes, uses the summary metrics as prior distribution and the original data in the likelihood function, or P(x|D̂) ∝ P(x|S(D̂)) L(x|D̂). A case study illustrates the ability of the proposed methodology to diagnose epistemic errors and provide guidance on model refinement.
Bayesian model comparison and parameter inference in systems biology using nested sampling.
Pullen, Nick; Morris, Richard J
2014-01-01
Inferring parameters for models of biological processes is a current challenge in systems biology, as is the related problem of comparing competing models that explain the data. In this work we apply Skilling's nested sampling to address both of these problems. Nested sampling is a Bayesian method for exploring parameter space that transforms a multi-dimensional integral to a 1D integration over likelihood space. This approach focuses on the computation of the marginal likelihood or evidence. The ratio of evidences of different models leads to the Bayes factor, which can be used for model comparison. We demonstrate how nested sampling can be used to reverse-engineer a system's behaviour whilst accounting for the uncertainty in the results. The effect of missing initial conditions of the variables as well as unknown parameters is investigated. We show how the evidence and the model ranking can change as a function of the available data. Furthermore, the addition of data from extra variables of the system can deliver more information for model comparison than increasing the data from one variable, thus providing a basis for experimental design.
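Skilling's nested sampling can be sketched in a few dozen lines for a one-dimensional toy problem. The likelihood-constrained replacement step below uses naive rejection from the prior, which real implementations replace with something cleverer; the toy model and all settings are illustrative:

```python
import math, random

def log_add(a, b):
    """log(exp(a) + exp(b)) without overflow."""
    if a == -math.inf:
        return b
    hi, lo = max(a, b), min(a, b)
    return hi + math.log1p(math.exp(lo - hi))

def nested_sampling(loglik, prior_draw, n_live=100, n_iter=600, seed=3):
    """Toy nested sampling: repeatedly discard the worst live point,
    credit it with the shrinking prior volume it occupied, and replace
    it by a fresh prior draw with higher likelihood."""
    rng = random.Random(seed)
    live = [prior_draw(rng) for _ in range(n_live)]
    logl = [loglik(t) for t in live]
    log_z, log_x = -math.inf, 0.0
    for _ in range(n_iter):
        i = min(range(n_live), key=lambda j: logl[j])
        # Weight of the discarded shell: L_i * (X_before - X_after).
        log_z = log_add(log_z, logl[i] + log_x + math.log(1.0 / (n_live + 1)))
        log_x += math.log(n_live / (n_live + 1))   # volume shrinks each step
        while True:  # likelihood-constrained draw (naive rejection)
            t = prior_draw(rng)
            if loglik(t) > logl[i]:
                live[i], logl[i] = t, loglik(t)
                break
    # Contribution of the remaining live points.
    mean_l = math.log(sum(math.exp(l) for l in logl) / n_live)
    return log_add(log_z, mean_l + log_x)

# Evidence for one datum y = 0.5 under y ~ N(theta, 1), theta ~ U(-5, 5);
# the analytic answer is very close to 1/10 (prior density times ~unit integral).
ll = lambda th: -0.5 * math.log(2 * math.pi) - 0.5 * (0.5 - th) ** 2
log_z = nested_sampling(ll, lambda rng: rng.uniform(-5, 5))
```

Running the same machinery on two competing models and differencing the returned log evidences gives the log Bayes factor the abstract uses for model comparison.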
Adaptability and phenotypic stability of common bean genotypes through Bayesian inference.
Corrêa, A M; Teodoro, P E; Gonçalves, M C; Barroso, L M A; Nascimento, M; Santos, A; Torres, F E
2016-04-27
This study used Bayesian inference to investigate the genotype x environment interaction in common bean grown in Mato Grosso do Sul State, and it also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 13 common bean genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian inference was effective for the selection of upright common bean genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. According to Bayesian inference, the EMGOPA-201, BAMBUÍ, CNF 4999, CNF 4129 A 54, and CNFv 8025 genotypes had specific adaptability to favorable environments, while the IAPAR 14 and IAC CARIOCA ETE genotypes had specific adaptability to unfavorable environments.
Evidence for the existence of Persian Gulf Water and Red Sea Water in the Bay of Bengal
NASA Astrophysics Data System (ADS)
Jain, Vineet; Shankar, D.; Vinayachandran, P. N.; Kankonkar, A.; Chatterjee, Abhisek; Amol, P.; Almeida, A. M.; Michael, G. S.; Mukherjee, A.; Chatterjee, Meenakshi; Fernandes, R.; Luis, R.; Kamble, Amol; Hegde, A. K.; Chatterjee, Siddhartha; Das, Umasankar; Neema, C. P.
2017-05-01
The high-salinity water masses that originate in the North Indian Ocean are Arabian Sea High-Salinity Water (ASHSW), Persian Gulf Water (PGW), and Red Sea Water (RSW). Among them, only ASHSW has been shown to exist in the Bay of Bengal. We use CTD data from recent cruises to show that PGW and RSW also exist in the bay. The presence of RSW is marked by a deviation of the salinity vertical profile from a fitted curve at depths ranging from 500 to 1000 m; this deviation, though small (of the order of 0.005 psu and therefore comparable to the CTD accuracy of 0.003 psu), is an order of magnitude larger than the 0.0003 psu fluctuations associated with the background turbulence or instrument noise in this depth regime, allowing us to infer the existence of RSW throughout the bay. PGW is marked by the presence of a salinity maximum at 200-450 m; in the southwestern bay, PGW can be distinguished from the salinity maximum due to ASHSW because of the intervening Arabian Sea Salinity Minimum. This salinity minimum and the maximum associated with ASHSW disappear east and north of the south-central bay (85°E, 8°N) owing to mixing between the fresher surface waters that are native to the bay (Bay of Bengal Water or BBW) with the high-salinity ASHSW. Hence, ASHSW is not seen as a distinct water mass in the northern and eastern bay and the maximum salinity over most of the bay is associated with PGW. The surface water over most of the bay is therefore a mixture of ASHSW and the low-salinity BBW. As a corollary, we can also infer that the weak oxygen peak seen within the oxygen-minimum zone in the bay at a depth of 250-400 m is associated with PGW. The hydrographic data also show that these three high-salinity water masses are advected into the bay by the Summer Monsoon Current, which is seen to be a deep current extending to 1000 m. These deep currents extend into the northern bay as well, providing a mechanism for spreading ASHSW, PGW, and RSW throughout the bay.
NASA Astrophysics Data System (ADS)
Piecuch, C. G.; Huybers, P. J.; Tingley, M.
2016-12-01
Sea level observations from coastal tide gauges are some of the longest instrumental records of the ocean. However, these data can be noisy, biased, and gappy, featuring missing values, and reflecting land motion and local effects. Coping with these issues in a formal manner is a challenging task. Some studies use Bayesian approaches to estimate sea level from tide gauge records, making inference probabilistically. Such methods are typically empirically Bayesian in nature: model parameters are treated as known and assigned point values. But, in reality, parameters are not perfectly known. Empirical Bayes methods thus neglect a potentially important source of uncertainty, and so may overestimate the precision (i.e., underestimate the uncertainty) of sea level estimates. We consider whether empirical Bayes methods underestimate uncertainty in sea level from tide gauge data, comparing to a full Bayes method that treats parameters as unknowns to be solved for along with the sea level field. We develop a hierarchical algorithm that we apply to tide gauge data on the North American northeast coast over 1893-2015. The algorithm is run in full Bayes mode, solving for the sea level process and parameters, and in empirical mode, solving only for the process using fixed parameter values. Error bars on sea level from the empirical method are smaller than from the full Bayes method, and the relative discrepancies increase with time; the 95% credible interval on sea level values from the empirical Bayes method in 1910 and 2010 is 23% and 56% narrower, respectively, than from the full Bayes approach. To evaluate the representativeness of the credible intervals, empirical Bayes and full Bayes methods are applied to corrupted data of a known surrogate field. 
Using rank histograms to evaluate the solutions, we find that the full Bayes method produces generally reliable error bars, whereas the empirical Bayes method gives too-narrow error bars, such that the 90% credible interval only encompasses 70% of true process values. Results demonstrate that parameter uncertainty is an important source of process uncertainty, and advocate for the fully Bayesian treatment of tide gauge records in ocean circulation and climate studies.
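The abstract's central point, that treating estimated parameters as known shrinks credible intervals below their nominal coverage, can be illustrated with a toy Monte Carlo sketch. The model here (a normal mean with unknown variance, plug-in normal interval vs. a t interval that integrates out the variance) and all numbers are illustrative, not the paper's sea level field:

```python
import random
import statistics

random.seed(1)
n, mu, sigma, trials = 5, 0.0, 1.0, 4000
nominal = 0.90
z = statistics.NormalDist().inv_cdf(0.5 + nominal / 2)  # ~1.645, variance treated as known
t4 = 2.1318  # tabulated Student-t quantile for df = n - 1 = 4

hits_emp = hits_full = 0
for _ in range(trials):
    y = [random.gauss(mu, sigma) for _ in range(n)]
    ybar = sum(y) / n
    se = statistics.stdev(y) / n ** 0.5
    hits_emp += abs(ybar - mu) <= z * se    # plug-in ("empirical Bayes"-style) interval
    hits_full += abs(ybar - mu) <= t4 * se  # interval that integrates out the variance
cov_emp = hits_emp / trials
cov_full = hits_full / trials
```

The plug-in interval covers the truth well below its nominal 90%, while the interval that accounts for parameter uncertainty is close to nominal, mirroring the too-narrow error bars the study reports for the empirical mode.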
NASA Astrophysics Data System (ADS)
Caticha, Ariel
2011-03-01
In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme.
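A minimal sketch of an ME-style update under an assumed toy setup: a uniform prior over die faces is updated to satisfy a mean constraint by minimizing relative entropy to the prior, whose solution is the exponential tilt p_i ∝ q_i e^(λ f_i) with λ found numerically:

```python
import math

faces = [1, 2, 3, 4, 5, 6]
q = [1 / 6] * 6        # prior: fair die
target = 4.5           # constraint on the expected face value

def tilted_mean(lam):
    """Mean of the exponentially tilted distribution p_i ∝ q_i e^(lam * f_i)."""
    w = [qi * math.exp(lam * f) for qi, f in zip(q, faces)]
    Z = sum(w)
    return sum(wi * f for wi, f in zip(w, faces)) / Z

lo, hi = -10.0, 10.0   # bisection on the Lagrange multiplier (mean is monotone in lam)
for _ in range(100):
    mid = (lo + hi) / 2
    if tilted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2
w = [qi * math.exp(lam * f) for qi, f in zip(q, faces)]
Z = sum(w)
p = [wi / Z for wi in w]   # posterior: closest distribution to q satisfying the constraint
```

The same machinery reduces to ordinary MaxEnt when the prior is uniform, and to Bayes' rule when the constraint pins down the data, which is the unification the tutorial describes.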
GPU MrBayes V3.1: MrBayes on Graphics Processing Units for Protein Sequence Data.
Pang, Shuai; Stones, Rebecca J; Ren, Ming-Ming; Liu, Xiao-Guang; Wang, Gang; Xia, Hong-ju; Wu, Hao-Yang; Liu, Yang; Xie, Qiang
2015-09-01
We present a modified GPU (graphics processing unit) version of MrBayes, called ta(MC)³ (GPU MrBayes V3.1), for Bayesian phylogenetic inference on protein data sets. Our main contributions are 1) utilizing 64-bit variables, thereby enabling ta(MC)³ to process larger data sets than MrBayes; and 2) using Kahan summation to improve accuracy, convergence rates, and consequently runtime. Versus the current fastest software, we achieve a speedup of up to around 2.5 (and up to around 90 vs. serial MrBayes), and more on multi-GPU hardware. GPU MrBayes V3.1 is available from http://sourceforge.net/projects/mrbayes-gpu/. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved.
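The Kahan (compensated) summation the authors credit for improved accuracy can be sketched in a few lines; the data are synthetic, chosen so that plain accumulation visibly loses the small terms:

```python
def kahan_sum(xs):
    """Compensated (Kahan) summation: a running correction term restores
    low-order bits that plain accumulation rounds away."""
    total = 0.0
    c = 0.0
    for x in xs:
        y = x - c
        t = total + y
        c = (t - total) - y   # what was lost when y was added to total
        total = t
    return total

xs = [1.0] + [1e-16] * 1_000_000  # one large term followed by many tiny ones

naive = 0.0
for x in xs:          # plain accumulation: every 1e-16 rounds away against 1.0
    naive += x

accurate = kahan_sum(xs)  # recovers the true sum, 1.0 + 1e-10
```

In a likelihood computation that accumulates millions of small per-site terms, this kind of lost precision compounds across MCMC iterations, which is why compensated summation can improve convergence behavior and not just the final digits.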
The anatomy of choice: active inference and agency.
Friston, Karl; Schwartenbeck, Philipp; Fitzgerald, Thomas; Moutoussis, Michael; Behrens, Timothy; Dolan, Raymond J
2013-01-01
This paper considers agency in the setting of embodied or active inference. In brief, we associate a sense of agency with prior beliefs about action and ask what sorts of beliefs underlie optimal behavior. In particular, we consider prior beliefs that action minimizes the Kullback-Leibler (KL) divergence between desired states and attainable states in the future. This allows one to formulate bounded rationality as approximate Bayesian inference that optimizes a free energy bound on model evidence. We show that constructs like expected utility, exploration bonuses, softmax choice rules and optimism bias emerge as natural consequences of this formulation. Previous accounts of active inference have focused on predictive coding and Bayesian filtering schemes for minimizing free energy. Here, we consider variational Bayes as an alternative scheme that provides formal constraints on the computational anatomy of inference and action-constraints that are remarkably consistent with neuroanatomy. Furthermore, this scheme contextualizes optimal decision theory and economic (utilitarian) formulations as pure inference problems. For example, expected utility theory emerges as a special case of free energy minimization, where the sensitivity or inverse temperature (of softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution-that minimizes free energy. This sensitivity corresponds to the precision of beliefs about behavior, such that attainable goals are afforded a higher precision or confidence. In turn, this means that optimal behavior entails a representation of confidence about outcomes that are under an agent's control.
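The softmax choice rule with a precision (inverse temperature) parameter mentioned above can be sketched as follows; the utilities are invented for illustration:

```python
import math

def softmax(utilities, precision=1.0):
    """Softmax choice rule: precision (inverse temperature) controls how
    deterministically the highest-utility option is selected."""
    scaled = [precision * u for u in utilities]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    Z = sum(exps)
    return [e / Z for e in exps]

u = [1.0, 2.0, 3.0]
low = softmax(u, precision=0.1)    # near-uniform: exploratory behavior
high = softmax(u, precision=10.0)  # near-deterministic: confident, exploitative behavior
```

Low precision spreads choice probability almost uniformly, while high precision concentrates it on the best option, which is the sense in which the paper ties precision to the confidence afforded to attainable goals.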
Engineering Management Board Tour VAB
2017-03-22
Members of NASA’s Engineering Management Board visit the Vehicle Assembly Building’s High Bay 3 at Kennedy Space Center in Florida. The platforms in High Bay 3, including the one on which the board members are standing, were designed to surround and provide access to NASA’s Space Launch System and Orion spacecraft. The Engineering Management Board toured integral areas of Kennedy to help the agencywide group reach its goal of unifying engineering work across NASA.
Engineering Management Board Tour VAB
2017-03-22
The view that members of NASA’s Engineering Management Board had looking up inside the Vehicle Assembly Building’s High Bay 3 at Kennedy Space Center in Florida. The platforms in High Bay 3, including the one on which the board members are standing, were designed to surround and provide access to NASA’s Space Launch System and Orion spacecraft. The Engineering Management Board toured integral areas of Kennedy to help the agencywide group reach its goal of unifying engineering work across NASA.
System and method for responding to ground and flight system malfunctions
NASA Technical Reports Server (NTRS)
Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)
2010-01-01
A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to different systems, subsystems, and components of the vehicle. The data stored is encoded in a tree-based structure. A query engine is coupled to the data repository. The query engine provides a user and automated interface and provides contextual query to the data repository. An inference engine is coupled to the query engine. The inference engine compares current anomaly data to contextual data stored in the data repository using inference rules. The inference engine generates a potential solution to the current anomaly by referencing the data stored in the data repository.
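A minimal sketch, with hypothetical subsystems and rules rather than anything from the patent, of the pattern the abstract describes: a tree-based repository, a contextual query function, and an inference step that matches current anomaly data against rules:

```python
repository = {  # tree-based store of vehicle system/subsystem/component data
    "propulsion": {"engine-1": {"temp_limit_C": 900}},
    "power": {"bus-A": {"min_volts": 24.0}},
}

def query(path):
    """Contextual query: walk the tree along a path like 'power/bus-A'."""
    node = repository
    for key in path.split("/"):
        node = node[key]
    return node

rules = [  # inference rules mapping anomaly context to a potential solution
    (lambda a: a["volts"] < query("power/bus-A")["min_volts"],
     "undervoltage on bus-A: shed non-critical loads"),
    (lambda a: a["temp_C"] > query("propulsion/engine-1")["temp_limit_C"],
     "engine-1 over temperature: throttle down"),
]

def infer(anomaly):
    """Return every potential solution whose rule matches the current anomaly."""
    return [solution for cond, solution in rules if cond(anomaly)]

solutions = infer({"volts": 22.5, "temp_C": 450})
```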
Engineering works and the tidal Chesapeake
NASA Technical Reports Server (NTRS)
Hargis, W. J., Jr.
1972-01-01
The tidal tributaries of the ocean and coastal areas of the mid-Atlantic region and the ecological significance of engineering projects are discussed. The effects of engineering works on maritime environments and resources, with the Chesapeake Bay as the area of prime interest are examined. Significant engineering projects, both actual and proposed, are described. The conflict of navigational demands and maintenance of an estuarine environment for commercial and sport fishing and recreation is described. Specific applications of remote sensors for analyzing ecological conditions of the bay are included.
Association of earthquakes and faults in the San Francisco Bay area using Bayesian inference
Wesson, R.L.; Bakun, W.H.; Perkins, D.M.
2003-01-01
Bayesian inference provides a method to use seismic intensity data or instrumental locations, together with geologic and seismologic data, to make quantitative estimates of the probabilities that specific past earthquakes are associated with specific faults. Probability density functions are constructed for the location of each earthquake, and these are combined with prior probabilities through Bayes' theorem to estimate the probability that an earthquake is associated with a specific fault. Results using this method are presented here for large, preinstrumental, historical earthquakes and for recent earthquakes with instrumental locations in the San Francisco Bay region. The probabilities for individual earthquakes can be summed to construct a probabilistic frequency-magnitude relationship for a fault segment. Other applications of the technique include the estimation of the probability of background earthquakes, that is, earthquakes not associated with known or considered faults, and the estimation of the fraction of the total seismic moment associated with earthquakes less than the characteristic magnitude. Results for the San Francisco Bay region suggest that potentially damaging earthquakes with magnitudes less than the characteristic magnitudes should be expected. Comparisons of earthquake locations and the surface traces of active faults as determined from geologic data show significant disparities, indicating that a complete understanding of the relationship between earthquakes and faults remains elusive.
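The core Bayesian computation the abstract describes, combining a location probability density with prior probabilities for candidate sources, might be sketched as follows. The fault names, distances, and priors are invented, and a Gaussian location likelihood is assumed purely for illustration:

```python
import math

def gaussian(d_km, sigma_km):
    """Assumed likelihood of an epicenter estimate a distance d from a fault."""
    return math.exp(-0.5 * (d_km / sigma_km) ** 2)

candidates = {  # hypothetical candidate sources for one historical earthquake
    "fault A":    {"prior": 0.35, "dist_km": 3.0},
    "fault B":    {"prior": 0.35, "dist_km": 12.0},
    "background": {"prior": 0.30, "dist_km": 0.0},  # diffuse source, flat likelihood
}
sigma = 5.0  # assumed epicenter location uncertainty (km)

unnorm = {}
for name, c in candidates.items():
    like = 1.0 if name == "background" else gaussian(c["dist_km"], sigma)
    unnorm[name] = c["prior"] * like        # Bayes' theorem: prior x likelihood
Z = sum(unnorm.values())
posterior = {name: u / Z for name, u in unnorm.items()}
```

Summing such posteriors over many earthquakes is what lets the authors build probabilistic frequency-magnitude relationships per fault and estimate the background-earthquake fraction.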
5. VIEW TO THE SOUTHEAST OF THE HOT BAY AND ...
5. VIEW TO THE SOUTHEAST OF THE HOT BAY AND ATTACHED OPERATING GALLERIES ALONG THE WEST SIDE OF THE BAY. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
Generic comparison of protein inference engines.
Claassen, Manfred; Reiter, Lukas; Hengartner, Michael O; Buhmann, Joachim M; Aebersold, Ruedi
2012-04-01
Protein identifications, rather than peptide-spectrum matches, constitute the biologically relevant result of shotgun proteomics studies. How to appropriately infer and report protein identifications has triggered a still ongoing debate, which has so far suffered from the lack of appropriate performance measures for objectively assessing protein inference approaches. This study describes an intuitive, generic and yet formal performance measure and demonstrates how it enables experimentalists to select an optimal protein inference strategy for a given collection of fragment ion spectra. We applied the performance measure to systematically explore the benefit of excluding possibly unreliable protein identifications, such as single-hit wonders. To this end, we defined a family of protein inference engines by extending a simple inference engine with thousands of pruning variants, each excluding a different specified set of possibly unreliable identifications. We benchmarked these protein inference engines on several data sets representing different proteomes and mass spectrometry platforms. Optimally performing inference engines retained all high-confidence spectral evidence, without posterior exclusion of any type of protein identification. Although the diverse data sets studied consistently support this rule, other data sets might behave differently. To ensure maximal reliable proteome coverage for data sets arising in other studies, we advocate abstaining from rigid protein inference rules, such as exclusion of single-hit wonders, and instead considering several protein inference approaches and assessing them with respect to the presented performance measure in the specific application context.
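A toy sketch of the "family of inference engines" idea: a base engine that reports any protein with peptide evidence, and a pruning variant that excludes single-hit wonders. The peptide-to-protein mappings are invented:

```python
peptide_to_proteins = {  # hypothetical peptide-spectrum matches -> candidate proteins
    "PEPTIDEA": ["P1"],
    "PEPTIDEB": ["P1", "P2"],
    "PEPTIDEC": ["P3"],
}

def infer_proteins(psms, min_peptides=1):
    """Report proteins supported by at least min_peptides distinct peptides."""
    hits = {}
    for peptide, proteins in psms.items():
        for protein in proteins:
            hits.setdefault(protein, set()).add(peptide)
    return {p for p, peps in hits.items() if len(peps) >= min_peptides}

base = infer_proteins(peptide_to_proteins)                    # keeps all evidence
pruned = infer_proteins(peptide_to_proteins, min_peptides=2)  # drops single-hit wonders
```

The pruning variant discards P3 even though its single peptide may be perfectly good evidence, which illustrates why the study found that retaining all high-confidence spectral evidence performed best.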
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-21
... is made for clean vessel deck wash down, clean vessel engine cooling water, clean vessel generator cooling water, clean bilge water, anchor wash, or vessel engine or generator exhaust. Second, in the Muli... Atmospheric Administration 15 CFR Part 922 Expansion of Fagatele Bay National Marine Sanctuary, Regulatory...
From empirical Bayes to full Bayes : methods for analyzing traffic safety data.
DOT National Transportation Integrated Search
2004-10-24
Traffic safety engineers are among the early adopters of Bayesian statistical tools for : analyzing crash data. As in many other areas of application, empirical Bayes methods were : their first choice, perhaps because they represent an intuitively ap...
QTL fine mapping with Bayes C(π): a simulation study.
van den Berg, Irene; Fritz, Sébastien; Boichard, Didier
2013-06-19
Accurate QTL mapping is a prerequisite in the search for causative mutations. Bayesian genomic selection models that analyse many markers simultaneously should provide more accurate QTL detection results than single-marker models. Our objectives were to (a) evaluate by simulation the influence of heritability, number of QTL and number of records on the accuracy of QTL mapping with Bayes Cπ and Bayes C; (b) estimate the QTL status (homozygous vs. heterozygous) of the individuals analysed. This study focussed on the ten largest detected QTL, assuming they are candidates for further characterization. Our simulations were based on a true dairy cattle population genotyped for 38,277 phased markers. Some of these markers were considered biallelic QTL and used to generate corresponding phenotypes. Different numbers of records (4387 and 1500), heritability values (0.1, 0.4 and 0.7) and numbers of QTL (10, 100 and 1000) were studied. QTL detection was based on the posterior inclusion probability for individual markers, or on the sum of the posterior inclusion probabilities for consecutive markers, estimated using Bayes C or Bayes Cπ. The QTL status of the individuals was derived from the contrast between the sums of the SNP allelic effects of their chromosomal segments. The proportion of markers with null effect (π) frequently did not reach convergence, leading to poor results for Bayes Cπ in QTL detection. Fixing π led to better results. Detection of the largest QTL was most accurate for medium to high heritability, for low to moderate numbers of QTL, and with a large number of records. The QTL status was accurately inferred when the distribution of the contrast between chromosomal segment effects was bimodal. QTL detection is feasible with Bayes C. For QTL detection, it is recommended to use a large dataset and to focus on highly heritable traits and on the largest QTL. 
QTL statuses were inferred based on the distribution of the contrast between chromosomal segment effects.
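The detection statistic described above, summing posterior inclusion probabilities over consecutive markers so that a QTL signal spread across nearby SNPs is still captured, can be sketched with synthetic probabilities:

```python
# Synthetic per-SNP posterior inclusion probabilities (not from the study);
# a QTL whose effect is shared among markers 2-4 shows no single dominant SNP.
incl = [0.02, 0.05, 0.30, 0.40, 0.25, 0.03, 0.01, 0.02]

def window_scores(probs, width=3):
    """Sum posterior inclusion probabilities over windows of consecutive markers."""
    return [sum(probs[i:i + width]) for i in range(len(probs) - width + 1)]

scores = window_scores(incl)
best = max(range(len(scores)), key=scores.__getitem__)  # window with the strongest summed signal
```

No individual marker reaches a convincing inclusion probability here, but the window starting at marker 2 accumulates most of the posterior mass, which is the rationale for the windowed criterion.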
ERIC Educational Resources Information Center
Philp, Michael J.
1978-01-01
Anne Arundel Community College uses the Chesapeake Bay for a flexible ocean engineering technology program which includes mechanical, electrical, and environmental options for transfer and/or vocational students, and adult education programs covering such subjects as sailing, Bay history, boat building, scuba-diving, and marine biology. (RT)
20. San FranciscoOakland Bay Bridge contract recipients, April 28, 1933, ...
20. San Francisco-Oakland Bay Bridge contract recipients, April 28, 1933, photographer unknown. Standing, left to right: Edward J. Schneider, Columbia Steel Corporation; C.C. Horton, Healy-Tibbitts Construction Company; Henry J. Kaiser, Bridge Builders, Inc.; Albert Huber, Clinton Construction Company; Allan McDonald, Transbay Construction Company; C.C. Carleton, Chief, Division of Contracts and Rights of Way, California Department of Public Works. Seated, left to right: Henry J. Brunnier, Consulting Engineer, Member of Consulting Board, San Francisco-Oakland Bay Bridge; Charles E. Andrew, Bridge Engineer, San Francisco-Oakland Bay Bridge; Earl Lee Kelly, Director, California Department of Public Works; Harrison S. Robinson, President, Financial ... - Salt River Bridge, Spanning Salt River at Dillon Road, Ferndale, Humboldt County, CA
Efficient Implementation of MrBayes on Multi-GPU
Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-01-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC or (MC)³), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)³ Bayesian algorithm and its improved and parallel versions are still not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)³ (aMCMCMC), of MrBayes (MC)³ for compute unified device architecture (CUDA). By dynamically adjusting the task granularity to adapt to input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new “node-by-node” task scheduling strategy is developed to improve concurrency, and several optimizing methods are used to reduce extra overhead. Experimental results show that a(MC)³ achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)³ is dramatically faster than all previous (MC)³ algorithms and scales well to large GPU clusters. PMID:23493260
Efficient implementation of MrBayes on multi-GPU.
Bao, Jie; Xia, Hongju; Zhou, Jianfu; Liu, Xiaoguang; Wang, Gang
2013-06-01
MrBayes, using Metropolis-coupled Markov chain Monte Carlo (MCMCMC or (MC)³), is a popular program for Bayesian inference. As a leading method of using DNA data to infer phylogeny, the (MC)³ Bayesian algorithm and its improved and parallel versions are still not fast enough for biologists to analyze massive real-world DNA data. Recently, the graphics processing unit (GPU) has shown its power as a coprocessor (or rather, an accelerator) in many fields. This article describes an efficient implementation, a(MC)³ (aMCMCMC), of MrBayes (MC)³ for compute unified device architecture (CUDA). By dynamically adjusting the task granularity to adapt to input data size and hardware configuration, it makes full use of GPU cores with different data sets. An adaptive method is also developed to split and combine DNA sequences to make full use of a large number of GPU cards. Furthermore, a new "node-by-node" task scheduling strategy is developed to improve concurrency, and several optimizing methods are used to reduce extra overhead. Experimental results show that a(MC)³ achieves up to 63× speedup over serial MrBayes on a single machine with one GPU card, up to 170× speedup with four GPU cards, and up to 478× speedup with a 32-node GPU cluster. a(MC)³ is dramatically faster than all previous (MC)³ algorithms and scales well to large GPU clusters.
HOT CELL BUILDING, TRA632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA ...
HOT CELL BUILDING, TRA-632. CONTEXTUAL VIEW ALONG WALLEYE AVENUE, CAMERA FACING EASTERLY. HOT CELL BUILDING IS AT CENTER LEFT OF VIEW; THE LOW-BAY PROJECTION WITH LADDER IS THE TEST TRAIN ASSEMBLY FACILITY, ADDED IN 1968. MTR BUILDING IS IN LEFT OF VIEW. HIGH-BAY BUILDING AT RIGHT IS THE ENGINEERING TEST REACTOR BUILDING, TRA-642. INL NEGATIVE NO. HD46-32-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Example of a Bayes network of relations among visual features
NASA Astrophysics Data System (ADS)
Agosta, John M.
1991-10-01
Bayes probability networks, also termed 'influence diagrams,' promise to be a versatile, rigorous, and expressive uncertainty reasoning tool. This paper presents an example of how a Bayes network can express constraints among visual hypotheses. An example is presented of a model composed of cylindric primitives, inferred from a line drawing of a plumbing fixture. Conflict between interpretations of candidate cylinders is expressed by two parameters, one for the presence and one for the absence of visual evidence of their intersection. It is shown how 'partial exclusion' relations are generated and how they determine the degree of competition among the set of hypotheses. Solving this network yields the assemblies of cylinders most likely to form an object.
An inference engine for embedded diagnostic systems
NASA Technical Reports Server (NTRS)
Fox, Barry R.; Brewster, Larry T.
1987-01-01
The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logic statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines which accept assertions of fact and return the conclusions that necessarily follow. Given a set of assertions, it generates exactly the conclusions which logically follow, and at the same time detects any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are simple and well understood, and are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
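A minimal sketch of the two-part design: rules "compiled" into a simple data structure, then an engine that forward-chains from asserted facts to exactly the conclusions that follow, flagging inconsistencies. The rules are hypothetical, not from the paper:

```python
rules = [  # (antecedents, consequent); a consequent "~x" means not-x
    ({"overheat"}, "shutdown"),
    ({"shutdown"}, "~running"),
    ({"throttle_up"}, "running"),
]

def infer(facts):
    """Forward-chain to a fixed point; report any derived contradiction p and ~p."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for ante, cons in rules:
            if ante <= known and cons not in known:
                known.add(cons)
                changed = True
    inconsistent = any(f.startswith("~") and f[1:] in known for f in known)
    return known, inconsistent

conclusions, bad = infer({"overheat"})            # consistent: shutdown, ~running
_, conflict = infer({"overheat", "throttle_up"})  # derives both running and ~running
```

Because the rule set is fixed at "compile" time, the engine's memory use is bounded by the number of distinct propositions, echoing the fixed-memory, bounded-worst-case property the abstract emphasizes.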
33 CFR 334.930 - Anaheim Bay Harbor, Calif.; Naval Weapons Station, Seal Beach.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Weapons Station, Seal Beach. 334.930 Section 334.930 Navigation and Navigable Waters CORPS OF ENGINEERS... Bay Harbor, Calif.; Naval Weapons Station, Seal Beach. (a) The restricted area. The water of Anaheim Bay Harbor between the east and west jetties at the United States Naval Weapons Station, Seal Beach...
33 CFR 334.930 - Anaheim Bay Harbor, Calif.; Naval Weapons Station, Seal Beach.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Weapons Station, Seal Beach. 334.930 Section 334.930 Navigation and Navigable Waters CORPS OF ENGINEERS... Bay Harbor, Calif.; Naval Weapons Station, Seal Beach. (a) The restricted area. The water of Anaheim Bay Harbor between the east and west jetties at the United States Naval Weapons Station, Seal Beach...
33 CFR 334.930 - Anaheim Bay Harbor, Calif.; Naval Weapons Station, Seal Beach.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Weapons Station, Seal Beach. 334.930 Section 334.930 Navigation and Navigable Waters CORPS OF ENGINEERS... Bay Harbor, Calif.; Naval Weapons Station, Seal Beach. (a) The restricted area. The water of Anaheim Bay Harbor between the east and west jetties at the United States Naval Weapons Station, Seal Beach...
33 CFR 334.930 - Anaheim Bay Harbor, Calif.; Naval Weapons Station, Seal Beach.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Weapons Station, Seal Beach. 334.930 Section 334.930 Navigation and Navigable Waters CORPS OF ENGINEERS... Bay Harbor, Calif.; Naval Weapons Station, Seal Beach. (a) The restricted area. The water of Anaheim Bay Harbor between the east and west jetties at the United States Naval Weapons Station, Seal Beach...
33 CFR 334.930 - Anaheim Bay Harbor, Calif.; Naval Weapons Station, Seal Beach.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Weapons Station, Seal Beach. 334.930 Section 334.930 Navigation and Navigable Waters CORPS OF ENGINEERS... Bay Harbor, Calif.; Naval Weapons Station, Seal Beach. (a) The restricted area. The water of Anaheim Bay Harbor between the east and west jetties at the United States Naval Weapons Station, Seal Beach...
33 CFR 334.1320 - Kuluk Bay, Adak, Alaska; naval restricted area.
Code of Federal Regulations, 2011 CFR
2011-07-01
... restricted area. 334.1320 Section 334.1320 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1320 Kuluk Bay, Adak, Alaska; naval restricted area. (a) The area. The northwest portion of Kuluk Bay bounded as follows...
33 CFR 334.1320 - Kuluk Bay, Adak, Alaska; naval restricted area.
Code of Federal Regulations, 2014 CFR
2014-07-01
... restricted area. 334.1320 Section 334.1320 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1320 Kuluk Bay, Adak, Alaska; naval restricted area. (a) The area. The northwest portion of Kuluk Bay bounded as follows...
33 CFR 334.1325 - United States Army Restricted Area, Kuluk Bay, Adak, Alaska.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Area, Kuluk Bay, Adak, Alaska. 334.1325 Section 334.1325 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1325 United States Army Restricted Area, Kuluk Bay, Adak, Alaska. (a) The area. The area within a...
33 CFR 334.1320 - Kuluk Bay, Adak, Alaska; naval restricted area.
Code of Federal Regulations, 2013 CFR
2013-07-01
... restricted area. 334.1320 Section 334.1320 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1320 Kuluk Bay, Adak, Alaska; naval restricted area. (a) The area. The northwest portion of Kuluk Bay bounded as follows...
33 CFR 334.1320 - Kuluk Bay, Adak, Alaska; naval restricted area.
Code of Federal Regulations, 2010 CFR
2010-07-01
... restricted area. 334.1320 Section 334.1320 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1320 Kuluk Bay, Adak, Alaska; naval restricted area. (a) The area. The northwest portion of Kuluk Bay bounded as follows...
33 CFR 334.1325 - United States Army Restricted Area, Kuluk Bay, Adak, Alaska.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Area, Kuluk Bay, Adak, Alaska. 334.1325 Section 334.1325 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1325 United States Army Restricted Area, Kuluk Bay, Adak, Alaska. (a) The area. The area within a...
33 CFR 334.1325 - United States Army Restricted Area, Kuluk Bay, Adak, Alaska.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Area, Kuluk Bay, Adak, Alaska. 334.1325 Section 334.1325 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1325 United States Army Restricted Area, Kuluk Bay, Adak, Alaska. (a) The area. The area within a...
33 CFR 334.1320 - Kuluk Bay, Adak, Alaska; naval restricted area.
Code of Federal Regulations, 2012 CFR
2012-07-01
... restricted area. 334.1320 Section 334.1320 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.1320 Kuluk Bay, Adak, Alaska; naval restricted area. (a) The area. The northwest portion of Kuluk Bay bounded as follows...
Derivation of Delaware Bay tidal parameters from Space Shuttle photography
NASA Technical Reports Server (NTRS)
Zheng, Quanan; Yan, Xiao-Hai; Klemas, Vic
1993-01-01
The tide-related parameters of the Delaware Bay are derived from Space Shuttle time-series photographs. The water areas in the bay are measured from interpretation maps of the photographs with a CALCOMP 9100 digitizer and ERDAS Image Processing System. The corresponding tidal levels are calculated using the exposure time annotated on the photographs. From these data, an approximate function relating the water area to the tidal level at a reference point is determined. Based on the function, the water areas of the Delaware Bay at mean high water (MHW) and mean low water (MLW), below 0 m, and for the tidal zone are inferred. With MHW and MLW areas and the mean tidal range, we calculate the tidal influx of the Delaware Bay, which is 2.76 × 10⁹ m³. The velocity of flood tide at the bay mouth is determined using the tidal flux and an integral of the velocity distribution function at the cross section between Cape Henlopen and Cape May. The result is 132 cm/s, which compares well with the data on tidal current charts.
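The influx-to-velocity arithmetic can be reproduced approximately. The bay-mouth width, depth, and flood duration below are assumed round numbers, standing in for the paper's integrated velocity distribution:

```python
import math

influx_m3 = 2.76e9       # tidal influx of Delaware Bay reported in the abstract
flood_s = 6.2 * 3600     # assumed flood duration, roughly half an M2 tidal period
width_m, depth_m = 18e3, 8.0  # assumed bay-mouth cross-section (round numbers)

discharge_m3s = influx_m3 / flood_s             # mean flood discharge
mean_v = discharge_m3s / (width_m * depth_m)    # cross-section mean velocity, m/s
peak_cm_s = mean_v * math.pi / 2 * 100          # peak of a sinusoidal tide, cm/s
```

With these assumptions the peak flood velocity comes out in the same range as the reported 132 cm/s, showing that the published figure is consistent with the stated influx and a plausible cross-section.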
High-resolution marine seismic reflection data from the San Francisco Bay area
Childs, Jonathan R.; Hart, Patrick; Bruns, Terry R.; Marlow, Michael S.; Sliter, Ray
2000-01-01
Between 1993 and 1997, the U.S. Geological Survey acquired high-resolution, marine seismic-reflection profile data across submerged portions of known and inferred upper crustal fault zones throughout the greater San Francisco Bay area. Surveys were conducted over south San Francisco Bay in the vicinity of the San Bruno shoal (roughly between the San Francisco and Oakland airports), over the offshore extension of the San Andreas fault system west of the Golden Gate, over the Hayward fault to Rodgers Creek fault step-over in San Pablo Bay, and over the Kirby Hills fault where it crosses the western Sacramento Delta. Reconnaissance profiles were acquired elsewhere throughout the San Francisco and San Pablo Bays. These data were acquired by the U.S. Geological Survey, Western Coastal and Marine Geology Team, under the auspices of the Central California/San Francisco Bay Earthquake Hazards Project. Analysis and interpretation of some of these profiles have been published by Marlow and others (1996, 1999). Further analysis and interpretation of these data are available in the USGS Professional Paper Crustal Structure of the Coastal and Marine San Francisco Bay Region, T. Parsons, editor, http://geopubs.wr.usgs.gov/prof-paper/pp1658/
NASA Astrophysics Data System (ADS)
Lane, Emily M.; Borrero, Jose; Whittaker, Colin N.; Bind, Jo; Chagué-Goff, Catherine; Goff, James; Goring, Derek; Hoyle, Jo; Mueller, Christof; Power, William L.; Reid, Catherine M.; Williams, James H.; Williams, Shaun P.
2017-05-01
At 12:02:56 a.m. on Monday, November 14, 2016 NZDT (11:02:56 a.m., November 13, 2016 UTC), a magnitude 7.8 earthquake struck near Kaikōura on the north-eastern coast of the South Island of New Zealand. This earthquake caused a tsunami along New Zealand's east coast that was recorded on a number of sea level gauges. Outside of the Kaikōura region, north-facing bays along Banks Peninsula were most affected by the tsunami. Of these, Little Pigeon Bay experienced extensive inundation, and an unoccupied cottage was destroyed by the wave run-up. We report on the inundation extent and (inferred) flow directions at Little Pigeon Bay, including a study of temporal changes in the field evidence of this inundation. Preliminary modelling results indicate that the waves may have excited resonance in the bay. We also present results from inundation surveys of nearby, north-facing bays on Banks Peninsula. The excitation of resonance in Little Pigeon Bay provides an explanation for the more severe inundation and damage there in comparison to these nearby bays.
Action understanding and active inference
Mattout, Jérémie; Kilner, James
2012-01-01
We have suggested that the mirror-neuron system might be usefully understood as implementing Bayes-optimal perception of actions emitted by oneself or others. To substantiate this claim, we present neuronal simulations that show the same representations can prescribe motor behavior and encode motor intentions during action observation. These simulations are based on the free-energy formulation of active inference, which is formally related to predictive coding. In this scheme, (generalised) states of the world are represented as trajectories. When these states include motor trajectories they implicitly entail intentions (future motor states). Optimizing the representation of these intentions enables predictive coding in a prospective sense. Crucially, the same generative models used to make predictions can be deployed to predict the actions of self or others by simply changing the bias or precision (i.e. attention) afforded to proprioceptive signals. We illustrate these points using simulations of handwriting, which demonstrate neuronally plausible generation and recognition of itinerant (wandering) motor trajectories. We then use the same simulations to produce synthetic electrophysiological responses to violations of intentional expectations. Our results affirm that a Bayes-optimal approach provides a principled framework, which accommodates current thinking about the mirror-neuron system. Furthermore, it endorses the general formulation of action as active inference. PMID:21327826
Pushing open-ocean organic paleo-environmental proxies to the margin: Narragansett Bay, RI
NASA Astrophysics Data System (ADS)
Salacup, J. M.; Herbert, T.; Prell, W. L.
2010-12-01
Estuarine sediment deposits provide an under-utilized opportunity to reconstruct high-resolution records of environmental change from the highly sensitive intersection of oceanic and terrestrial systems. Previous applications of both well-established and novel organic geochemical proxies to estuaries have met with mixed success. Compared to oceanic settings, the large dynamic range of tidal currents, water temperature, salinity, nutrients, and productivity both enriches and complicates estuarine sedimentary records. Here, we present the results of monthly samples of water-column particulate organic matter and compare them to a suite of sediment cores in an effort to elucidate how the environmental signal produced in the water column is translated to the sediment. Specifically, we measured alkenones and glycerol dialkyl glycerol tetraethers (GDGTs), the bases for the Uk’37 sea-surface temperature (SST) and C37 total primary productivity proxies, and the TEX86 SST and BIT Index proxies, respectively. Alkenones, produced by haptophyte algae, are present in most of our water-column samples; however, concentrations in many samples are too low to reliably calculate temperature. When reliable, water-column alkenones yield SST estimates of 13-16°C, consistent with sediment core-top Uk’37 SST estimates. These correspond to May and October SSTs, coinciding with the terminations of the summer-fall and winter-spring algal blooms in Narragansett Bay. In contrast to alkenone fingerprints reported from the much lower salinity Chesapeake Bay, Narragansett Bay samples lack significant contributions of the C37:4 ketone, consistent with production by open-ocean haptophytes. Notably, sedimentary records of Uk’37-inferred SST show strong inter-core centennial-to-decadal coherence. The structure and absolute values of inferred SSTs correlate well with instrumental mean Sept-Oct air temperatures back to 1895, and contain structure consistent with the late Little Ice Age and 20th century warming.
Our record indicates that the past 100 years in Narragansett Bay are the warmest in at least the last 500. Water-column values of the GDGT-based BIT Index, a proxy for the delivery of terrestrial organic matter (TOM), decrease down-Bay with distance from the major rivers. However, absolute values of the index, exceeding 0.8-0.9 in the upper half of the Bay, are more consistent with soil samples than water and are hard to reconcile with high levels of marine productivity. Sedimentary values of the BIT Index are also high, between ~0.4 and 0.8, and their profiles suggest that Colonization (~1700) and Industrialization (~1850) altered terrestrial sediment delivery to the Bay. Such high values of the BIT Index suggest that the utility of TEX86 may be complicated by terrestrial GDGT contributions. Indeed, sedimentary values for TEX86 are highly variable and show little correlation with Uk’37 or instrumental records. Our results confirm the utility of organic geochemical proxies in estuarine settings while advocating the application of more than one. In Narragansett Bay, this approach has allowed the reconstruction of historically and climatically important local events such as the impacts of European settlement, the Industrial Revolution, and 20th century warming.
33 CFR 334.720 - Gulf of Mexico, south from Choctawhatchee Bay; Missile test area.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Gulf of Mexico, south from Choctawhatchee Bay; Missile test area. 334.720 Section 334.720 Navigation and Navigable Waters CORPS OF ENGINEERS... Mexico, south from Choctawhatchee Bay; Missile test area. (a) The danger zone. The danger zone shall...
Photocopy of drawing (original drawing of MacDill Field in possession ...
Photocopy of drawing (original drawing of MacDill Field in possession of MacDill Air Force Base, Civil Engineering, Tampa, Florida; site plan dated December, 1942) BASE LAYOUT, DECEMBER 1942 - MacDill Air Force Base, Bounded by City of Tampa North, Tampa Bay South, Old Tampa Bay West, & Hillsborough Bay East, Tampa, Hillsborough County, FL
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
... proposes the establishment of a Regulated Navigation Area (RNA) at the Youngs Bay PacifiCorp property in Astoria, OR. This RNA is necessary to preserve the integrity of an engineered sediment cap as part of an Oregon Department of Environmental Quality (DEQ) required remedial action. This proposed RNA will do so...
Han, Hyemin; Park, Joonsuk
2018-01-01
Recent debates about the conventional threshold used in the fields of neuroscience and psychology, namely P < 0.05, have spurred researchers to consider alternative ways to analyze fMRI data. A group of methodologists and statisticians have considered Bayesian inference as a candidate methodology. However, few previous studies have attempted to provide end users of fMRI analysis tools, such as SPM 12, with practical guidelines about how to conduct Bayesian inference. In the present study, we aim to demonstrate how to utilize Bayesian inference, Bayesian second-level inference in particular, implemented in SPM 12 by analyzing fMRI data available to the public via NeuroVault. In addition, to help end users understand how Bayesian inference actually works in SPM 12, we examine outcomes from Bayesian second-level inference implemented in SPM 12 by comparing them with those from classical second-level inference. Finally, we provide practical guidelines about how to set the parameters for Bayesian inference and how to interpret the results, such as Bayes factors, from the inference. We also discuss the practical and philosophical benefits of Bayesian inference and directions for future research. PMID:29456498
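Since this record turns on the contrast between P-values and Bayes factors, a minimal numerical sketch may help. This is not SPM 12's computation; it is the widely used BIC approximation to the Bayes factor for adding one regression predictor, with the sample sizes and R² values invented for illustration:

```python
import math

def bic_bayes_factor(n, r2):
    """Approximate BF01 (evidence for the null over the alternative)
    for adding one predictor explaining a fraction r2 of the variance,
    via BF01 ~ exp((BIC_alt - BIC_null) / 2)."""
    delta_bic = n * math.log(1.0 - r2) + math.log(n)  # one extra parameter
    return math.exp(delta_bic / 2.0)

# A weak effect in a small sample: the Bayes factor mildly favors the null.
bf01_weak = bic_bayes_factor(n=20, r2=0.10)
# A strong effect in a larger sample: overwhelming evidence against the null.
bf01_strong = bic_bayes_factor(n=100, r2=0.30)
```

Unlike a P-value, BF01 can quantify evidence for the null hypothesis, which is one of the practical benefits of Bayesian inference the authors discuss.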
The Irish glaciated margin: processes and environments of deglaciation.
NASA Astrophysics Data System (ADS)
McCarron, Stephen; Monteys, Xavier; Scott, Gill
2015-04-01
High resolution bathymetric data for Donegal Bay and parts of the western Irish Continental Shelf have become available in recent years due to the Irish National Seabed Survey [INSS] (2000-2009). Relative to onshore, glacigenic landform preservation and visibility on the shelf and on the floor of Donegal Bay are excellent. Here we describe some of these data, paying particular attention to the area close to the north Mayo coastline. We discuss inferred connections between well-exposed and age-constrained glacial geology along the coastal fringe and the submarine evidence of deglacial processes and timing. It is argued that the sediment and landform assemblage within the Bay is derived from multiple, lobate extensions of the last British Irish Ice Sheet into the Donegal Bay topographic low from source areas to the southeast (north Mayo) and east/northeast (Sligo and Donegal/Fermanagh) during overall deglaciation (Termination 1).
Anthropogenic Eutrophication of Narragansett Bay: Evidence from Dated Sediment Cores
The organic matter preserved in estuarine sediments provides a number of useful indicators, or "proxies," that can be used to infer paleoenvironmental changes. One type of paleoenvironmental change is anthropogenic eutrophication. The human activity largely responsible for increasi...
Application of temporal LNC logic in artificial intelligence
NASA Astrophysics Data System (ADS)
Adamek, Marek; Mulawka, Jan
2016-09-01
This paper presents a temporal logic inference engine developed at our university. It is an attempt to demonstrate the implementation and practical application of the temporal logic LNC developed at Cardinal Stefan Wyszynski University in Warsaw. The paper describes the fundamentals of LNC logic and the architecture and implementation of the inference engine. The practical application is shown by providing a solution, in terms of LNC logic, to the Missionaries and Cannibals problem, which is popular in Artificial Intelligence. Both the problem formulation and the inference engine are described in detail.
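For readers unfamiliar with the benchmark problem, Missionaries and Cannibals is a small state-space search. The following is a plain breadth-first-search sketch for comparison only, not the paper's LNC temporal-logic encoding:

```python
from collections import deque

def solve_missionaries_cannibals():
    """BFS over states (missionaries, cannibals, boat) on the start bank;
    the boat carries one or two people, and missionaries must never be
    outnumbered by cannibals on either bank."""
    start, goal = (3, 3, 1), (0, 0, 0)
    moves = [(1, 0), (2, 0), (0, 1), (0, 2), (1, 1)]

    def safe(m, c):
        return (m == 0 or m >= c) and (3 - m == 0 or 3 - m >= 3 - c)

    frontier, seen = deque([(start, [start])]), {start}
    while frontier:
        (m, c, b), path = frontier.popleft()
        if (m, c, b) == goal:
            return path
        for dm, dc in moves:
            # The boat always moves away from its current bank.
            nm, nc = (m - dm, c - dc) if b else (m + dm, c + dc)
            state = (nm, nc, 1 - b)
            if 0 <= nm <= 3 and 0 <= nc <= 3 and safe(nm, nc) and state not in seen:
                seen.add(state)
                frontier.append((state, path + [state]))
    return None

solution = solve_missionaries_cannibals()
```

BFS guarantees a shortest solution, which for this puzzle takes eleven river crossings.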
Bayesian Inference in Satellite Gravity Inversion
NASA Technical Reports Server (NTRS)
Kis, K. I.; Taylor, Patrick T.; Wittmann, G.; Kim, Hyung Rae; Torony, B.; Mayer-Guerr, T.
2005-01-01
To solve a geophysical inverse problem means applying measurements to determine the parameters of the selected model. Here the inverse problem is formulated as Bayesian inference, with Gaussian probability density functions applied in Bayes' equation. The CHAMP satellite gravity data are determined at an altitude of 400 kilometers over the southern part of the Pannonian basin. The model of interpretation is a right vertical cylinder. The parameters of the model are obtained from the minimization problem, which is solved by the Simplex method.
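The structure of such an inversion can be sketched in one dimension with hypothetical numbers: with a Gaussian likelihood and a Gaussian prior, the negative log-posterior is quadratic, its minimizer has a closed form, and a crude grid search (standing in here for the Simplex method) recovers it.

```python
import math

d, sigma = 4.0, 1.0   # hypothetical measurement and noise s.d.
mu0, tau = 0.0, 2.0   # hypothetical Gaussian prior on the parameter

def neg_log_post(theta):
    # Up to a constant: -log[ likelihood(theta) * prior(theta) ]
    return 0.5 * ((d - theta) / sigma) ** 2 + 0.5 * ((theta - mu0) / tau) ** 2

# Closed-form minimizer: the precision-weighted mean of datum and prior.
theta_map = (d / sigma**2 + mu0 / tau**2) / (1 / sigma**2 + 1 / tau**2)

# Numerical minimization over a grid, in place of the Simplex search.
theta_grid = min((i * 0.001 for i in range(-2000, 8001)), key=neg_log_post)
```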
High Resolution Soil Water from Regional Databases and Satellite Images
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Smelyanskly, Vadim N.; Coughlin, Joseph; Dungan, Jennifer; Clancy, Daniel (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on the ways in which plant growth can be inferred from satellite data and then used to infer soil water. There are several steps in this process, the first of which is the acquisition of data from satellite observations and relevant information databases such as the State Soil Geographic Database (STATSGO). Then probabilistic analysis and inversion with Bayes' theorem reveal sources of uncertainty. The Markov chain Monte Carlo method is also used.
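The Markov chain Monte Carlo step can be illustrated with a random-walk Metropolis sampler for the mean of a Gaussian; the data and prior below are invented for illustration, not STATSGO values:

```python
import math
import random

random.seed(0)
data = [2.1, 1.9, 2.4, 2.2, 1.8]  # hypothetical observations

def log_post(mu):
    # Unit-variance Gaussian likelihood with a broad N(0, 10^2) prior.
    return -0.5 * sum((x - mu) ** 2 for x in data) - 0.5 * (mu / 10.0) ** 2

mu, samples = 0.0, []
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.5)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop                               # Metropolis accept
    samples.append(mu)

posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

After discarding burn-in, the sample mean approximates the posterior mean, here close to the data mean of 2.08 because the prior is weak.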
Bayesian Model Selection in Geophysics: The evidence
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2016-12-01
Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. By Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data, D, is equal to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet it is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method and the Laplace-Metropolis method.
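The role of the evidence P(D) is easiest to see in one dimension, where a Gaussian prior and likelihood give a closed-form value against which a brute-force numerical integration can be checked (all numbers hypothetical):

```python
import math

d = 1.5  # a single hypothetical observation

def likelihood(theta):          # P(D | theta): Gaussian, s.d. 1
    return math.exp(-0.5 * (d - theta) ** 2) / math.sqrt(2 * math.pi)

def prior(theta):               # P(theta): Gaussian, s.d. 2
    return math.exp(-0.5 * (theta / 2.0) ** 2) / (2.0 * math.sqrt(2 * math.pi))

# Brute-force numerical integration of P(D) = integral of P(D|theta) P(theta).
h, lo = 0.001, -10.0
evidence = sum(likelihood(lo + i * h) * prior(lo + i * h)
               for i in range(20001)) * h

# Closed form: marginally, d ~ N(0, 1^2 + 2^2).
evidence_exact = math.exp(-0.5 * d ** 2 / 5.0) / math.sqrt(2 * math.pi * 5.0)
```

In higher dimensions this integral has no closed form and the grid grows exponentially, which is exactly why robust general-purpose estimators of P(D) are valuable.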
Detail view in engine bay three in the aft ...
Detail view in engine bay three in the aft fuselage of the Orbiter Discovery. This view shows the engine interface fittings and the hydraulic-actuator support structure. The propellant feed lines are the large plugged and capped orifices. Note the handwritten references on the thrust plate in proximity to the actuators that read E3 Pitch and E3 Yaw. This view was taken from a service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
EAST ELEVATION OF HIGH BAY ADDITION OF FUEL STORAGE BUILDING ...
EAST ELEVATION OF HIGH BAY ADDITION OF FUEL STORAGE BUILDING (CPP-603). INL DRAWING NUMBER 200-0603-00-706-051286. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
Computational Precision of Mental Inference as Critical Source of Human Choice Suboptimality.
Drugowitsch, Jan; Wyart, Valentin; Devauchelle, Anne-Dominique; Koechlin, Etienne
2016-12-21
Making decisions in uncertain environments often requires combining multiple pieces of ambiguous information from external cues. In such conditions, human choices resemble optimal Bayesian inference, but typically show a large suboptimal variability whose origin remains poorly understood. In particular, this choice suboptimality might arise from imperfections in mental inference rather than in peripheral stages, such as sensory processing and response selection. Here, we dissociate these three sources of suboptimality in human choices based on combining multiple ambiguous cues. Using a novel quantitative approach for identifying the origin and structure of choice variability, we show that imperfections in inference alone cause a dominant fraction of suboptimal choices. Furthermore, two-thirds of this suboptimality appear to derive from the limited precision of neural computations implementing inference rather than from systematic deviations from Bayes-optimal inference. These findings set an upper bound on the accuracy and ultimate predictability of human choices in uncertain environments. Copyright © 2016 Elsevier Inc. All rights reserved.
FPGA Acceleration of the phylogenetic likelihood function for Bayesian MCMC inference methods.
Zierke, Stephanie; Bakos, Jason D
2010-04-12
Maximum Likelihood (ML)-based phylogenetic inference has become a popular method for estimating the evolutionary relationships among species based on genomic sequence data. This method is used in applications such as RAxML, GARLI, MrBayes, PAML, and PAUP. The Phylogenetic Likelihood Function (PLF) is an important kernel computation for this method. The PLF consists of a loop with no conditional behavior or dependencies between iterations. As such, it offers high potential for exploiting parallelism using micro-architectural techniques. In this paper, we describe a technique for mapping the PLF and supporting logic onto a Field Programmable Gate Array (FPGA)-based co-processor. By leveraging the FPGA's on-chip DSP modules and the high-bandwidth local memory attached to the FPGA, the resultant co-processor can accelerate ML-based methods and outperform state-of-the-art multi-core processors. We use the MrBayes 3 tool as a framework for designing our co-processor. For large datasets, we estimate that our accelerated MrBayes, if run on a current-generation FPGA, achieves a 10x speedup relative to software running on a state-of-the-art server-class microprocessor. The FPGA-based implementation achieves its performance by deeply pipelining the likelihood computations, performing multiple floating-point operations in parallel, and through a natural log approximation that is chosen specifically to leverage a deeply pipelined custom architecture. Heterogeneous computing, which combines general-purpose processors with special-purpose co-processors such as FPGAs and GPUs, is a promising approach for high-performance phylogeny inference, as shown by the growing body of literature in this field. FPGAs in particular are well-suited for this task because of their low power consumption as compared to many-core processors and Graphics Processor Units (GPUs).
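The claim that the PLF loop has no cross-iteration dependencies can be seen in a drastically simplified two-state version of the likelihood computation at one tree node. This is an illustration of Felsenstein-style pruning with invented matrices, not MrBayes code:

```python
def node_likelihoods(left, right, p_left, p_right):
    """Per-site conditional likelihoods at a parent node from its two
    children; each site is processed independently, which is the
    parallelism an FPGA pipeline can exploit."""
    out = []
    for lsite, rsite in zip(left, right):   # independent loop iterations
        site = []
        for s in range(2):                  # parent state
            l = sum(p_left[s][t] * lsite[t] for t in range(2))
            r = sum(p_right[s][t] * rsite[t] for t in range(2))
            site.append(l * r)
        out.append(site)
    return out

p = [[0.9, 0.1], [0.1, 0.9]]               # toy substitution matrix
tips_a = [[1.0, 0.0], [0.0, 1.0]]          # observed tip states per site
tips_b = [[1.0, 0.0], [1.0, 0.0]]
parent = node_likelihoods(tips_a, tips_b, p, p)
```

Because no site reads another site's result, the iterations can be issued to a deep pipeline back-to-back, which is the property the co-processor design exploits.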
3. DETAIL VIEW OF DIRECT DRIVE STERLING 'DOLPHIN T' MODEL ...
3. DETAIL VIEW OF DIRECT DRIVE STERLING 'DOLPHIN T' MODEL 4 CYLINDER, GASOLINE TRACTOR-TYPE ENGINE WITH FALKBIBBY FLEXIBLE COUPLING - Central Railroad of New Jersey, Newark Bay Lift Bridge, Spanning Newark Bay, Newark, Essex County, NJ
Computational Neuropsychology and Bayesian Inference.
Parr, Thomas; Rees, Geraint; Friston, Karl J
2018-01-01
Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.
Facility Layout Problems Using Bays: A Survey
NASA Astrophysics Data System (ADS)
Davoudpour, Hamid; Jaafari, Amir Ardestani; Farahani, Leila Najafabadi
2010-06-01
Layout design is one of the most important activities performed by industrial engineers. Most of these problems are NP-hard. In a basic layout design, each cell is represented by a rectilinear, but not necessarily convex, polygon. The set of fully packed adjacent polygons is known as a block layout (Asef-Vaziri and Laporte 2007). Block layouts are commonly divided into slicing-tree and bay layouts. In a bay layout, departments are located in vertical columns or horizontal rows, called bays. Bay layouts are used in real-world settings, especially in contexts such as semiconductor fabrication and aisle design. There are several reviews of facility layout; however, none of them focuses on bay layout. The literature analysis given here is not limited to specific considerations about bay layout design. We present a state-of-the-art review of bay layout, considering issues such as the objectives used, the solution techniques, and the integration methods.
2013-12-20
MORRO BAY, Calif. – An Erickson Sky Crane helicopter lands in Morro Bay, Calif., in preparation for the test of the SpaceX Dragon test article. The test enables SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. The parachute test took place at Morro Bay, Calif. Photo credit: NASA/Kim Shiflett
Photocopy of drawing (original drawing of MacDill Field in possession ...
Photocopy of drawing (original drawing of MacDill Field in possession of MacDill Air Force Base, Civil Engineering, Tampa, Florida; 1952 architectural drawings by Strategic Air Command, MacDill Air Force Base) BASE LAYOUT, 1952 - MacDill Air Force Base, Bounded by City of Tampa North, Tampa Bay South, Old Tampa Bay West, & Hillsborough Bay East, Tampa, Hillsborough County, FL
Flow in water-intake pump bays: A guide for utility engineers. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ettema, R.
1998-09-01
This report is intended to serve as a guide for power-plant engineers facing problems with flow conditions in pump bays in water-intake structures, especially those located alongside rivers. The guide briefly introduces the typical prevailing flow field outside of a riverside water intake. That flow field often sets the inflow conditions for pump bays located within the water intake. The monograph then presents and discusses the main flow problems associated with pump bays. The problems usually revolve around the formation of troublesome vortices. A novel feature of this monograph is the use of numerical modeling to reveal diagnostically how the vortices form and their sensitivities to flow conditions, such as uniformity of approach flow entering the bay and water-surface elevation relative to pump-bell submergence. The modeling was carried out using a computer code developed specially for the present project. Pump-bay layouts are discussed next. The discussion begins with a summary of the main variables influencing bay flows. The numerical model is used to determine the sensitivities of the vortices to variations in the geometric parameters. The fixes include the use of flow-control vanes and suction scoops for ensuring satisfactory flow performance in severe flow conditions, notably flows with strong cross flow and shallow flows. The monograph ends with descriptions of modeling techniques. An extensive discussion is provided on the use of the numerical model for illuminating bay flows. The model is used to show how fluid viscosity affects bay flow. The effect of fluid viscosity is an important consideration in hydraulic modeling of water intakes.
NASA Astrophysics Data System (ADS)
Thomas, Adam D.; Dopita, Michael A.; Kewley, Lisa J.; Groves, Brent A.; Sutherland, Ralph S.; Hopkins, Andrew M.; Blanc, Guillermo A.
2018-04-01
NebulaBayes is a new Bayesian code that implements a general method of comparing observed emission-line fluxes to photoionization model grids. The code enables us to extract robust, spatially resolved measurements of abundances in the extended narrow-line regions (ENLRs) produced by Active Galactic Nuclei (AGN). We observe near-constant ionization parameters but steeply radially declining pressures, which together imply that radiation pressure regulates the ENLR density structure on large scales. Our sample includes four “pure Seyfert” galaxies from the S7 survey that have extensive ENLRs. NGC 2992 shows steep metallicity gradients from the nucleus into the ionization cones. An inverse metallicity gradient is observed in ESO 138-G01, which we attribute to a recent gas inflow or minor merger. A uniformly high metallicity and hard ionizing continuum are inferred across the ENLR of Mrk 573. Our analysis of IC 5063 is likely affected by contamination from shock excitation, which appears to soften the inferred ionizing spectrum. The peak of the ionizing continuum, E_peak, is determined by the nuclear spectrum and the absorbing column between the nucleus and the ionized nebula. We cannot separate variation in this intrinsic E_peak from the effects of shock or H II region contamination, but E_peak measurements nevertheless give insights into ENLR excitation. We demonstrate the general applicability of NebulaBayes by analyzing a nuclear spectrum from the non-active galaxy NGC 4691 using a H II region grid. The NLR and H II region model grids are provided with NebulaBayes for use by the astronomical community.
Bayesian model reduction and empirical Bayes for group (DCM) studies
Friston, Karl J.; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E.; van Wijk, Bernadette C.M.; Ziegler, Gabriel; Zeidman, Peter
2016-01-01
This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level – e.g., dynamic causal models – and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. PMID:26569570
PIA: An Intuitive Protein Inference Engine with a Web-Based User Interface.
Uszkoreit, Julian; Maerkens, Alexandra; Perez-Riverol, Yasset; Meyer, Helmut E; Marcus, Katrin; Stephan, Christian; Kohlbacher, Oliver; Eisenacher, Martin
2015-07-02
Protein inference connects the peptide spectrum matches (PSMs) obtained from database search engines back to proteins, which are at the heart of most proteomics studies. Different search engines yield different PSMs and thus different protein lists. Analysis of results from one or multiple search engines is often hampered by different data exchange formats and a lack of convenient and intuitive user interfaces. We present PIA, a flexible software suite for combining PSMs from different search engine runs and turning these into consistent results. PIA can be integrated into proteomics data analysis workflows in several ways. A user-friendly graphical user interface can be run either locally or (e.g., for larger core facilities) from a central server. For automated data processing, stand-alone tools are available. PIA implements several established protein inference algorithms and can combine results from different search engines seamlessly. On several benchmark data sets, we show that PIA can identify a larger number of proteins at the same protein FDR compared to inference based on a single search engine. PIA supports the majority of established search engines and data in the mzIdentML standard format. It is implemented in Java and freely available at https://github.com/mpc-bioinformatics/pia.
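The comparison "more proteins at the same FDR" usually rests on target-decoy estimation. A hedged sketch of that idea follows; this is not PIA's algorithm, and the scores and decoy labels are invented for illustration:

```python
def fdr_threshold(hits, max_fdr):
    """hits: (score, is_decoy) pairs. Return the lowest score threshold
    whose estimated FDR (decoys / targets at or above it) is <= max_fdr."""
    best = None
    for score, _ in sorted(hits, reverse=True):
        targets = sum(1 for s, d in hits if s >= score and not d)
        decoys = sum(1 for s, d in hits if s >= score and d)
        if targets and decoys / targets <= max_fdr:
            best = score  # keep lowering the bar while the FDR holds
    return best

hits = [(9.1, False), (8.7, False), (8.2, False), (7.9, True),
        (7.5, False), (7.1, False), (6.8, True), (6.2, False)]
threshold = fdr_threshold(hits, max_fdr=0.2)
accepted = [s for s, d in hits if s >= threshold and not d]
```

Combining evidence across search engines can push borderline identifications above such a threshold, which is how a merged analysis can report more proteins at the same estimated FDR.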
Dopamine, Affordance and Active Inference
Friston, Karl J.; Shiner, Tamara; FitzGerald, Thomas; Galea, Joseph M.; Adams, Rick; Brown, Harriet; Dolan, Raymond J.; Moran, Rosalyn; Stephan, Klaas Enno; Bestmann, Sven
2012-01-01
The role of dopamine in behaviour and decision-making is often cast in terms of reinforcement learning and optimal decision theory. Here, we present an alternative view that frames the physiology of dopamine in terms of Bayes-optimal behaviour. In this account, dopamine controls the precision or salience of (external or internal) cues that engender action. In other words, dopamine balances bottom-up sensory information and top-down prior beliefs when making hierarchical inferences (predictions) about cues that have affordance. In this paper, we focus on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements. Crucially, the predictions driving movements are based upon a hierarchical generative model that infers the context in which movements are made. This means that we can confuse agents by changing the context (order) in which cues are presented. These simulations provide a (Bayes-optimal) model of contextual uncertainty and set switching that can be quantified in terms of behavioural and electrophysiological responses. Furthermore, one can simulate dopaminergic lesions (by changing the precision of prediction errors) to produce pathological behaviours that are reminiscent of those seen in neurological disorders such as Parkinson's disease. We use these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level. PMID:22241972
Discriminative Relational Topic Models.
Chen, Ning; Zhu, Jun; Xia, Fei; Zhang, Bo
2015-05-01
Relational topic models (RTMs) provide a probabilistic generative process to describe both the link structure and document contents for document networks, and they have shown promise on predicting network structures and discovering latent topic representations. However, existing RTMs have limitations in both the restricted model expressiveness and incapability of dealing with imbalanced network data. To expand the scope and improve the inference accuracy of RTMs, this paper presents three extensions: 1) unlike the common link likelihood with a diagonal weight matrix that allows the-same-topic interactions only, we generalize it to use a full weight matrix that captures all pairwise topic interactions and is applicable to asymmetric networks; 2) instead of doing standard Bayesian inference, we perform regularized Bayesian inference (RegBayes) with a regularization parameter to deal with the imbalanced link structure issue in real networks and improve the discriminative ability of learned latent representations; and 3) instead of doing variational approximation with strict mean-field assumptions, we present collapsed Gibbs sampling algorithms for the generalized relational topic models by exploring data augmentation without making restricting assumptions. Under the generic RegBayes framework, we carefully investigate two popular discriminative loss functions, namely, the logistic log-loss and the max-margin hinge loss. Experimental results on several real network datasets demonstrate the significance of these extensions on improving prediction performance.
Water resources planning for rivers draining into Mobile Bay
NASA Technical Reports Server (NTRS)
April, G. C.
1976-01-01
The application of remote sensing, automatic data processing, modeling and other aerospace related technologies to hydrological engineering and water resource management are discussed for the entire river drainage system which feeds the Mobile Bay estuary. The adaptation and implementation of existing mathematical modeling methods are investigated for the purpose of describing the behavior of Mobile Bay. Of particular importance are the interactions that system variables such as river flow rate, wind direction and speed, and tidal state have on the water movement and quality within the bay system.
Bayesian Inference in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
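The prior-to-posterior revision described in this tutorial can be made concrete with a minimal discrete example. The calibration scenario and all numbers below are invented for illustration, not taken from the paper.

```python
def bayes_update(prior, likelihood):
    """Revise a discrete prior P(H) using observation likelihoods P(obs | H):
    posterior P(H | obs) is proportional to P(obs | H) * P(H)."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())  # total probability of the observation
    return {h: v / z for h, v in unnorm.items()}

# Hypothetical calibration check: is an instrument in spec or drifted,
# given one out-of-tolerance reading? (numbers are invented)
prior = {"in_spec": 0.9, "drifted": 0.1}
likelihood = {"in_spec": 0.05, "drifted": 0.6}
posterior = bayes_update(prior, likelihood)
```

Repeated observations are handled by feeding the posterior back in as the next prior, which is exactly the "objectively revise prior knowledge" loop the abstract describes.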
2004-01-22
KENNEDY SPACE CENTER, FLA. - Standing inside Discovery’s payload bay, Carol Scott (right), lead orbiter engineer, talks about her job as part of a special feature for the KSC Web. With his back to the camera is Bill Kallus, Media manager in the KSC Web Studio. Behind Scott can be seen the open hatch of the airlock, which provides support functions such as airlock depressurization and repressurization, extravehicular activity equipment recharge, liquid-cooled garment water cooling, EVA equipment checkout, donning and communications. The outer hatch isolates the airlock from the unpressurized payload bay when closed and permits the EVA crew members to exit from the airlock to the payload bay when open.
Ferry Engine Repower to Provide Benefits for Air and Water
EPA’s Diesel Emission Reduction Act grant to the Delaware River and Bay Authority is bringing new clean air technology to the Cape May-Lewes Ferry, thereby reducing air pollution emissions and contributing to cleaner water in the Chesapeake Bay.
2. VIEW SOUTH, NORTH ELEVATION SHOWING BAYS 2 and 3, ...
2. VIEW SOUTH, NORTH ELEVATION SHOWING BAYS 2 and 3, DIESEL AND TURNTABLE Photocopy of photograph, 1976 (Courtesy of Chesapeake Beach Railway Museum; Roy Hartman, photographer) - Chesapeake Beach Railroad Engine House, 21 Yost Place, Seat Pleasant, Prince George's County, MD
Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta
2015-01-01
This paper has two aims. First, we investigate how often people make choices conforming to Bayes' rule when natural sampling is applied. Second, we show that using Bayes' rule is not necessary to make choices satisfying Bayes' rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes' rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes' rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes' rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes' rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes' rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes' rule. However, people tend to replace Bayes' rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient.
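The equivalence between Bayes' rule and the simpler "non-inverse" comparison under natural sampling can be sketched as follows. The counts are hypothetical; the point is that with natural frequencies, P(H|D) is proportional to the joint count of H and D, so both rules must select the same hypothesis.

```python
from fractions import Fraction

# Hypothetical natural-frequency counts from a two-stage sampling task:
# counts[h] = (cases where h held AND datum D occurred, total cases of h)
counts = {"H1": (6, 10), "H2": (2, 10)}

def bayes_choice(n):
    """Choose the hypothesis maximizing the posterior P(h | D) via Bayes' rule."""
    total = sum(tot for _, tot in n.values())
    p_d = sum(Fraction(d, total) for d, _ in n.values())          # P(D)
    posterior = {h: Fraction(d, total) / p_d for h, (d, _) in n.items()}
    return max(posterior, key=posterior.get)

def non_inverse_choice(n):
    """The simpler 'non-inverse' rule: compare joint counts of h and D directly,
    with no inversion of conditionals -- same winner, less computation."""
    return max(n, key=lambda h: n[h][0])
```

The second assertion below uses unequal base rates to show the agreement is not an artifact of symmetric totals.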
A Modular Artificial Intelligence Inference Engine System (MAIS) for support of on orbit experiments
NASA Technical Reports Server (NTRS)
Hancock, Thomas M., III
1994-01-01
This paper describes a Modular Artificial Intelligence Inference Engine System (MAIS) support tool that would provide health and status monitoring, cognitive replanning, analysis and support of on-orbit Space Station, Spacelab experiments and systems.
Ghosh, Sujit K
2010-01-01
Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of the Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
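The prior-plus-likelihood-to-posterior mechanics summarized above can be illustrated with a standard conjugate example (beta-binomial), which is a textbook case rather than something taken from the chapter itself.

```python
def beta_binomial_update(a, b, k, n):
    """Conjugate update: a Beta(a, b) prior on a success probability combined
    with k successes in n Bernoulli trials gives a Beta(a + k, b + n - k) posterior."""
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Posterior mean of a Beta(a, b) distribution."""
    return a / (a + b)
```

With a flat Beta(1, 1) prior and 7 successes in 10 trials, the posterior is Beta(8, 4) with mean 2/3, a compromise between the prior mean (1/2) and the sample proportion (0.7) that tilts toward the data as n grows.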
Why Bayesian Psychologists Should Change the Way They Use the Bayes Factor.
Hoijtink, Herbert; van Kooten, Pascal; Hulsker, Koenraad
2016-01-01
The discussion following Bem's ( 2011 ) psi research highlights that applications of the Bayes factor in psychological research are not without problems. The first problem is the omission to translate subjective prior knowledge into subjective prior distributions. In the words of Savage ( 1961 ): "they make the Bayesian omelet without breaking the Bayesian egg." The second problem occurs if the Bayesian egg is not broken: the omission to choose default prior distributions such that the ensuing inferences are well calibrated. The third problem is the adherence to inadequate rules for the interpretation of the size of the Bayes factor. The current paper will elaborate these problems and show how to avoid them using the basic hypotheses and statistical model used in the first experiment described in Bem ( 2011 ). It will be argued that a thorough investigation of these problems in the context of more encompassing hypotheses and statistical models is called for if Bayesian psychologists want to add a well-founded Bayes factor to the tool kit of psychological researchers.
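A Bayes factor is a ratio of marginal likelihoods of the data under two hypotheses, and its value depends on the prior chosen under the alternative. A minimal worked example, assuming a point null against a uniform prior on a binomial success probability (a textbook case, not the hypotheses or model of Bem's experiment):

```python
from math import comb

def bayes_factor_01(k, n):
    """Exact BF_01 for H0: theta = 0.5 versus H1: theta ~ Uniform(0, 1),
    given k successes in n Bernoulli trials."""
    m0 = comb(n, k) * 0.5 ** n   # marginal likelihood under the point null
    m1 = 1.0 / (n + 1)           # integral of C(n,k) t^k (1-t)^(n-k) dt = 1/(n+1)
    return m0 / m1
```

For 5 successes in 10 trials the Bayes factor favors the null (BF_01 about 2.7), while 9 in 10 favors the alternative; replacing the uniform prior with a more informative one would change these numbers, which is exactly the calibration problem the paper discusses.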
Late Eocene impacts: Geologic record, correlation, and paleoenvironmental consequences
Poag, C. Wylie; Mankinen, Edward A.; Norris, Richard D.
2003-01-01
We present new magnetostratigraphic and stable isotopic (δ18O, δ13Ccarb) data to help improve correlations among three late Eocene impact craters and their inferred breccia and ejecta deposits. Our analyses also shed light on potential global environmental consequences attributable to the impacts. The new data come from a continuously cored interval of the subsurface Chickahominy Formation, which lies conformably above the Chesapeake Bay impact crater in southeastern Virginia. The new magnetostratigraphic data indicate that the Chesapeake Bay impact took place in Chron C16n.2n, the same magnetochron that encompasses the late Eocene ejecta layer at Massignano, Italy. This correlation places both the Chesapeake Bay impact and the Massignano ejecta at ~35.6 Ma, and resolves a previous miscorrelation between these two sites based on planktonic foraminifera and calcareous nannofossils. The new magnetostratigraphic correlations also suggest that the published magnetostratigraphic framework for ejecta-bearing late Eocene strata at ODP Site 689B (Maud Rise) is incorrect, due to an incomplete section. New δ18O data (single species of benthic foraminifera) from the same Chickahominy section at Chesapeake Bay indicate that successive intervals of warm oceanic bottom water may be characteristic of the late Eocene. We infer that the warm intervals correlate with successive episodes of greenhouse warming, triggered in part by a comet shower, which produced the Chesapeake Bay, Toms Canyon, Popigai, and presumably additional (as yet undiscovered) late Eocene impact craters. We also demonstrate that a marked negative excursion of δ13Ccarb persists through the upper half of the Chickahominy Formation. This excursion, also recorded at Massignano, at Bath Cliff, Barbados, and at other widespread localities in the world ocean, may be additional evidence of global-scale, long-term environmental disturbances related to the bolide impacts.
As such, this δ13C signal may be useful for global subdivision of the late Eocene stratigraphic record.
2. VIEW TO THE SOUTHWEST OF THE MAIN EMAD BUILDING ...
2. VIEW TO THE SOUTHWEST OF THE MAIN E-MAD BUILDING WITH THE COLD BAY ON THE EAST (LEFT) AND THE HOT BAY ON THE WEST (RIGHT). - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
2012-07-26
bottle rosette. Macronutrient and chlorophyll assays were performed using methods detailed in Pennington and Chavez [2000]. High Performance Liquid...S = 33.28) Niskin bottle sample contained 10.6 μM nitrate. Other macronutrients (phosphate, silicic acid) were detected and generally abundant
Earth Observations taken by the Expedition Seven crew
2003-06-25
ISS007-E-08251 (25 June 2003) --- This photo featuring the San Francisco Bay area in California was photographed from the International Space Station (ISS) by astronaut Edward T. Lu, Expedition 7 NASA ISS science officer and flight engineer. The San Francisco Bay Bridge, Alcatraz Island, Golden Gate Bridge, and Golden Gate Park are visible at upper right. Stanford University and red salt ponds on the bay near Fremont at lower left.
Audain, Enrique; Uszkoreit, Julian; Sachsenberg, Timo; Pfeuffer, Julianus; Liang, Xiao; Hermjakob, Henning; Sanchez, Aniel; Eisenacher, Martin; Reinert, Knut; Tabb, David L; Kohlbacher, Oliver; Perez-Riverol, Yasset
2017-01-06
In mass spectrometry-based shotgun proteomics, protein identifications are usually the desired result. However, most of the analytical methods are based on the identification of reliable peptides and not the direct identification of intact proteins. Thus, assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is a critical step in proteomics research. Currently, different protein inference algorithms and tools are available for the proteomics community. Here, we evaluated five software tools for protein inference (PIA, ProteinProphet, Fido, ProteinLP, MSBayesPro) using three popular database search engines: Mascot, X!Tandem, and MS-GF+. All the algorithms were evaluated using a highly customizable KNIME workflow using four different public datasets with varying complexities (different sample preparation, species and analytical instruments). We defined a set of quality control metrics to evaluate the performance of each combination of search engines, protein inference algorithm, and parameters on each dataset. We show that the results for complex samples vary not only regarding the actual numbers of reported protein groups but also concerning the actual composition of groups. Furthermore, the robustness of reported proteins when using databases of differing complexities is strongly dependent on the applied inference algorithm. Finally, merging the identifications of multiple search engines does not necessarily increase the number of reported proteins, but does increase the number of peptides per protein and thus can generally be recommended. Protein inference is one of the major challenges in MS-based proteomics nowadays. Currently, there are a vast number of protein inference algorithms and implementations available for the proteomics community. Protein assembly impacts the final results of the research, the quantitation values, and the final claims in the research manuscript.
Even though protein inference is a crucial step in proteomics data analysis, a comprehensive evaluation of the many different inference methods has never been performed. Journal of Proteomics has previously published multiple benchmark studies of bioinformatics algorithms in proteomics (PMID: 26585461; PMID: 22728601), making clear the importance of such studies for the proteomics community and the journal audience. This manuscript presents a new bioinformatics solution based on the KNIME/OpenMS platform that aims at providing a fair comparison of protein inference algorithms (https://github.com/KNIME-OMICS). Five different algorithms (ProteinProphet, MSBayesPro, ProteinLP, Fido, and PIA) were evaluated using the highly customizable workflow on four public datasets with varying complexities. Three popular database search engines (Mascot, X!Tandem, MS-GF+) and combinations thereof were evaluated for every protein inference tool. In total >186 protein lists were analyzed and carefully compared using three metrics for quality assessment of the protein inference results: 1) the number of reported proteins, 2) the number of peptides per protein, and 3) the number of uniquely reported proteins per inference method. We also examined how many proteins were reported by choosing each combination of search engines, protein inference algorithms, and parameters on each dataset. The results show that: 1) PIA or Fido seems to be a good choice when studying the results of the analyzed workflow, regarding not only the reported proteins and the high-quality identifications, but also the required runtime. 2) Merging the identifications of multiple search engines almost always gives more confident results and increases the number of peptides per protein group. 3) The usage of databases containing not only the canonical but also known isoforms of proteins has a small impact on the number of reported proteins.
The detection of specific isoforms could, depending on the question behind the study, compensate for the slightly shorter parsimonious reports. 4) The current workflow can be easily extended to support new algorithms and search engine combinations. Copyright © 2016. Published by Elsevier B.V.
Zuo, Chandler; Chen, Kailei; Keleş, Sündüz
2017-06-01
Current analytic approaches for querying large collections of chromatin immunoprecipitation followed by sequencing (ChIP-seq) data from multiple cell types rely on individual analysis of each data set (i.e., peak calling) independently. This approach discards the fact that functional elements are frequently shared among related cell types and leads to overestimation of the extent of divergence between different ChIP-seq samples. Methods geared toward multisample investigations have limited applicability in settings that aim to integrate 100s to 1000s of ChIP-seq data sets for query loci (e.g., thousands of genomic loci with a specific binding site). Recently, Zuo et al. developed a hierarchical framework for state-space matrix inference and clustering, named MBASIC, to enable joint analysis of user-specified loci across multiple ChIP-seq data sets. Although this versatile framework estimates the underlying state-space (e.g., bound vs. unbound) and groups loci with similar patterns together, its Expectation-Maximization-based estimation structure hinders its applicability with large numbers of loci and samples. We address this limitation by developing MAP-based asymptotic derivations from Bayes (MAD-Bayes) framework for MBASIC. This results in a K-means-like optimization algorithm that converges rapidly and hence enables exploring multiple initialization schemes and flexibility in tuning. Comparison with MBASIC indicates that this speed comes at a relatively insignificant loss in estimation accuracy. Although MAD-Bayes MBASIC is specifically designed for the analysis of user-specified loci, it is able to capture overall patterns of histone marks from multiple ChIP-seq data sets similar to those identified by genome-wide segmentation methods such as ChromHMM and Spectacle.
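Small-variance (MAD-Bayes-style) asymptotics reduce probabilistic clustering to a K-means-like objective optimized by alternating assignments and mean updates. A generic Lloyd's-algorithm sketch of that kind of optimization is shown below; it is an illustration of the algorithmic family, not the MBASIC implementation itself.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: alternate hard assignments and mean updates."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each point to its nearest center (squared Euclidean distance)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # move each non-empty center to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Each iteration is a cheap argmin plus a mean, which is why such algorithms converge rapidly and make it practical to try many initializations, as the abstract notes.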
Luepke Bynum, Gretchen
2007-01-01
Modern sediments from representative localities in Willapa Bay, Washington, comprise two principal heavy-mineral suites. One contains approximately equivalent amounts of hornblende, orthopyroxene, and clinopyroxene; this is derived from the Columbia River, which discharges into the Pacific Ocean a short distance south of the bay. The other suite, dominated by clinopyroxene, is restricted to sands of rivers flowing into the bay from the east. The heavy-mineral distributions within the bay suggest that sand discharged from the Columbia River, borne north by longshore transport and carried into the bay by tidal currents, accounts for nearly all of the sand within the interior of Willapa Bay today. Pleistocene deposits on the east side of the bay contain three heavy-mineral assemblages, two of which are identical to the modern assemblages described above. These assemblages reflect the relative influence of tidal and fluvial processes on the Late Pleistocene deposits (100,000–200,000 yr BP; cf. Amino acid racemization in Quaternary shell deposits at Willapa Bay, Washington, Geochimica et Cosmochimica Acta 43, 1505–1520). They are also consistent with those processes inferred on the basis of sedimentary structures and stratigraphic relations in about two-thirds of the samples examined. Anomalies can be explained by recycling of sand from older deposits. The persistence of the two heavy-mineral suites suggests that the pattern of estuarine sedimentation in Late Pleistocene deposits closely resembled that of the modern bay. The third heavy-mineral suite is enriched in epidote and occurs in a few older Pleistocene units. On the north side of the bay, the association of this suite with southwest-directed foresets in cross-bedded gravel indicates derivation from the northeast, perhaps from an area of glacial outwash.
The presence of this suite in ancient estuarine sands exposed on the northeast side of the bay suggests that input from this northerly source may have intermittently dominated Willapa Bay deposition in the past.
General view of the High Bay area of the Space ...
General view of the High Bay area of the Space Shuttle Main Engine (SSME) Processing Facility at Kennedy Space Center. This view shows the specially modified fork lift used for horizontal installation and removal of the SSMEs into and out of the Orbiters. SSME number 2059 is in the background and is in the process of being scanned with a high-definition laser scanner to acquire field documentation for the production of historic documentation. - Space Transportation System, Space Shuttle Main Engine, Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
F-OWL: An Inference Engine for Semantic Web
NASA Technical Reports Server (NTRS)
Zou, Youyong; Finin, Tim; Chen, Harry
2004-01-01
Understanding and using the data and knowledge encoded in semantic web documents requires an inference engine. F-OWL is an inference engine for the semantic web language OWL, based on F-logic, an approach to defining frame-based systems in logic. F-OWL is implemented using XSB and Flora-2 and takes full advantage of their features. We describe how F-OWL computes ontology entailment and compare it with other description logic based approaches. We also describe TAGA, a trading agent environment that we have used as a test bed for F-OWL and to explore how multiagent systems can use semantic web concepts and technology.
Genie Inference Engine Rule Writer’s Guide.
1987-08-01
APPENDIX D. Animal Bootstrap File. APPENDIX E. Sample Run of Animal Identification Expert System. APPENDIX F. Numeric Test Knowledge Base. ...and other data structures stored in the knowledge base (KB), queries the user for input, and draws conclusions. Genie (GENeric Inference Engine) is
Harmonic analysis of tides and tidal currents in South San Francisco Bay, California
Cheng, R.T.; Gartner, J.W.
1985-01-01
Water level observations from tide stations and current observations from current-meter moorings in South San Francisco Bay (South Bay), California have been harmonically analysed. At each tide station, 13 harmonic constituents have been computed by a least-squares regression without inference. Tides in South Bay are typically mixed; there is a phase lag of approximately 1 h and an amplification of 1.5 from north to south for a mean semi-diurnal tide. Because most of the current-meter records are between 14 and 29 days, only the five most important harmonics have been solved for east-west and north-south velocity components. The eccentricity of tidal-current ellipse is generally very small, which indicates that the tidal current in South Bay is strongly bidirectional. The analyses further show that the principal direction and the magnitude of tidal current are well correlated with the basin bathymetry. Patterns of Eulerian residual circulation deduced from the current-meter data show an anticlockwise gyre to the west and a clockwise gyre to the east of the main channel in the summer months due to the prevailing westerly wind. Opposite trends have been observed during winter when the wind was variable. © 1985.
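The least-squares harmonic analysis described above amounts to regressing the observed water level on cosine and sine terms at the known constituent frequencies. A minimal sketch follows, using synthetic single-constituent data rather than the authors' 13-constituent analysis; the constituent frequency is the standard M2 period of about 12.42 hours.

```python
import numpy as np

def fit_constituents(t, h, omegas):
    """Least-squares harmonic analysis: fit
    h(t) ~ c0 + sum_k (A_k cos(w_k t) + B_k sin(w_k t)).
    Constituent amplitude and phase follow from (A_k, B_k)."""
    cols = [np.ones_like(t)]
    for w in omegas:
        cols.extend([np.cos(w * t), np.sin(w * t)])
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, h, rcond=None)
    return coef

# Synthetic 14-day record sampled every half hour, one M2-like constituent
t = np.arange(0.0, 336.0, 0.5)          # hours
omega_m2 = 2.0 * np.pi / 12.42          # rad/hour
h = 0.8 + 1.2 * np.cos(omega_m2 * t) - 0.4 * np.sin(omega_m2 * t)
coef = fit_constituents(t, h, [omega_m2])
```

The constituent amplitude is the vector norm of the fitted (A, B) pair, and its phase is atan2(B, A); with real records the design matrix simply gains two columns per additional constituent.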
4. VIEW TO THE NORTHWEST OF THE COLD BAY ON ...
4. VIEW TO THE NORTHWEST OF THE COLD BAY ON THE NORTH (RIGHT) AND THE POST-MORTEM CELLS ON THE SOUTH (LEFT). ALSO ILLUSTRATED ARE THE DIFFERENT ROOF HEIGHTS OF THE BUILDING. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
NASA Ames and Traveling Space Museum Host Space Day at Bay Area Schools (Version 2 - Final)
2010-08-10
NASA Ames and the Traveling Space Museum visited under-represented students in the Bay Area in an effort to excite them to the possibilities in science, technology, engineering and mathematics. Includes soundbites from Lewis Braxton III (NASA Ames) and actress Nichelle Nichols (TSM).
EAST/WEST TRUCK BAY AREA OF TRANSFER BASIN CORRIDOR OF FUEL ...
EAST/WEST TRUCK BAY AREA OF TRANSFER BASIN CORRIDOR OF FUEL STORAGE BUILDING (CPP-603). PHOTO TAKEN LOOKING NORTHWEST. INL PHOTO NUMBER HD-54-19-1. Mike Crane, Photographer, 8/2005 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
3. NORTH ELEVATION OF THE HOT BAY, SHOWING RAILROAD TRACKS ...
3. NORTH ELEVATION OF THE HOT BAY, SHOWING RAILROAD TRACKS LEADING TO THE MASSIVE STEEL-LINED CONCRETE ENTRANCE DOOR. PART OF THE INTRICATE HVAC SYSTEM IS WEST (RIGHT) OF THE DOOR. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
Optimal inference with suboptimal models: Addiction and active Bayesian inference
Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl
2015-01-01
When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321
Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.
Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…
Pacini, Clare; Ajioka, James W; Micklem, Gos
2017-04-12
Correlation matrices are important in inferring relationships and networks between regulatory or signalling elements in biological systems. With currently available technology, sample sizes for experiments are typically small, meaning that these correlations can be difficult to estimate. At a genome-wide scale, estimation of correlation matrices can also be computationally demanding. We develop an empirical Bayes approach to improve covariance estimates for gene expression, where we assume the covariance matrix takes a block diagonal form. Our method shows lower false discovery rates than existing methods on simulated data. Applied to a real data set from Bacillus subtilis, we demonstrate its ability to detect known regulatory units and interactions between them. We demonstrate that, compared to existing methods, our method is able to find significant covariances and also to control false discovery rates, even when the sample size is small (n=10). The method can be used to find potential regulatory networks, and it may also be used as a pre-processing step for methods that calculate, for example, partial correlations, so enabling the inference of the causal and hierarchical structure of the networks.
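The general idea of stabilizing a covariance estimate when n is small can be sketched with a simple shrinkage-toward-diagonal estimator. This is a generic sketch of covariance regularization, not the paper's block-diagonal empirical Bayes model, and the shrinkage weight lam is a free parameter here rather than an estimated one.

```python
import numpy as np

def shrink_cov(X, lam):
    """Shrink the sample covariance toward its diagonal:
    (1 - lam) * S + lam * diag(S). Off-diagonal entries are damped,
    stabilizing the estimate when the sample size is small."""
    S = np.cov(X, rowvar=False)
    return (1.0 - lam) * S + lam * np.diag(np.diag(S))
```

Variances are left untouched while every covariance is scaled by (1 - lam); empirical Bayes methods differ mainly in choosing the shrinkage target and weight from the data itself instead of fixing them by hand.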
Bayesian model reduction and empirical Bayes for group (DCM) studies.
Friston, Karl J; Litvak, Vladimir; Oswal, Ashwini; Razi, Adeel; Stephan, Klaas E; van Wijk, Bernadette C M; Ziegler, Gabriel; Zeidman, Peter
2016-03-01
This technical note describes some Bayesian procedures for the analysis of group studies that use nonlinear models at the first (within-subject) level - e.g., dynamic causal models - and linear models at subsequent (between-subject) levels. Its focus is on using Bayesian model reduction to finesse the inversion of multiple models of a single dataset or a single (hierarchical or empirical Bayes) model of multiple datasets. These applications of Bayesian model reduction allow one to consider parametric random effects and make inferences about group effects very efficiently (in a few seconds). We provide the relatively straightforward theoretical background to these procedures and illustrate their application using a worked example. This example uses a simulated mismatch negativity study of schizophrenia. We illustrate the robustness of Bayesian model reduction to violations of the (commonly used) Laplace assumption in dynamic causal modelling and show how its recursive application can facilitate both classical and Bayesian inference about group differences. Finally, we consider the application of these empirical Bayesian procedures to classification and prediction. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Jayaram, Varalakshmi; Agrawal, Harshit; Welch, William A; Miller, J Wayne; Cocker, David R
2011-03-15
Emissions from harbor-craft significantly affect air quality in populated regions near ports and inland waterways. This research measured regulated and unregulated emissions from an in-use EPA Tier 2 marine propulsion engine on a ferry operating in a bay following standard methods. A special effort was made to monitor continuously both the total Particulate Mass (PM) mass emissions and the real-time Particle Size Distribution (PSD). The engine was operated following the loads in ISO 8178-4 E3 cycle for comparison with the certification standards and across biodiesel blends. Real-time measurements were also made during a typical cruise in the bay. Results showed the in-use nitrogen oxide (NOx) and PM(2.5) emission factors were within the not to exceed standard for Tier 2 marine engines. Comparing across fuels we observed the following: a) no statistically significant change in NO(x) emissions with biodiesel blends (B20, B50); b) ∼ 16% and ∼ 25% reduction of PM(2.5) mass emissions with B20 and B50 respectively; c) a larger organic carbon (OC) to elemental carbon (EC) ratio and organic mass (OM) to OC ratio with B50 compared to B20 and B0; d) a significant number of ultrafine nuclei and a smaller mass mean diameter with increasing blend-levels of biodiesel. The real-time monitoring of gaseous and particulate emissions during a typical cruise in the San Francisco Bay (in-use cycle) revealed important effects of ocean/bay currents on emissions: NO(x) and CO(2) increased 3-fold; PM(2.5) mass increased 6-fold; and ultrafine particles disappeared due to the effect of bay currents. This finding has implications on the use of certification values instead of actual in-use emission values when developing inventories. Emission factors for some volatile organic compounds (VOCs), carbonyls, and poly aromatic hydrocarbons (PAHs) are reported as supplemental data.
Closeup view of the aft fuselage of the Orbiter Discovery ...
Close-up view of the aft fuselage of the Orbiter Discovery looking at the thrust structure that supports the Space Shuttle Main Engines (SSMEs). In this view, the SSME number two position is on the left and the SSME number three position is on the right. The thrust structure transfers the forces produced by the engines into and through the airframe of the orbiter. The thrust structure includes the SSME load reaction truss structure, engine interface fittings, and the hydraulic-actuator support structure. The propellant feed lines are the plugged and capped orifices within the engine bays. Note that SSME position two is rotated ninety degrees from positions three and one. This was needed to provide enough clearance for the engines to fit and gimbal. Note that engine bay three offers a clear view of the actuators that control the gimbaling of that engine. This view was taken from a service platform in the Orbiter Processing Facility at Kennedy Space Center. - Space Transportation System, Orbiter Discovery (OV-103), Lyndon B. Johnson Space Center, 2101 NASA Parkway, Houston, Harris County, TX
Mechanisms of sediment flux between shallows and marshes
Lacy, Jessica R.; Schile, L.M.; Callaway, J.C.; Ferner, M.C.
2015-01-01
We conducted a field study to investigate temporal variation and forcing mechanisms of sediment flux between a salt marsh and adjacent shallows in northern San Francisco Bay. Suspended-sediment concentration (SSC), tidal currents, and wave properties were measured over the marsh, in marsh creeks, and in bay shallows. Cumulative sediment flux in the marsh creeks was bayward during the study, and was dominated by large bayward flux during the largest tides of the year. This result was unexpected because extreme high tides with long inundation periods are commonly assumed to supply sediment to marshes, and long-term accretion estimates show that the marsh in the study site is depositional. A water mass-balance shows that some landward transport bypassed the creeks, most likely across the marsh-bay interface. An estimate of transport by this pathway based on observed SSC and inferred volume indicates that it was likely much less than the observed export.
33 CFR 334.70 - Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Mass.; danger zones for naval operations. 334.70 Section 334.70 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.70 Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations. (a) Atlantic...
33 CFR 334.70 - Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Mass.; danger zones for naval operations. 334.70 Section 334.70 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.70 Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations. (a) Atlantic...
33 CFR 334.70 - Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Mass.; danger zones for naval operations. 334.70 Section 334.70 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.70 Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations. (a) Atlantic...
33 CFR 334.70 - Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Mass.; danger zones for naval operations. 334.70 Section 334.70 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.70 Buzzards Bay, and adjacent waters, Mass.; danger zones for naval operations. (a) Atlantic...
Prajith, A; Rao, V Purnachandra; Kessarkar, Pratima M
2015-10-15
Magnetic properties of sediments were investigated in 7 gravity cores recovered along a transect of the Mandovi estuary, western India to understand their provenance and pollution. The maximum magnetic susceptibility of sediments was at least 6 times higher in the upper/middle estuary than in lower estuary/bay. The χfd% and χARM/SIRM of sediments indicated coarse, multi-domain and pseudo-single domain magnetic grains, resembling ore material in the upper/middle estuary and coarse stable single domain (SSD) to fine SSD grains in the lower estuary/bay. Mineralogy parameters indicated hematite and goethite-dominated sediments in the upper/middle estuary and magnetite-dominated sediments in the lower estuary/bay. Two sediment types were discernible because of deposition of abundant ore material in the upper/middle estuary and detrital sediment in the lower estuary/bay. The enrichment factor and Index of geo-accumulation of metals indicated significant to strong pollution with respect to Fe and Mn in sediments from the upper/middle estuary. Copyright © 2015 Elsevier Ltd. All rights reserved.
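The enrichment factor and index of geoaccumulation invoked in this abstract have standard textbook definitions: EF normalizes a metal to a conservative reference element (often Al or Fe) against the same ratio in background material, and Igeo = log2(Cn / (1.5·Bn)). A minimal sketch with these standard formulas (the concentration values used are purely illustrative):

```python
import math

def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (C_M / C_ref)_sample / (C_M / C_ref)_background, with C_ref a
    conservative reference element (often Al or Fe)."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def igeo(c_metal, bg_metal):
    """Index of geoaccumulation: Igeo = log2(C_n / (1.5 * B_n)); the factor
    1.5 allows for natural variability of the background B_n."""
    return math.log2(c_metal / (1.5 * bg_metal))

# Illustrative values only: a sample 3x the background is one Igeo class up.
print(round(igeo(3.0, 1.0), 3))            # Igeo = 1.0
print(enrichment_factor(10.0, 1.0, 2.0, 1.0))  # EF = 5.0
```

Igeo values above 5 are conventionally read as extreme pollution and EF values well above 1 as anthropogenic enrichment, which is the sense in which the abstract reports "significant to strong pollution" for Fe and Mn.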
Schörgendorfer, Angela; Branscum, Adam J; Hanson, Timothy E
2013-06-01
Logistic regression is a popular tool for risk analysis in medical and population health science. With continuous response data, it is common to create a dichotomous outcome for logistic regression analysis by specifying a threshold for positivity. Fitting a linear regression to the nondichotomized response variable assuming a logistic sampling model for the data has been empirically shown to yield more efficient estimates of odds ratios than ordinary logistic regression of the dichotomized endpoint. We illustrate that risk inference is not robust to departures from the parametric logistic distribution. Moreover, the model assumption of proportional odds is generally not satisfied when the condition of a logistic distribution for the data is violated, leading to biased inference from a parametric logistic analysis. We develop novel Bayesian semiparametric methodology for testing goodness of fit of parametric logistic regression with continuous measurement data. The testing procedures hold for any cutoff threshold and our approach simultaneously provides the ability to perform semiparametric risk estimation. Bayes factors are calculated using the Savage-Dickey ratio for testing the null hypothesis of logistic regression versus a semiparametric generalization. We propose a fully Bayesian and a computationally efficient empirical Bayesian approach to testing, and we present methods for semiparametric estimation of risks, relative risks, and odds ratios when parametric logistic regression fails. Theoretical results establish the consistency of the empirical Bayes test. Results from simulated data show that the proposed approach provides accurate inference irrespective of whether parametric assumptions hold or not. Evaluation of risk factors for obesity shows that different inferences are derived from an analysis of a real data set when deviations from a logistic distribution are permissible in a flexible semiparametric framework. 
© 2013, The International Biometric Society.
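The Savage-Dickey ratio used above evaluates the Bayes factor of a nested null hypothesis against its embedding model as the posterior density divided by the prior density at the null value. A toy conjugate-Gaussian sketch (unit data variance; this illustrates the ratio itself, not the paper's semiparametric model):

```python
import math

def normal_pdf(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

def savage_dickey_bf01(data, prior_mean=0.0, prior_var=1.0, null_value=0.0):
    """BF01 for H0: delta = null_value nested in H1: y_i ~ N(delta, 1) with
    delta ~ N(prior_mean, prior_var). Savage-Dickey: posterior density over
    prior density, both evaluated at the null value."""
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n)  # conjugate update, known unit variance
    post_mean = post_var * (prior_mean / prior_var + sum(data))
    return (normal_pdf(null_value, post_mean, post_var)
            / normal_pdf(null_value, prior_mean, prior_var))
```

For this nested Gaussian setup the ratio coincides exactly with the marginal-likelihood ratio p(y | H0) / p(y | H1), which is what makes the Savage-Dickey device attractive: no evidence integral needs to be computed.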
Mobile Bay, Alabama area seen in Skylab 4 Earth Resources Experiment Package
1974-02-01
SL4-92-300 (February 1974) --- A near vertical view of the Mobile Bay, Alabama area is seen in this Skylab 4 Earth Resources Experiment Package S190-B (five-inch earth terrain camera) photograph taken from the Skylab space station in Earth orbit. North of Mobile the Tombigbee and Alabama Rivers join to form the Mobile River. Detailed configuration of the individual stream channels and boundaries can be defined as the Mobile River flows into Mobile Bay, and thence into the Gulf of Mexico. The Mobile River Valley with its numerous stream channels is a distinct light shade in contrast to the dark green shade of the adjacent areas. The red coloration of Mobile Bay reflects the sediment load carried into the Bay by the rivers. Variations in red color indicate sediment load and the current paths within Mobile Bay. The westerly movement of the alongshore currents at the mouth of Mobile Bay is shown by the contrast between the light blue of the sediment-laden current and the predominant blue of the Gulf. Agricultural areas east and west of Mobile Bay are characterized by a rectangular pattern in green to white shades. Color variations may reflect the type and growth cycle of crops. Agricultural areas (light gray-greens) are also clearly visible in other parts of the photograph. Interstate 10 extends from near Pascagoula, Mississippi eastward through Mobile to the outskirts of Pensacola, Florida. Analysis of the EREP photographic data will be undertaken by the U.S. Corps of Engineers to determine bay dynamic processes. Federal agencies participating with NASA on the EREP project are the Departments of Agriculture, Commerce, and Interior, the Environmental Protection Agency, and the Corps of Engineers. All EREP photography is available to the public through the Department of the Interior's Earth Resources Observation Systems Data Center, Sioux Falls, South Dakota 57198. Photo credit: NASA
The microgravity environment of the Space Shuttle Columbia payload bay during STS-32
NASA Technical Reports Server (NTRS)
Dunbar, Bonnie J.; Giesecke, Robert L.; Thomas, Donald A.
1991-01-01
Over 11 hours of three-axis microgravity accelerometer data were successfully measured in the payload bay of Space Shuttle Columbia as part of the Microgravity Disturbances Experiment on STS-32. These data were measured using the High Resolution Accelerometer Package and the Aerodynamic Coefficient Identification Package, which were mounted on the Orbiter keel in the aft payload bay. Data were recorded during specific mission events such as Orbiter quiescent periods, crew exercise on the treadmill, and numerous Orbiter engine burns. Orbiter background levels were measured in the 10^-5 g range, treadmill operations in the 10^-3 g range, and the Orbiter engine burns in the 10^-2 g range. Induced acceleration levels resulting from the SYNCOM satellite deploy were in the 10^-2 g range, and operations during the pre-entry Flight Control System checkout were in the 10^-2 to 10^-1 g range.
13. Photograph of line drawing in possession of the Engineering ...
13. Photograph of line drawing in possession of the Engineering Division of the Directorate of Engineering and Housing, Watervliet Arsenal, New York. BRICK BAY FOR OFFICERS QUARTERS, BRICK SET, EAST SIDE, PLAN AND ELEVATION, OCTOBER 18, 1886. - Watervliet Arsenal, Building No. 4, Mordecai Drive, West of Mettler Road, Watervliet, Albany County, NY
2013-12-20
MORRO BAY, Calif. – An Erickson Sky Crane helicopter refuels following splash down of SpaceX Dragon test article. The test enables SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. The parachute test took place at Morro Bay, Calif. Photo credit: NASA/Kim Shiflett
2013-12-20
MORRO BAY, Calif. – The SpaceX Dragon test article awaits recovery from the Pacific Ocean, off the coast of Morro Bay, Calif following splash down. The test enabled SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
2013-12-20
MORRO BAY, Calif. – The SpaceX Dragon test article splashes down following a test over the Pacific Ocean, off the coast of Morro Bay, Calif. The test enabled SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
2013-12-20
MORRO BAY, Calif. – The SpaceX Dragon test article splashes down following a test over the Pacific Ocean, off the coast of Morro Bay, Calif. The test enabled SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
Investigating Possibilities of Energy Supply from a Tidal Lagoon at Swansea Bay
ERIC Educational Resources Information Center
Thomas, Denise
2017-01-01
Derivation of an energy source from the movement of the tides is the reason for considering a lagoon to trap seawater in Swansea Bay. But while the professional engineers are investigating the possibility of that development, this student group has undertaken a study of the viability of developing biological sources of energy in this restricted…
Acoustic detection of ice crystals in Antarctic waters
NASA Astrophysics Data System (ADS)
Penrose, John D.; Conde, M.; Pauly, T. J.
1994-06-01
During the voyage of the RSV Aurora Australis to the region of Prydz Bay, Antarctica in January-March 1991, ice crystals were encountered at depths from the surface to 125 m in the western area of the bay. On two occasions, crystals were retrieved by netting, and echo sounder records have been used to infer additional regions of occurrence. Acoustic target strength estimates made on the ice crystal assemblies encountered show significant spatial variation, which may relate to crystal size and/or aggregation. Data from a suite of conductivity-temperature-depth casts have been used to map regions of the study area where in situ water temperatures fell below the computed freezing point. Such regions correlate well with those selected on the basis of echogram type and imply that ice crystals occurred at depth over large areas of the bay during the cruise period. The ice crystal distribution described is consistent with that expected from a plume of supercooled water emerging from under the Amery Ice Shelf and forming part of the general circulation of the bay. The magnitude of the supercooled water plume is greater than those reported previously in the Prydz Bay region. If misinterpreted as biota on echo sounder records, ice crystals could significantly bias biomass estimates based on echo integration in this and potentially other areas.
NASA Astrophysics Data System (ADS)
Tien Bui, Dieu; Hoang, Nhat-Duc
2017-09-01
In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed from a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, the GMM is used for modeling the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. The posterior probabilistic output of the BayGmmKda model is then used as a flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate model implementation, a software program for BayGmmKda has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
NASA Astrophysics Data System (ADS)
Zaitseva, A. F.; Konyukhov, I. V.; Kazimirko, Yu. V.; Pogosyan, S. I.
2018-03-01
Onega Bay waters are characterized by a high content of chromophoric dissolved organic matter (CDOM). The absorbance spectra and fluorescence intensity (excitation wavelength 455 nm, emission wavelength >680 nm) were used to assess the distribution of CDOM content in water filtered through a GF/F filter. The CDOM content at different points in Onega Bay showed more than a fourfold difference, as inferred from the measured values. The CDOM content in surface waters was, as a rule, higher than in the deeper horizons. A higher CDOM content was measured near the Onega River, near the middle part of the Onega shore, and near the Pomor shore opposite the town of Belomorsk. River runoff is the major source of CDOM in Onega Bay water. The CDOM chemical composition in Onega Bay waters was heterogeneous. The ratio of the fluorescence intensity to the absorbance value was higher near the mouths of rivers and in intensive mixing zones than in water characterized by high salinity. A highly significant linear correlation (R² = 0.7825) between water salinity and CDOM fluorescence intensity was demonstrated. The contribution of fluorescent compounds to river-runoff CDOM is substantially higher than their contribution to the composition of marine CDOM.
Deep Borehole Instrumentation Along San Francisco Bay Bridges - 2000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchings, L.; Kasameyer, P.; Turpin, C.
2000-03-01
This is a progress report on the Bay Bridges downhole network. Between 2 and 8 instruments have been spaced along the Dumbarton, San Mateo, Bay, and San Rafael bridges in San Francisco Bay, California. The instruments will provide multiple-use data that are important to geotechnical, structural engineering, and seismological studies. The holes are between 100 and 1000 ft deep and were drilled by Caltrans. There are twenty-one sensor packages at fifteen sites. The downhole instrument package contains a three-component HS-1 seismometer and three orthogonal Wilcox 731 accelerometers, and is capable of recording motions from a micro-g for local M = 1.0 earthquakes up to 0.5 g strong ground motion from large Bay Area earthquakes. Preliminary results on phasing across the Bay Bridge, up- and downhole wave amplification at Yerba Buena Island, and sensor orientation analysis are presented. Events recorded and located during 1999 are presented. Also, a senior thesis on the deep structure of the San Francisco Bay beneath the Bay Bridge is presented as an addendum.
MultiNest: Efficient and Robust Bayesian Inference
NASA Astrophysics Data System (ADS)
Feroz, F.; Hobson, M. P.; Bridges, M.
2011-09-01
We present further development and the first public release of our multimodal nested sampling algorithm, called MultiNest. This Bayesian inference tool calculates the evidence, with an associated error estimate, and produces posterior samples from distributions that may contain multiple modes and pronounced (curving) degeneracies in high dimensions. The developments presented here lead to further substantial improvements in sampling efficiency and robustness, as compared to the original algorithm presented in Feroz & Hobson (2008), which itself significantly outperformed existing MCMC techniques in a wide range of astrophysical inference problems. The accuracy and economy of the MultiNest algorithm are demonstrated by application to two toy problems and to a cosmological inference problem focusing on the extension of the vanilla LambdaCDM model to include spatial curvature and a varying equation of state for dark energy. The MultiNest software is fully parallelized using MPI and includes an interface to CosmoMC. It will also be released as part of the SuperBayeS package, for the analysis of supersymmetric theories of particle physics, at this http URL.
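At MultiNest's core is Skilling's nested sampling. Setting aside the multimodal ellipsoidal machinery, the evidence accumulation itself can be sketched in a few lines. Below is a deliberately simple one-dimensional toy of ours (constrained draws are made by short Metropolis walks seeded from the live points, a crude stand-in for MultiNest's ellipsoidal sampling; all names and parameters are illustrative):

```python
import math
import random
import statistics

def log_like(theta, mu=0.5, sigma=0.05):
    """Toy unimodal likelihood on a U(0,1) prior; true evidence Z ~ sigma*sqrt(2*pi)."""
    return -0.5 * ((theta - mu) / sigma) ** 2

def sample_above(threshold, live, rng, steps=25):
    """Draw ~uniformly from the prior region where log_like > threshold, via a
    short Metropolis walk started from a random surviving live point."""
    x = rng.choice(live)
    scale = 2.0 * statistics.pstdev(live) + 1e-12
    for _ in range(steps):
        xp = x + rng.gauss(0.0, scale)
        if 0.0 <= xp <= 1.0 and log_like(xp) > threshold:
            x = xp
    return x

def nested_sampling_log_z(n_live=300, n_iter=3000, seed=7):
    """Skilling's nested sampling: repeatedly discard the worst live point,
    shrinking the enclosed prior volume as X_i ~ exp(-i/N), and accumulate
    Z = sum_i L_i (X_(i-1) - X_i) plus a final live-point remainder."""
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]
    log_terms = []
    for i in range(1, n_iter + 1):
        worst = min(live, key=log_like)
        width = math.exp(-(i - 1) / n_live) - math.exp(-i / n_live)
        log_terms.append(log_like(worst) + math.log(width))
        live.remove(worst)
        live.append(sample_above(log_like(worst), live, rng))
    x_final = math.exp(-n_iter / n_live)  # prior volume still enclosed at the end
    log_terms += [log_like(t) + math.log(x_final / n_live) for t in live]
    m = max(log_terms)  # log-sum-exp for numerical stability
    return m + math.log(sum(math.exp(t - m) for t in log_terms))
```

For this toy likelihood the true evidence is Z ≈ 0.05·√(2π) ≈ 0.125, and the sketch should recover ln Z to within its roughly √(H/N) statistical error; MultiNest's contribution is doing this efficiently and reliably for multimodal, degenerate, high-dimensional posteriors.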
NASA Astrophysics Data System (ADS)
Lewis, C. F. M.; Anderson, T. W.
2017-10-01
South Bay on the southern coast of Manitoulin Island is a fjord-like embayment connected to Lake Huron by a natural narrow gap in the bay's outer sill 6.5-14 m above the lake. A seismic profile, pollen, plant macrofossil, grain size analyses, and other sediment properties of two piston cores from a shallow outer basin of the bay document a 9 m-thick sediment section comprising rhythmically laminated clay under silty clay containing zones with small molluscan shells and marsh detritus. A sandy pebbly layer under soft silty clay mud overlies these sediments. This stratigraphy represents inundation by deep glacial Lake Algonquin followed by the shallowing Post Algonquin series of lakes, and exposure in the early Holocene by 5 Lake Stanley lowstands in the Lake Huron basin separated by 4 Lake Mattawa highstands. Overflow from South Bay in the first lowstand is thought to have eroded the outer sill gap. Marsh environments are inferred to have formed in the bay during subsequent lowstands. The Lake Mattawa highstands are attributed to outburst floods mainly from glacial Lake Agassiz. Palynological evidence of increased spruce occurrence, an apparent regional climate reversal, during the dry pine period is attributed to cold northwest winds from the Lake Superior basin and a lake effect from the Mattawa highstands in the Lake Huron basin. Lake waters transgressed South Bay following the pine period to form the Nipissing shore on Manitoulin Island. Transfer of Lake Huron basin drainage to southern outlets and continued glacioisostatic uplift of the region led to the present configuration of South Bay and Lake Huron.
Girard, Philippe; Takekawa, John Y.; Beissinger, Steven R.
2010-01-01
The threatened California Black Rail lives under dense marsh vegetation, is rarely observed, flies weakly and has a highly disjunct distribution. The largest population of rails is found in 8–10 large wetlands in San Francisco Bay (SF Bay), but a population was recently discovered in the foothills of the Sierra Nevada Mountains (Foothills), within a wetland network comprising over 200 small marshes. Using microsatellite and mitochondrial analyses, our objectives were to determine the origins, connectivity and demography of this recently discovered population. Analyses of individuals from the Foothills (n = 31), SF Bay (n = 31), the Imperial Valley (n = 6) and the East Coast (n = 3), combined with rigorous power evaluations, provided valuable insights into the history and current dynamics of the species in Northern California that challenge conventional wisdom about the species. The Foothills and SF Bay populations have diverged strongly from the Imperial Valley population, even more strongly than from individuals of the East Coast subspecies. The data also suggest a historical presence of the species in the Foothills. The SF Bay and Foothills populations had similar estimated effective population sizes over the areas sampled and appeared linked by a strongly asymmetrical migration pattern, with a greater probability of movement from the Foothills to SF Bay than vice versa. Random mating was inferred in the Foothills, but local substructure among marshes and inbreeding were detected in SF Bay, suggesting different dispersal patterns within each location. The unexpected dimensions of Black Rail demography and population structure suggested by these analyses and their potential importance for management are discussed.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-01
... matter from a cruise ship except clean vessel engine cooling water, clean vessel generator cooling water, vessel engine or generator exhaust, clean bilge water, or anchor wash. * * * * * 0 3. Appendix A to... matter from a cruise ship except clean vessel engine cooling water, clean vessel generator cooling water...
2013-12-20
MORRO BAY, Calif. – A crew member preps an Erickson Sky Crane helicopter for a test of the SpaceX Dragon test article. The test enables SpaceX engineers to evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. The parachute test took place at Morro Bay, Calif. Photo credit: NASA/Kim Shiflett
33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...
33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...
33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...
33 CFR 165.768 - Security Zone; MacDill Air Force Base, Tampa Bay, FL.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Security Zone; MacDill Air Force....768 Security Zone; MacDill Air Force Base, Tampa Bay, FL. (a) Location. The following area is a security zone which exists concurrent with an Army Corps of Engineers restricted area in § 334.635 of this...
The design and application of a Transportable Inference Engine (TIE1)
NASA Technical Reports Server (NTRS)
Mclean, David R.
1986-01-01
A Transportable Inference Engine (TIE1) system has been developed by the author as part of the Interactive Experimenter Planning System (IEPS) task which is involved with developing expert systems in support of the Spacecraft Control Programs Branch at Goddard Space Flight Center in Greenbelt, Maryland. Unlike traditional inference engines, TIE1 is written in the C programming language. In the TIE1 system, knowledge is represented by a hierarchical network of objects which have rule frames. The TIE1 search algorithm uses a set of strategies, including backward chaining, to obtain the values of goals. The application of TIE1 to a spacecraft scheduling problem is described. This application involves the development of a strategies interpreter which uses TIE1 to do constraint checking.
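TIE1 itself is written in C and its code is not reproduced in the abstract; as a purely hypothetical illustration of the backward-chaining strategy it describes (a goal's value is obtained by recursively proving the premises of rules that conclude it), here is a minimal sketch with invented rule and fact names in a spacecraft-scheduling, constraint-checking flavor:

```python
def backward_chain(goal, rules, facts):
    """Prove `goal` by backward chaining: it holds if it is a known fact, or if
    some rule concludes it and every premise of that rule can itself be proven.
    (No cycle detection -- the rule set is assumed acyclic in this sketch.)"""
    if goal in facts:
        return True
    for premises, conclusion in rules:
        if conclusion == goal and all(backward_chain(p, rules, facts) for p in premises):
            facts.add(goal)  # cache the derived value, as inference engines do
            return True
    return False

# Invented scheduling constraints expressed as (premises, conclusion) rules.
rules = [
    (("power_available", "antenna_free"), "downlink_ok"),
    (("battery_charged",), "power_available"),
]
print(backward_chain("downlink_ok", rules, {"battery_charged", "antenna_free"}))  # → True
```

A "strategies interpreter" of the kind the abstract mentions would sit above a routine like this, choosing which goals to pursue and in what order, while the chaining itself stays generic.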
SOUTH WING, TRA661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH ...
SOUTH WING, TRA-661. SOUTH SIDE. CAMERA FACING NORTH. MTR HIGH BAY BEYOND. INL NEGATIVE NO. HD46-45-3. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Bayesian estimation of differential transcript usage from RNA-seq data.
Papastamoulis, Panagiotis; Rattray, Magnus
2017-11-27
Next generation sequencing allows the identification of genes consisting of differentially expressed transcripts, a term which usually refers to changes in the overall expression level. A specific type of differential expression is differential transcript usage (DTU), which targets changes in the relative within-gene expression of a transcript. The contribution of this paper is to: (a) extend cjBitSeq, a previously introduced Bayesian model originally designed for identifying changes in overall expression levels, to the DTU context and (b) propose a Bayesian version of DRIMSeq, a frequentist model for inferring DTU. cjBitSeq is a read-based model and performs fully Bayesian inference by MCMC sampling on the space of latent states of each transcript per gene. BayesDRIMSeq is a count-based model and estimates the Bayes factor of a DTU model against a null model using Laplace's approximation. The proposed models are benchmarked against existing ones using a recent independent simulation study as well as a real RNA-seq dataset. Our results suggest that the Bayesian methods exhibit similar performance to DRIMSeq in terms of precision/recall but offer better calibration of the false discovery rate.
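Laplace's approximation, as used by BayesDRIMSeq for its Bayes factors, replaces the marginal-likelihood integral with a Gaussian integral around the posterior mode; a Bayes factor is then a difference of two such approximate log evidences. A generic one-parameter sketch (finite-difference curvature; our own illustration, not the BayesDRIMSeq code):

```python
import math

def laplace_log_evidence(log_joint, theta_hat, h=1e-5):
    """1-D Laplace approximation to the log marginal likelihood:
    ln p(y) ~ ln p(y, theta_hat) + 0.5*ln(2*pi) - 0.5*ln(-H),
    where H is the second derivative of ln p(y, theta) at the mode theta_hat.
    `log_joint(theta)` must return ln p(y | theta) + ln p(theta)."""
    second = (log_joint(theta_hat + h) - 2.0 * log_joint(theta_hat)
              + log_joint(theta_hat - h)) / h ** 2  # central finite difference
    return log_joint(theta_hat) + 0.5 * math.log(2 * math.pi) - 0.5 * math.log(-second)
```

For models with Gaussian posteriors the approximation is exact, which makes it easy to sanity-check; for a Bayes factor one subtracts the value obtained under the null model's log joint.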
Wilberg, Michael J; Wiedenmann, John R; Robinson, Jason M
2013-06-01
Autogenic ecosystem engineers are critically important parts of many marine and estuarine systems because of their substantial effect on ecosystem services. Oysters are of particular importance because of their capacity to modify coastal and estuarine habitats and the highly degraded status of their habitats worldwide. However, models to predict dynamics of ecosystem engineers have not previously included the effects of exploitation. We developed a linked population and habitat model for autogenic ecosystem engineers undergoing exploitation. We parameterized the model to represent eastern oyster (Crassostrea virginica) in upper Chesapeake Bay by selecting sets of parameter values that matched observed rates of change in abundance and habitat. We used the model to evaluate the effects of a range of management and restoration options including sustainability of historical fishing pressure, effectiveness of a newly enacted sanctuary program, and relative performance of two restoration approaches. In general, autogenic ecosystem engineers are expected to be substantially less resilient to fishing than an equivalent species that does not rely on itself for habitat. Historical fishing mortality rates in upper Chesapeake Bay for oysters were above the levels that would lead to extirpation. Reductions in fishing or closure of the fishery were projected to lead to long-term increases in abundance and habitat. For fisheries to become sustainable outside of sanctuaries, a substantial larval subsidy would be required from oysters within sanctuaries. Restoration efforts using high-relief reefs were predicted to allow recovery within a shorter period of time than low-relief reefs. Models such as ours, that allow for feedbacks between population and habitat dynamics, can be effective tools for guiding management and restoration of autogenic ecosystem engineers.
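The qualitative mechanism in this abstract (habitat sets the carrying capacity, but live oysters are the only source of new habitat, so fishing erodes both) can be caricatured in a few lines. What follows is a hypothetical toy with invented parameters, not the authors' parameterized Chesapeake Bay model; it reproduces only the qualitative prediction that an autogenic engineer collapses above a critical fishing rate:

```python
def simulate(f_rate, years=300, r=0.3, shell_gain=0.05, shell_loss=0.04):
    """Toy autogenic-engineer dynamics: habitat H sets the carrying capacity of
    the population N, while live animals are the only source of new H, so
    fishing erodes the habitat feedback as well as the stock.
    All parameters are illustrative, not Chesapeake Bay estimates."""
    n, h = 1.0, 1.0
    for _ in range(years):
        n_next = n + r * n * (1.0 - n / h) - f_rate * n  # logistic growth minus fishing
        h_next = h + shell_gain * n - shell_loss * h     # habitat built by live animals
        n, h = max(n_next, 0.0), max(h_next, 1e-9)
    return n, h
```

With these illustrative rates the critical fishing mortality is roughly r·(1 − shell_loss/shell_gain) = 0.06 per year: below it both N and H grow (unbounded here, since the toy has no cap on space), above it both decay toward zero — an equivalent species with a fixed, self-independent carrying capacity would instead persist at any f below r.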
Engineering Management Board Tour VAB
2017-03-22
Members of NASA’s Engineering Management Board tour of the Vehicle Assembly Building at Kennedy Space Center in Florida. The platforms in High Bay 3, including the one on which the board members are standing, were designed to surround and provide access to NASA’s Space Launch System and Orion spacecraft. The Engineering Management Board toured integral areas of Kennedy to help the agencywide group reach its goal of unifying engineering work across NASA.
Buhne Point Shoreline Erosion Demonstration Project. Volume 4. Appendices H-L.
1987-08-01
Prepared by Moffatt & Nichol, Engineers, 250 W. Wardlow Road, Long Beach, CA 90807 (L-2434.03), August 1987, for the San Francisco District, Corps of Engineers. Participants named in the front matter include Dave Eyres (Federal Highway Administration), Ervin Renner (Humboldt County Board of Supervisors), Tom Smith (Federal...), James A. Gast (Humboldt Bay Harbor District), Claude Wong (U.S. Army Corps of Engineers, Los Angeles), and Dean Ray (Coastal Commission, Eureka).
D-score: a search engine independent MD-score.
Vaudel, Marc; Breiter, Daniela; Beck, Florian; Rahnenführer, Jörg; Martens, Lennart; Zahedi, René P
2013-03-01
While peptides carrying PTMs are routinely identified in gel-free MS, the localization of the PTMs onto the peptide sequences remains challenging. Search engine scores of secondary peptide matches have been used in different approaches in order to infer the quality of site inference, by penalizing the localization whenever the search engine similarly scored two candidate peptides with different site assignments. In the present work, we show how the estimation of posterior error probabilities for peptide candidates allows the estimation of a PTM score called the D-score, for multiple search engine studies. We demonstrate the applicability of this score to three popular search engines: Mascot, OMSSA, and X!Tandem, and evaluate its performance using an already published high resolution data set of synthetic phosphopeptides. For those peptides with phosphorylation site inference uncertainty, the number of spectrum matches with correctly localized phosphorylation increased by up to 25.7% when compared to using Mascot alone, although the actual increase depended on the fragmentation method used. Since this method relies only on search engine scores, it can be readily applied to the scoring of the localization of virtually any modification at no additional experimental or in silico cost. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
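The core idea above is that localization confidence can be read off the gap between the best and second-best site assignment once search engine scores are converted to posterior error probabilities. The sketch below is schematic and ours: it combines PEPs across engines by a simple average of (1 − PEP), which is an illustrative stand-in, NOT the published D-score formula.

```python
def localization_delta(candidates):
    """Schematic site-localization confidence in the spirit of the D-score:
    per-site scores are built from search engine posterior error probabilities
    (here the mean of 1 - PEP across engines, an invented combination rule),
    and confidence is the gap between the best and second-best assignment."""
    scores = {site: sum(1.0 - p for p in peps) / len(peps)
              for site, peps in candidates.items()}
    ranked = sorted(scores.values(), reverse=True)
    best = max(scores, key=scores.get)
    runner_up = ranked[1] if len(ranked) > 1 else 0.0
    return best, ranked[0] - runner_up

# Hypothetical candidates: phospho on Ser-3 vs Thr-7, PEPs from two engines.
print(localization_delta({"S3": [0.01, 0.05], "T7": [0.40, 0.60]}))  # → ('S3', 0.47)
```

A small gap flags the ambiguous cases the abstract describes, where two candidate peptides with different site assignments score almost equally; because only engine-level probabilities enter, the scheme is search engine independent in the same sense as the D-score.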
NASA Astrophysics Data System (ADS)
Spitz, Jérôme; Cherel, Yves; Bertin, Stéphane; Kiszka, Jeremy; Dewez, Alexandre; Ridoux, Vincent
2011-03-01
Long-finned pilot whales ( Globicephala melas), Risso's dolphins ( Grampus griseus), melon-headed whales ( Peponocephala electra), Cuvier's beaked whales ( Ziphius cavirostris), Sowerby's beaked whales ( Mesoplodon bidens), northern bottlenose whales ( Hyperoodon ampullatus), sperm whales ( Physeter macrocephalus), dwarf sperm whales ( Kogia sima) and pygmy sperm whales ( Kogia breviceps) make up the large community of deep-diving odontocetes occurring off the Bay of Biscay, northeast Atlantic. The ecology of these toothed cetaceans is poorly documented worldwide. The present study described their prey preferences from stomach content analysis and showed resource partitioning within the assemblage. The majority of the species appeared to be mostly teutophageous. Fish was an important food source only for the Sowerby's beaked whale and, to a lesser extent, for the long-finned pilot whale. In terms of foraging habitats inferred from prey composition, either pelagic oceanic or demersal neritic habitats were exploited by toothed whales in the Bay of Biscay, with only the long-finned pilot whale foraging in both habitats. Finally, with more than 14,000 identified cephalopods from 39 species, the present study also highlighted the poorly known deep-sea cephalopod community off the Bay of Biscay, using top predators as biological samplers.
Donatelli, Carmine; Ganju, Neil Kamal; Fagherazzi, Sergio; Leonardi, Nicoletta
2018-01-01
Seagrasses are known to modify their surroundings and are therefore frequently referred to as ecological engineers. The effect of seagrasses on coastal bay resilience and sediment transport dynamics is understudied. Here we use six historical maps of seagrass distribution in Barnegat Bay, USA, to investigate the role of these vegetated surfaces on the sediment storage capacity of shallow bays. Analyses are carried out by means of the Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST) numerical modeling framework. Results show that a decline in the extent of seagrass meadows reduces the sediment mass potentially stored within bay systems. The presence of seagrass reduces shear stress values across the entire bay, including unvegetated areas, and promotes sediment deposition on tidal flats. On the other hand, the presence of seagrasses decreases suspended sediment concentrations, which in turn reduces the delivery of sediment to marsh platforms. Results highlight the relevance of seagrasses for the long-term survival of coastal ecosystems, and the complex dynamics regulating the interaction between subtidal and intertidal landscapes.
Ryberg, Karen R.; Blomquist, Joel; Sprague, Lori A.; Sekellick, Andrew J.; Keisman, Jennifer
2018-01-01
Causal attribution of changes in water quality often consists of correlation, qualitative reasoning, listing references to the work of others, or speculation. To better support statements of attribution for water-quality trends, structural equation modeling was used to model the causal factors of total phosphorus loads in the Chesapeake Bay watershed. By transforming, scaling, and standardizing variables, grouping similar sites, grouping some causal factors into latent variable models, and using methods that correct for assumption violations, we developed a structural equation model to show how causal factors interact to produce total phosphorus loads. Climate (in the form of annual total precipitation and the Palmer Hydrologic Drought Index) and anthropogenic inputs are the major drivers of total phosphorus load in the Chesapeake Bay watershed. Increasing runoff due to natural climate variability is offsetting purposeful management actions that are otherwise decreasing phosphorus loading; consequently, management actions may need to be reexamined to achieve target reductions in the face of climate variability.
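The flavor of such a path model can be illustrated with a toy simulation in which climate drives runoff, and runoff plus anthropogenic inputs drive phosphorus load. All coefficients, noise scales, and variable names below are invented for illustration, and ordinary least squares stands in for a full SEM estimator:

```python
import numpy as np

# Toy path model: climate -> runoff -> load, plus direct anthropogenic inputs.
# Coefficients and noise scales are invented for illustration.
rng = np.random.default_rng(42)
n = 500
precip = rng.normal(size=n)                              # standardized precipitation
inputs = rng.normal(size=n)                              # standardized anthropogenic P inputs
runoff = 0.8 * precip + rng.normal(scale=0.6, size=n)    # climate drives runoff
load = 0.6 * runoff + 0.5 * inputs + rng.normal(scale=0.5, size=n)

# Estimate the structural path coefficients of the load equation.
X = np.column_stack([runoff, inputs])
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
print(np.round(coef, 2))
```

The simulated structure mirrors the causal claim in the abstract: climate acts on load indirectly through runoff, so a rise in natural runoff can offset a deliberate reduction in the anthropogenic-input path.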
NASA Astrophysics Data System (ADS)
Saleh, R.
2017-12-01
For a challenge as complex and far-reaching as sea level rise and improving shoreline resiliency, strong partnerships between scientists, elected officials, decision-makers, and the general public are the only way that effective solutions can be developed. The San Francisco Bay, like many similar sheltered water coastal environments (for example, Galveston Bay, Tampa Bay, or the Venetian Lagoon) offers a unique opportunity for multiple jurisdictions to collaborate to address sea level rise on a regional basis. For the San Francisco Bay, significant scientific progress has been made in building a real-time simulation model for riverine and Bay hydrodynamics. Other major scientific initiatives, such as morphology mapping, shoreline mapping, and a sediment budget, are also underway. In 2014, leaders from the Bay Area science, engineering, planning, policy, elected, and regulatory communities representing jurisdictions around the Bay joined together to address sea level rise. The group includes people from local, regional, state, and federal agencies and organizations. Together, CHARG (Coastal Hazards Adaptation Resiliency Group) established a collective vision and approach to implementing regional solutions. Decision-makers within many Bay Area jurisdictions are motivated to show demonstrable progress toward addressing sea level rise. However, the cost to implement shoreline resiliency solutions will be very large, and must be founded on strong science. CHARG is now tackling several key technical challenges. One is to develop science-based guidelines for local jurisdictions to determine when a project is local, sub-regional, or regional. Concurrently, several organizations are planning or implementing pilot shoreline resiliency projects and other programs. Many creative regional solutions are possible in a sheltered water environment that simply would not be feasible along the open coast. By definition, these solutions cannot be undertaken by one entity alone.
Large-scale regional solutions are only possible through the hard work and collaboration of many. This paper will offer insights into the process of collaboration, initiated by the scientific and engineering communities, to influence and help direct major decisions about shoreline resiliency.
NASA Astrophysics Data System (ADS)
Bertin, Xavier; Chaumillon, Eric; Sottolichio, Aldo; Pedreros, Rodrigo
2005-06-01
Tidal inlet characteristics are controlled by wave energy, tidal range, tidal prism, sediment supply, and the direction and rates of sand delivered to the inlet. This paper deals with the relations between inlet and lagoon evolutions, linked by the tidal prism. Our study is focused on the Maumusson Inlet and the Marennes-Oléron Bay (the leading oyster farming area in Europe), located on the western coast of France. The tidal range (2-6 m) and wave climate (mean height: 1.5 m) place this tidal inlet system in the mixed-energy (tide, waves), tide-dominated category. The availability of high-resolution bathymetric data since 1824 makes it possible to characterise and accurately quantify morphological changes of both the inlet and the tidal bay. Since 1824, sediment filling of the tidal bay has led to a 20% decrease in its water volume and a 35% reduction of the inlet throat section. Furthermore, the bay is subjected to very high anthropogenic pressure, mainly related to oyster farming. Thus, both natural and human-related processes seem relevant to explain the high sedimentation rates. Current measurements, hydrodynamic modelling, and the cross-sectional area of the inlet throat are used to quantify tidal prism changes since 1824. Both the flood and ebb tidal prisms decreased by 35%. The decrease in the Marennes-Oléron Bay water volume is inferred to be responsible for part of the tidal prism decrease at the inlet. The tidal prism decrease may also be explained by an increase in frictional resistance to tidal wave propagation, due to general shoaling and oyster farms in the bay. A conceptual model is proposed, taking into account natural and human-related sedimentation processes and explaining tidal inlet response to tidal bay evolutions.
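The link between tidal prism and inlet throat cross-section used here is commonly described by an O'Brien-type power law, A = C·P^n. A quick back-of-envelope check (the exponent values are illustrative assumptions; the constant C cancels in the ratio) shows that a 35% prism reduction is broadly consistent with the observed 35% reduction in throat section:

```python
def throat_area_ratio(prism_ratio, n=0.85):
    # O'Brien-type relation A = C * P**n; the constant C cancels in a ratio.
    return prism_ratio ** n

# A 35% decrease in tidal prism leaves a remaining prism ratio of 0.65.
for n in (1.0, 0.85):
    reduction = 1 - throat_area_ratio(0.65, n)
    print(f"n = {n}: predicted throat-area reduction {reduction:.0%}")
```

With n = 1 the predicted area reduction equals the prism reduction (35%); with a sub-linear exponent it is slightly smaller (~31%), still close to the measured change.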
ERIC Educational Resources Information Center
Kreinberg, Nancy
The purpose of this publication is to stimulate interest in science and engineering careers in young women. Questionnaires were mailed to 450 women scientists and engineers in the San Francisco Bay Area, asking their assistance in developing a booklet to encourage young women toward scientific and mathematical studies. One hundred sixty women…
Effects of Storage on Sediment Toxicity, Bioaccumulation Potential, and Chemistry
1991-01-01
… and tested with organisms used by the US Army Engineer District, New York. Test sediments were collected from Westchester Creek (WC), Gowanus Bay … Ms. Carole Brown, ERSD, obtained the sediment samples. Dr. Eric Crecelius, Battelle Pacific Northwest Laboratories, Sequim, WA, coordinated chemical … Other sites, suspected of containing contaminated sediment, were Westchester Creek (WC), Gowanus Bay (GB), and Arthur Kill (AK), all located in …
Deep bore hole instrumentation along San Francisco Bay Bridges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakun, W.; Bowman, J.; Clymer, R.
1998-10-01
The Bay Bridges downhole network consists of sensors in boreholes drilled 100 ft into bedrock around and in San Francisco Bay. Between 2 and 8 instruments have been spaced along the Dumbarton, San Mateo, Bay, and San Rafael bridges. The instruments will provide multiple-use data that are important to geotechnical, structural engineering, and seismological studies. The holes are between 100 and 1000 ft deep and were drilled by Caltrans. There are twenty-one sensor packages at fifteen sites. Extensive financial support is being contributed by Caltrans, UCB, LBL, LLNL-LDRD, the U.C. Campus/Laboratory Collaboration (CLC) program, and USGS. The downhole instrument package contains a three-component HS-1 seismometer and three orthogonal Wilcoxon 731 accelerometers, and is capable of recording from a micro-g for local M = 1.0 earthquakes to 0.5 g strong ground motion from large Bay Area earthquakes.
Demas, Charles R.
1977-01-01
During October and November 1976 the U.S. Geological Survey, in cooperation with the U.S. Army Corps of Engineers, collected native water and core material from 14 sites along the Atchafalaya River in Louisiana (from the head of Whiskey Bay Pilot Channel to American Pass) and 5 sites in Atchafalaya Bay for evaluation of possible environmental effects of a proposed channel-enlargement project. Core material from all river sites and one bay site was collected to a depth of 50 feet (15 meters). At the remaining bay sites, samples were collected to a depth of less than 6 inches (15 centimeters) using a pipe dredge. Core material and native water were analyzed (separately and as elutriate samples prepared from mixtures) for selected metals, nutrients, organic compounds, and physical characteristics. No interpretation of the data is given. (Woodard-USGS)
Bayesian Inference of High-Dimensional Dynamical Ocean Models
NASA Astrophysics Data System (ADS)
Lin, J.; Lermusiaux, P. F. J.; Lolla, S. V. T.; Gupta, A.; Haley, P. J., Jr.
2015-12-01
This presentation addresses a holistic set of challenges in high-dimension ocean Bayesian nonlinear estimation: i) predict the probability distribution functions (pdfs) of large nonlinear dynamical systems using stochastic partial differential equations (PDEs); ii) assimilate data using Bayes' law with these pdfs; iii) predict the future data that optimally reduce uncertainties; and (iv) rank the known and learn the new model formulations themselves. Overall, we allow the joint inference of the state, equations, geometry, boundary conditions and initial conditions of dynamical models. Examples are provided for time-dependent fluid and ocean flows, including cavity, double-gyre and Strait flows with jets and eddies. The Bayesian model inference, based on limited observations, is illustrated first by the estimation of obstacle shapes and positions in fluid flows. Next, the Bayesian inference of biogeochemical reaction equations and of their states and parameters is presented, illustrating how PDE-based machine learning can rigorously guide the selection and discovery of complex ecosystem models. Finally, the inference of multiscale bottom gravity current dynamics is illustrated, motivated in part by classic overflows and dense water formation sites and their relevance to climate monitoring and dynamics. This is joint work with our MSEAS group at MIT.
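The data-assimilation challenge (assimilating observations with Bayes' law) reduces, in the linear-Gaussian special case, to the familiar conjugate update of a state estimate. A scalar sketch with arbitrary numbers, purely to show the structure of the Bayes update:

```python
# Scalar Gaussian-conjugate Bayes update (the analysis step of a Kalman filter).
m0, P0 = 1.0, 4.0   # prior mean and variance of a state variable
y, R = 3.0, 1.0     # observation and observation-error variance

K = P0 / (P0 + R)           # gain: how much to trust the observation
m1 = m0 + K * (y - m0)      # posterior mean, pulled toward the observation
P1 = (1.0 - K) * P0         # posterior variance, always reduced by the data

print(m1, P1)
```

The full problem replaces this scalar with pdfs of a high-dimensional PDE state, but the same prior-times-likelihood logic applies at each assimilation step.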
POPPER, a simple programming language for probabilistic semantic inference in medicine.
Robson, Barry
2015-01-01
Our previous reports described the use of the Hyperbolic Dirac Net (HDN) as a method for probabilistic inference from medical data, and proposed a probabilistic medical Semantic Web (SW) language, Q-UEL, to provide that data. Like a traditional Bayes net, the HDN provided estimates of joint and conditional probabilities, and was static, with no need for evolution due to "reasoning". Use of the SW will require, however, (a) at least the semantic triple, with more elaborate relations than conditional ones, as seen in the use of most verbs and prepositions, and (b) rules for logical, grammatical, and definitional manipulation that can generate changes in the inference net. Here the simple POPPER language for medical inference is described. It can be written automatically by Q-UEL, or by hand. Based on studies with our medical students, it is believed that a tool like this may help in medical education and that a physician unfamiliar with SW science can understand it. It is here used to explore the considerable challenges of assigning probabilities, and not least what the meaning and utility of inference net evolution would be for a physician. Copyright © 2014 Elsevier Ltd. All rights reserved.
A Habitable Fluvio-Lacustrine Environment at Yellowknife Bay, Gale Crater, Mars
NASA Technical Reports Server (NTRS)
Grotzinger, J. P.; Sumner, D. Y.; Kah, L. C.; Stack, K.; Gupta, S.; Edgar, L.; Rubin, D.; Lewis, K.; Schieber, J.; Mangold, N.;
2013-01-01
The Curiosity rover discovered fine-grained sedimentary rocks, which are inferred to represent an ancient lake and preserve evidence of an environment that would have been suited to support a martian biosphere founded on chemolithoautotrophy. This aqueous environment was characterized by neutral pH, low salinity, and variable redox states of both iron and sulfur species. Carbon, hydrogen, oxygen, sulfur, nitrogen, and phosphorus were measured directly as key biogenic elements; by inference, phosphorus is assumed to have been available. The environment probably had a minimum duration of hundreds to tens of thousands of years. These results highlight the biological viability of fluvial-lacustrine environments in the post-Noachian history of Mars.
Questions Revisited: A Close Examination of Calculus of Inference and Inquiry
NASA Technical Reports Server (NTRS)
Knuth, Kevin H.; Koga, Dennis (Technical Monitor)
2003-01-01
In this paper I examine more closely the way in which probability theory, the calculus of inference, is derived from the Boolean lattice structure of logical assertions ordered by implication. I demonstrate how the duality between the logical conjunction and disjunction in Boolean algebra is lost when deriving the probability calculus. In addition, I look more closely at the other lattice identities to verify that they are satisfied by the probability calculus. Last, I look towards developing the calculus of inquiry demonstrating that there is a sum and product rule for the relevance measure as well as a Bayes theorem. Current difficulties in deriving the complete inquiry calculus will also be discussed.
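The derivation alluded to here follows the standard route: applying the product rule in both orders to the same conjunction forces Bayes' theorem. Sketched with C denoting the prior information:

```latex
p(AB \mid C) = p(A \mid C)\, p(B \mid AC) = p(B \mid C)\, p(A \mid BC)
\quad\Longrightarrow\quad
p(A \mid BC) = \frac{p(A \mid C)\, p(B \mid AC)}{p(B \mid C)}
```

The asymmetry the paper highlights is that conjunction acquires this product rule in the probability calculus while its Boolean dual, disjunction, gets only the sum rule p(A + B | C) = p(A | C) + p(B | C) - p(AB | C), breaking the duality of the underlying lattice.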
A habitable fluvio-lacustrine environment at Yellowknife Bay, Gale crater, Mars.
Grotzinger, J P; Sumner, D Y; Kah, L C; Stack, K; Gupta, S; Edgar, L; Rubin, D; Lewis, K; Schieber, J; Mangold, N; Milliken, R; Conrad, P G; DesMarais, D; Farmer, J; Siebach, K; Calef, F; Hurowitz, J; McLennan, S M; Ming, D; Vaniman, D; Crisp, J; Vasavada, A; Edgett, K S; Malin, M; Blake, D; Gellert, R; Mahaffy, P; Wiens, R C; Maurice, S; Grant, J A; Wilson, S; Anderson, R C; Beegle, L; Arvidson, R; Hallet, B; Sletten, R S; Rice, M; Bell, J; Griffes, J; Ehlmann, B; Anderson, R B; Bristow, T F; Dietrich, W E; Dromart, G; Eigenbrode, J; Fraeman, A; Hardgrove, C; Herkenhoff, K; Jandura, L; Kocurek, G; Lee, S; Leshin, L A; Leveille, R; Limonadi, D; Maki, J; McCloskey, S; Meyer, M; Minitti, M; Newsom, H; Oehler, D; Okon, A; Palucis, M; Parker, T; Rowland, S; Schmidt, M; Squyres, S; Steele, A; Stolper, E; Summons, R; Treiman, A; Williams, R; Yingst, A
2014-01-24
Petrology and tectonic history of the Green Bay Schist, Portmore, St. Catherine Parish, Jamaica
Abbott, Richard N.; West, David P.; Bandy, Betsy R.; McAleer, Ryan J.
2016-01-01
There are three occurrences of medium- to high-grade metamorphic rocks in Jamaica: the amphibolite facies Westphalia Schist, the blueschist/greenschist facies Mt. Hibernia Schist, and the hitherto poorly characterized amphibolite facies Green Bay Schist. New trace element data and thermodynamic calculations show that the Green Bay Schist is closely related to the Westphalia Schist. The protoliths for both are very similar (basalt-andesitic basalt, C-MORB), consistent with a subducted ocean-ridge tectonic environment, hence arc-related. The protolith for the Mt. Hibernia Schist is quite different (basalt, P-MORB), related to the Caribbean Large Igneous Province. Whereas the P-T-t paths for the Green Bay Schist and Westphalia Schist prior to the middle Campanian (>78 Ma) are inferred to be similar, the late Campanian, Maastrichtian, and Cenozoic P-T-t paths are very different. New 40Ar/39Ar age determinations show the following: (1) the difference in the late Campanian and Maastrichtian remains problematic; (2) the difference in the Cenozoic clearly reflects location relative to the NW-trending, NE-dipping Wagwater Fault: Westphalia Schist to the NE (hanging wall), Green Bay Schist to the SW (footwall). The Cenozoic P-T-t paths are complementary and consistent with the behavior of the Wagwater Fault: 65-50 Ma, normal motion (transtension); 50-10 Ma, inactive (quiescent); 10 Ma-present, reverse motion (transpression).
d'Alessio, M. A.; Johanson, I.A.; Burgmann, R.; Schmidt, D.A.; Murray, M.H.
2005-01-01
Observations of surface deformation allow us to determine the kinematics of faults in the San Francisco Bay Area. We present the Bay Area velocity unification (BAVU, "bay view"), a compilation of over 200 horizontal surface velocities computed from campaign-style and continuous Global Positioning System (GPS) observations from 1993 to 2003. We interpret this interseismic velocity field using a three-dimensional block model to determine the relative contributions of block motion, elastic strain accumulation, and shallow aseismic creep. The total relative motion between the Pacific plate and the rigid Sierra Nevada/Great Valley (SNGV) microplate is 37.9 ± 0.6 mm yr-1 directed toward N30.4°W ± 0.8° at San Francisco (2σ). Fault slip rates from our preferred model are typically within the error bounds of geologic estimates but provide a better fit to geodetic data (notable right-lateral slip rates in mm yr-1: San Gregorio fault, 2.4 ± 1.0; West Napa fault, 4.0 ± 3.0; zone of faulting along the eastern margin of the Coast Range, 5.4 ± 1.0; and Mount Diablo thrust, 3.9 ± 1.0 of reverse slip and 4.0 ± 0.2 of right-lateral strike slip). Slip on the northern Calaveras is partitioned between both the West Napa and Concord/Green Valley fault systems. The total convergence across the Bay Area is negligible. Poles of rotation for Bay Area blocks progress systematically from the North America-Pacific to North America-SNGV poles. The resulting present-day relative motion cannot explain the strike of most Bay Area faults, but fault strike does loosely correlate with inferred plate motions at the time each fault initiated. Copyright 2005 by the American Geophysical Union.
Lignin phenols used to infer organic matter sources to Sepetiba Bay - RJ, Brasil
NASA Astrophysics Data System (ADS)
Rezende, C. E.; Pfeiffer, W. C.; Martinelli, L. A.; Tsamakis, E.; Hedges, J. I.; Keil, R. G.
2010-04-01
Lignin phenols were measured in the sediments of Sepetiba Bay, Rio de Janeiro, Brazil, and in bedload and suspended sediments of the four major fluvial inputs to the bay: the São Francisco and Guandu Channels and the Guarda and Cação Rivers. Fluvial suspended lignin yields (Σ8: 3.5-14.6 mg C (10 g dw)-1) vary little between the wet and dry seasons and are poorly correlated with fluvial chlorophyll concentrations (0.8-50.2 μg C L-1). Despite current land use practices that favor grassland agriculture or industrial uses, fluvial lignin compositions are dominated by degraded leaf-sourced material. The exception is the Guarda River, which shows a slight influence from grasses. The Lignin Phenol Vegetation Index (LPVI), coupled with acid/aldehyde and 3,5-Bd/V ratios, indicates that degraded leaf-derived phenols are also the primary preserved lignin component in the bay. The presence of fringing Typha sp. and Spartina sp. grass beds surrounding portions of the bay is not reflected in the lignin signature. Instead, lignin entering the bay appears to reflect the erosion of soils carrying a degraded signature from the former Atlantic rain forest that once dominated the watershed, rather than a significant signature derived from current agricultural uses. A three-component mixing model using the LPVI, atomic N:C ratios, and stable carbon isotopes (which range between -26.8 and -21.8‰) supports the hypothesis that fluvial inputs to the bay are dominated by planktonic matter (78% of the input), with the lignin fraction dominated by leaf (14%) over grass (6%) sources. Sediments are composed of a roughly 50:50 mixture of autochthonous and terrigenous material, with the lignin primarily leaf-sourced.
ERIC Educational Resources Information Center
Page, Robert; Satake, Eiki
2017-01-01
While interest in Bayesian statistics has been growing in statistics education, the treatment of the topic is still inadequate in both textbooks and the classroom. Because so many fields of study lead to careers that involve a decision-making process requiring an understanding of Bayesian methods, it is becoming increasingly clear that Bayesian…
IMAGINE: Interstellar MAGnetic field INference Engine
NASA Astrophysics Data System (ADS)
Steininger, Theo
2018-03-01
IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.
Decision Support Systems for Launch and Range Operations Using Jess
NASA Technical Reports Server (NTRS)
Thirumalainambi, Rajkumar
2007-01-01
The virtual test bed for launch and range operations developed at NASA Ames Research Center consists of various independent expert systems advising on weather effects, toxic gas dispersion, and human health risk assessment during space-flight operations. An individual dedicated server supports each expert system, and a master system gathers information from the dedicated servers to support the launch decision-making process. Since the test bed is web-based, reducing network traffic and optimizing the knowledge base are critical to the success of real-time or near real-time operations. Jess, a fast rule engine and powerful scripting environment developed at Sandia National Laboratory, has been adopted to build the expert systems, providing robustness and scalability. Jess also supports XML representation of the knowledge base, with forward- and backward-chaining inference mechanisms. Facts added to working memory during run-time operations facilitate analyses of multiple scenarios. The knowledge base can be distributed, with one inference engine performing the inference process. This paper discusses details of the knowledge base and inference engine using Jess for a launch and range virtual test bed.
A variational Bayes discrete mixture test for rare variant association
Logsdon, Benjamin A.; Dai, James Y.; Auer, Paul L.; Johnsen, Jill M.; Ganesh, Santhi K.; Smith, Nicholas L.; Wilson, James G.; Tracy, Russell P.; Lange, Leslie A.; Jiao, Shuo; Rich, Stephen S.; Lettre, Guillaume; Carlson, Christopher S.; Jackson, Rebecca D.; O’Donnell, Christopher J.; Wurfel, Mark M.; Nickerson, Deborah A.; Tang, Hua; Reiner, Alexander P.; Kooperberg, Charles
2014-01-01
Recently, many statistical methods have been proposed to test for associations between rare genetic variants and complex traits. Most of these methods test for association by aggregating genetic variations within a predefined region, such as a gene. Although there is evidence that “aggregate” tests are more powerful than the single marker test, these tests generally ignore neutral variants and therefore are unable to identify specific variants driving the association with phenotype. We propose a novel aggregate rare-variant test that explicitly models a fraction of variants as neutral, tests associations at the gene level, and infers the rare variants driving the association. Simulations show that in the practical scenario where there are many variants within a given region of the genome with only a fraction causal, our approach has greater power compared to other popular tests such as the Sequence Kernel Association Test (SKAT), the Weighted Sum Statistic (WSS), and the collapsing method of Morris and Zeggini (MZ). Our algorithm leverages a fast variational Bayes approximate inference methodology to scale to exome-wide analyses, a significant computational advantage over exact inference model selection methodologies. To demonstrate the efficacy of our methodology we test for associations between von Willebrand Factor (VWF) levels and rare missense variants within the VWF gene imputed from the National Heart, Lung, and Blood Institute’s Exome Sequencing Project into 2,487 African Americans. Our method suggests that a relatively small fraction (~10%) of the imputed rare missense variants within VWF are strongly associated with lower VWF levels in African Americans. PMID:24482836
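The "fraction of variants neutral" idea is essentially a spike-and-slab regression, and the fast variational inference the abstract mentions can be sketched with the classic coordinate-ascent updates for Bayesian variable selection. The hyperparameters, effect sizes, and data-generating settings below are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Simulated rare-variant genotypes: 3 causal variants, 27 neutral ones.
rng = np.random.default_rng(0)
n, p = 800, 30
X = rng.binomial(2, 0.05, size=(n, p)).astype(float)
beta = np.zeros(p)
beta[:3] = -1.5                      # causal variants lower the trait
y = X @ beta + rng.normal(size=n)

# Hyperparameters (treated as known for this sketch).
sigma2, sigma2_w, pi = 1.0, 1.0, 0.1
alpha = np.full(p, pi)               # variational inclusion probabilities
mu = np.zeros(p)                     # variational slab means
xx = (X ** 2).sum(axis=0)

# Coordinate-ascent variational updates for spike-and-slab regression.
for _ in range(50):
    for j in range(p):
        # Residual excluding variant j's current contribution.
        r = y - X @ (alpha * mu) + X[:, j] * (alpha[j] * mu[j])
        s2 = sigma2 / (xx[j] + sigma2 / sigma2_w)
        mu[j] = (s2 / sigma2) * (X[:, j] @ r)
        logit = (np.log(pi / (1 - pi))
                 + 0.5 * np.log(s2 / sigma2_w)
                 + mu[j] ** 2 / (2 * s2))
        alpha[j] = 1.0 / (1.0 + np.exp(-logit))

print(np.round(alpha, 2))  # inclusion probabilities per variant
```

Each update has closed form, so the whole fit is a few matrix-vector products per variant; this is the kind of cost profile that lets variational methods scale to exome-wide analyses where exact model-selection posteriors are intractable.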
The problem of estimating recent genetic connectivity in a changing world.
Samarasin, Pasan; Shuter, Brian J; Wright, Stephen I; Rodd, F Helen
2017-02-01
Accurate understanding of population connectivity is important to conservation because dispersal can play an important role in population dynamics, microevolution, and assessments of extirpation risk and population rescue. Genetic methods are increasingly used to infer population connectivity because advances in technology have made them more advantageous (e.g., cost effective) relative to ecological methods. Given the reductions in wildlife population connectivity since the Industrial Revolution and more recent drastic reductions from habitat loss, it is important to know the accuracy of and biases in genetic connectivity estimators when connectivity has declined recently. Using simulated data, we investigated the accuracy and bias of two common estimators of the migration rate (the rate of movement of individuals among populations). We focused on the effects of the timing and magnitude of the connectivity change on estimates of migration obtained using a coalescent-based method (Migrate-n) and a disequilibrium-based method (BayesAss). Contrary to expectations, when historically high connectivity had declined recently: (i) both methods over-estimated recent migration rates; (ii) the coalescent-based method (Migrate-n) provided better estimates of recent migration rate than the disequilibrium-based method (BayesAss); (iii) the coalescent-based method did not accurately reflect long-term genetic connectivity. Overall, our results highlight the problems with comparing coalescent and disequilibrium estimates to make inferences about the effects of recent landscape change on genetic connectivity among populations. We found that contrasting these two estimates to make inferences about genetic-connectivity changes over time could lead to inaccurate conclusions. © 2016 Society for Conservation Biology.
Emmert-Streib, Frank; Glazko, Galina V.; Altay, Gökmen; de Matos Simoes, Ricardo
2012-01-01
In this paper, we present a systematic and conceptual overview of methods for inferring gene regulatory networks from observational gene expression data. Further, we discuss two classic approaches to infer causal structures and compare them with contemporary methods by providing a conceptual categorization thereof. We complement the above by surveying global and local evaluation measures for assessing the performance of inference algorithms. PMID:22408642
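The simplest members of the family surveyed here are relevance networks: estimate a pairwise association measure (e.g., Pearson correlation) between expression profiles and threshold it to obtain an undirected graph. A toy sketch with three simulated genes; the regulatory structure, sample size, and 0.5 threshold are arbitrary illustrations:

```python
import numpy as np

# Toy expression data: g2 is regulated by g1; g3 is independent.
rng = np.random.default_rng(1)
n_samples = 200
g1 = rng.normal(size=n_samples)
g2 = 0.9 * g1 + rng.normal(scale=0.3, size=n_samples)
g3 = rng.normal(size=n_samples)
expr = np.vstack([g1, g2, g3])       # genes x samples

# Relevance network: threshold the absolute pairwise correlation.
corr = np.corrcoef(expr)
adj = (np.abs(corr) > 0.5) & ~np.eye(3, dtype=bool)

print(adj.astype(int))
```

Such undirected, correlation-based graphs are the baseline against which causal-structure methods (which additionally orient edges and remove indirect associations) are usually compared.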
The Importance of Statistical Modeling in Data Analysis and Inference
ERIC Educational Resources Information Center
Rollins, Derrick, Sr.
2017-01-01
Statistical inference simply means to draw a conclusion based on information that comes from data. Error bars are the most commonly used tool for data analysis and inference in chemical engineering data studies. This work demonstrates, using common types of data collection studies, the importance of specifying the statistical model for sound…
Sand waves at the mouth of San Francisco Bay, California
Gibbons, Helen; Barnard, Patrick L.
2007-01-01
The U.S. Geological Survey; California State University, Monterey Bay; U.S. Army Corps of Engineers; National Oceanic and Atmospheric Administration; and Center for Integrative Coastal Observation, Research and Education partnered to map central San Francisco Bay and its entrance under the Golden Gate Bridge using multibeam echosounders. View eastward, through the Golden Gate into central San Francisco Bay. Depth of sea floor color coded: red (less than 10 m deep) to purple (more than 100 m deep). Land from USGS digital orthophotographs (DOQs) overlaid on USGS digital elevation models (DEMs). Sand waves in this view average 6 m in height and 80 m from crest to crest. Golden Gate Bridge is about 2 km long. Vertical exaggeration is approximately 4x for sea floor, 2x for land.
1983-04-01
[Scanned DTIC report-form residue; only fragments are recoverable: a report on a bay in Tacoma, Washington, prepared by the Fisheries Research Institute (Nakatani), School of Fisheries, University of Washington, Seattle, Washington 98195, with the US Army Corps of Engineers as controlling office.]
2013-12-20
MORRO BAY, Calif. – The SpaceX Dragon test article tumbles over the Pacific Ocean, off the coast of Morro Bay, Calif., following its release from an Erickson Sky Crane helicopter. SpaceX engineers induced the tumble to evaluate the spacecraft's parachute deployment system in an emergency abort scenario. The test is part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
2013-12-20
MORRO BAY, Calif. – An Erickson Sky Crane helicopter releases the SpaceX Dragon test article, inducing a tumble similar to what is expected in an emergency abort scenario, over the Pacific Ocean, off the coast of Morro Bay, Calif. The test allowed engineers to better evaluate the spacecraft's parachute deployment system as part of a milestone under its Commercial Crew Integrated Capability agreement with NASA's Commercial Crew Program. Photo credit: NASA/Kim Shiflett
33 CFR 334.220 - Chesapeake Bay, south of Tangier Island, Va.; naval firing range.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.220 Chesapeake Bay, south of Tangier Island, Va.; naval firing range. (a) The danger zone. Beginning... to latitude 37°45′00″, longitude 76°09′48″; thence to latitude 37°45′00″, longitude 76°08′51″; and...
33 CFR 334.220 - Chesapeake Bay, south of Tangier Island, Va.; naval firing range.
Code of Federal Regulations, 2010 CFR
2010-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.220 Chesapeake Bay, south of Tangier Island, Va.; naval firing range. (a) The danger zone. Beginning... to latitude 37°45′00″, longitude 76°09′48″; thence to latitude 37°45′00″, longitude 76°08′51″; and...
33 CFR 334.220 - Chesapeake Bay, south of Tangier Island, Va.; naval firing range.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.220 Chesapeake Bay, south of Tangier Island, Va.; naval firing range. (a) The danger zone. Beginning... to latitude 37°45′00″, longitude 76°09′48″; thence to latitude 37°45′00″, longitude 76°08′51″; and...
33 CFR 334.220 - Chesapeake Bay, south of Tangier Island, Va.; naval firing range.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.220 Chesapeake Bay, south of Tangier Island, Va.; naval firing range. (a) The danger zone. Beginning... to latitude 37°45′00″, longitude 76°09′48″; thence to latitude 37°45′00″, longitude 76°08′51″; and...
33 CFR 334.220 - Chesapeake Bay, south of Tangier Island, Va.; naval firing range.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA REGULATIONS § 334.220 Chesapeake Bay, south of Tangier Island, Va.; naval firing range. (a) The danger zone. Beginning... to latitude 37°45′00″, longitude 76°09′48″; thence to latitude 37°45′00″, longitude 76°08′51″; and...
Spatiotemporal analysis of gene flow in Chesapeake Bay Diamondback Terrapins (Malaclemys terrapin)
Converse, Paul E.; Kuchta, Shawn R; Roosenburg, Willem R; Henry, Paula F.; Haramis, G. Michael; King, Timothy L.
2015-01-01
There is widespread concern regarding the impacts of anthropogenic activities on connectivity among populations of plants and animals, and understanding how contemporary and historical processes shape metapopulation dynamics is crucial for setting appropriate conservation targets. We used genetic data to identify population clusters and quantify gene flow over historical and contemporary time frames in the Diamondback Terrapin (Malaclemys terrapin). This species has a long and complicated history with humans, including commercial over-harvesting and subsequent translocation events during the early twentieth century. Today, terrapins face threats from habitat loss and mortality in fisheries bycatch. To evaluate population structure and gene flow among Diamondback Terrapin populations in the Chesapeake Bay region, we sampled 617 individuals from 15 localities, and screened individuals at 12 polymorphic microsatellite loci. Our goals were to demarcate metapopulation structure, quantify genetic diversity, estimate effective population sizes, and document temporal changes in gene flow. We found that terrapins in the Chesapeake Bay region harbor high levels of genetic diversity and form four populations. Effective population sizes were variable. Among most population comparisons, estimates of historical and contemporary terrapin gene flow were generally low (m ≈ 0.01). However, we detected a substantial increase in contemporary gene flow into Chesapeake Bay from populations outside the bay, as well as between two populations within Chesapeake Bay, possibly as a consequence of translocations during the early twentieth century. Our study shows that inferences across multiple time scales are needed to evaluate population connectivity, especially as recent changes may identify threats to population persistence.
Bayes and empirical Bayes estimators of abundance and density from spatial capture-recapture data
Dorazio, Robert M.
2013-01-01
In capture-recapture and mark-resight surveys, movements of individuals both within and between sampling periods can alter the susceptibility of individuals to detection over the region of sampling. In these circumstances spatially explicit capture-recapture (SECR) models, which incorporate the observed locations of individuals, allow population density and abundance to be estimated while accounting for differences in detectability of individuals. In this paper I propose two Bayesian SECR models, one for the analysis of recaptures observed in trapping arrays and another for the analysis of recaptures observed in area searches. In formulating these models I used distinct submodels to specify the distribution of individual home-range centers and the observable recaptures associated with these individuals. This separation of ecological and observational processes allowed me to derive a formal connection between Bayes and empirical Bayes estimators of population abundance that has not been established previously. I showed that this connection applies to every Poisson point-process model of SECR data and provides theoretical support for a previously proposed estimator of abundance based on recaptures in trapping arrays. To illustrate results of both classical and Bayesian methods of analysis, I compared Bayes and empirical Bayes estimates of abundance and density using recaptures from simulated and real populations of animals. Real populations included two iconic datasets: recaptures of tigers detected in camera-trap surveys and recaptures of lizards detected in area-search surveys. In the datasets I analyzed, classical and Bayesian methods provided similar – and often identical – inferences, which is not surprising given the sample sizes and the noninformative priors used in the analyses.
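The Bayes/empirical Bayes connection for Poisson counts can be pictured with a conjugate gamma-Poisson sketch, a deliberately simplified stand-in for the SECR point-process models in the abstract; the moment-based prior fit and the function name are illustrative assumptions:

```python
def eb_poisson_posterior_means(counts):
    # Empirical Bayes for counts x_i ~ Poisson(lam_i), lam_i ~ gamma(a, b).
    # The marginal (negative binomial) has mean m = a/b and variance
    # v = a/b + a/b**2, so method-of-moments gives b = m/(v-m), a = m*b.
    # The posterior mean for each count is then E[lam | x] = (a + x)/(b + 1).
    n = len(counts)
    m = sum(counts) / n
    v = sum((x - m) ** 2 for x in counts) / n
    if v <= m:                       # no overdispersion: nothing to shrink
        return [float(m)] * n
    b = m / (v - m)
    a = m * b
    return [(a + x) / (b + 1.0) for x in counts]
```

Each estimate is shrunk toward the overall mean, with the amount of shrinkage set by the fitted prior; a full Bayes analysis would instead place a hyperprior on (a, b).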
NASA Astrophysics Data System (ADS)
Nguyen, Emmanuel; Antoni, Jerome; Grondin, Olivier
2009-12-01
In the automotive industry, the necessary reduction of pollutant emissions for new Diesel engines requires the control of combustion events. This control is efficient provided combustion parameters such as combustion occurrence and combustion energy are relevant. Combustion parameters are traditionally measured from cylinder pressure sensors. However, this kind of sensor is expensive and has a limited lifetime. Thus this paper proposes to use only one cylinder pressure sensor on a multi-cylinder engine and to extract combustion parameters from the other cylinders with low-cost knock sensors. Knock sensors measure the vibration circulating on the engine block; hence they do not only contain information on the combustion processes, but are also contaminated by other mechanical noises that corrupt the signal. The question is how to combine the information coming from one cylinder pressure sensor and the knock sensors to obtain the most relevant combustion parameters in all engine cylinders. In this paper, the issue is addressed through the Bayesian inference formalism. In the cylinder where a cylinder pressure sensor is mounted, combustion parameters are measured directly. In the other cylinders, they are measured indirectly via Bayesian inference. Experimental results obtained on a four-cylinder Diesel engine demonstrate the effectiveness of the proposed algorithm for this purpose.
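The idea of fusing a direct measurement with noisy indirect ones can be pictured with a conjugate normal update, where the instrumented cylinder supplies the prior and a knock-sensor feature supplies noisy observations. This is a toy sketch under assumed Gaussian noise, not the paper's actual model; the function name and variances are illustrative:

```python
def normal_posterior(mu0, var0, obs, obs_var):
    # Prior theta ~ N(mu0, var0); each observation y_i ~ N(theta, obs_var).
    # Conjugacy gives a closed-form Gaussian posterior (precision-weighted mean).
    prec = 1.0 / var0 + len(obs) / obs_var
    mean = (mu0 / var0 + sum(obs) / obs_var) / prec
    return mean, 1.0 / prec

# Prior from the pressure-instrumented cylinder, two noisy knock-sensor readings:
mean, var = normal_posterior(mu0=0.0, var0=1.0, obs=[2.0, 2.0], obs_var=1.0)
```

The posterior mean lands between the prior and the observations, weighted by their respective precisions, which is the basic mechanism any Bayesian sensor-fusion scheme exploits.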
Engineering Management Board Tour VAB
2017-03-22
Members of NASA’s Engineering Management Board pause for a group photo during a tour of the Vehicle Assembly Building at Kennedy Space Center in Florida. The platforms in High Bay 3, including the one on which the board members are standing, were designed to surround and provide access to NASA’s Space Launch System and Orion spacecraft. The Engineering Management Board toured integral areas of Kennedy to help the agencywide group reach its goal of unifying engineering work across NASA.
NASA Astrophysics Data System (ADS)
Davias, M. E.; Gilbride, J. L.
2011-12-01
Aerial photographs of Carolina bays taken in the 1930s sparked the initial research into their geomorphology. Satellite imagery available today through the Google Earth virtual globe facility expands the regions available for interrogation, but reveals only part of their unique planforms. Digital Elevation Maps (DEMs), using Light Detection And Ranging (LiDAR) remote sensing data, accentuate the visual presentation of these aligned ovoid shallow basins by emphasizing their robust circumferential rims. To support a geospatial survey of Carolina bay landforms in the continental USA, 400,000 km2 of hsv-shaded DEMs were created as KML-JPEG tile sets. A majority of these DEMs were generated with LiDAR-derived data. We demonstrate the tile generation process and their integration into Google Earth, where the DEMs augment available photographic imagery for the visualization of bay planforms. While the generic Carolina bay planform is considered oval, we document subtle regional variations. Using a small set of empirically derived planform shapes, we created corresponding Google Earth overlay templates. We demonstrate the analysis of an individual Carolina bay by placing an appropriate overlay onto the virtual globe, then orienting, sizing, and rotating it with edit handles such that it satisfactorily represents the bay's rim. The resulting overlay data element is extracted from Google Earth's object directory and programmatically processed to generate metrics such as geographic location, elevation, major and minor axes, and inferred orientation. Utilizing a virtual globe facility for data capture may result in higher-quality data compared to methods that reference flat maps, where the geospatial shape and orientation of the bays could be skewed and distorted in the orthographic projection process. Using the methodology described, we have measured over 25k distinct Carolina bays.
We discuss the Google Fusion geospatial data repository facility, through which these data have been assembled and made web-accessible to other researchers. Preliminary findings from the survey are discussed, such as how bay surface area, eccentricity and orientation vary across ~800 1/4° × 1/4° grid elements. Future work includes measuring 25k additional bays, as well as interrogation of the orientation data to identify any possible systematic geospatial relationships.
Kashyap, Vipul; Morales, Alfredo; Hongsermeier, Tonya
2006-01-01
We present an approach and architecture for implementing scalable and maintainable clinical decision support at the Partners HealthCare System. The architecture integrates a business rules engine that executes declarative if-then rules stored in a rule base referencing objects and methods in a business object model. The rules engine executes object methods by invoking services implemented on the clinical data repository. Specialized inferences that support classification of data and instances into classes are identified, and an approach to implementing these inferences using an OWL-based ontology engine is presented. Alternative representations of these specialized inferences as if-then rules or OWL axioms are explored, and their impact on the scalability and maintenance of the system is presented. Architectural alternatives for integrating clinical decision support functionality with the invoking application and the underlying clinical data repository, and their associated trade-offs, are discussed.
Comparing two Bayes methods based on the free energy functions in Bernoulli mixtures.
Yamazaki, Keisuke; Kaji, Daisuke
2013-08-01
Hierarchical learning models are ubiquitously employed in information science and data engineering. Their structure makes the posterior distribution complicated in the Bayes method, so prediction, which requires construction of the posterior, is not tractable, although the advantages of the method are empirically well known. The variational Bayes method is widely used as an approximation method in applications; it has a tractable posterior based on the variational free energy function. The asymptotic behavior has been studied in many hierarchical models, and a phase transition is observed. The exact form of the asymptotic variational Bayes energy has been derived in Bernoulli mixture models, and the phase diagram shows that there are three types of parameter learning. However, the approximation accuracy and the interpretation of the transition point have not yet been clarified. The present paper precisely analyzes the Bayes free energy function of Bernoulli mixtures. By comparing the free energy functions of these two Bayes methods, we can determine the approximation accuracy and elucidate the behavior of the parameter learning. Our results show that the Bayes free energy has the same learning types, while the transition points are different. Copyright © 2013 Elsevier Ltd. All rights reserved.
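For concreteness, the marginal likelihood whose negative logarithm both free energy functions approximate can be written down directly for a small Bernoulli mixture (an illustrative sketch; the function name is an assumption):

```python
import math

def bernoulli_mixture_loglik(x, weights, probs):
    # Log-likelihood of binary vectors x under a Bernoulli mixture:
    #   p(x_i) = sum_k weights[k] * prod_d probs[k][d]^x_id (1-probs[k][d])^(1-x_id)
    total = 0.0
    for xi in x:
        mix = 0.0
        for w, p in zip(weights, probs):
            comp = w
            for xd, pd in zip(xi, p):
                comp *= pd if xd else (1.0 - pd)
            mix += comp
        total += math.log(mix)
    return total
```

Summing over components inside the logarithm is exactly what makes the posterior over component assignments (and hence the exact Bayes free energy) hard, and what the variational bound replaces with a tractable factorized form.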
1989-01-01
In this 1989 artist's concept, the Shuttle-C floats in space with its cargo bay doors open. As envisioned by Marshall Space Flight Center planners, the Shuttle-C would be an unmanned heavy-lift cargo vehicle derived from Space Shuttle elements. The vehicle would utilize the basic Shuttle propulsion units (Solid Rocket Boosters, Space Shuttle Main Engine, External Tank), but would replace the Orbiter with an unmanned Shuttle-C Cargo Element (SCE). The SCE would have a payload bay length of eighty-two feet, compared to sixty feet for the Orbiter cargo bay, and would be able to deliver 170,000-pound payloads to low Earth orbit, more than three times the Orbiter's capacity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sunendra, Joshi R.; Kukkadapu, Ravi K.; Burdige, David J.
2015-05-19
The Chesapeake Bay, the largest and most productive estuary in the US, suffers from varying degrees of water quality issues fueled by both point and non-point nutrient sources. Restoration of the bay is complicated by the multitude of nutrient sources, their variable inputs and hydrological conditions, and complex interacting factors including climate forcing. These complexities not only restrict formulation of effective restoration plans but also open up debates on accountability issues with nutrient loading. A detailed understanding of sediment phosphorus (P) dynamics enables one to identify the exchange of dissolved constituents across the sediment-water interface and aids in better constraining the mechanisms and processes controlling the coupling between the sediments and the overlying waters. Here we used phosphate oxygen isotope ratios (δ18Op) in concert with sediment chemistry, XRD, and Mössbauer spectroscopy on sediment retrieved from an organic-rich, sulfidic site in the meso-haline portion of the mid-bay to identify the sources and pathways of sedimentary P cycling and to infer potential feedback effects on bottom-water hypoxia and surface-water eutrophication. Isotope data indicate that the regeneration of inorganic P from organic matter degradation (remineralization) is the predominant, if not sole, pathway for authigenic P precipitation in the mid-bay sediments. We interpret that the excess inorganic P generated by remineralization should have overwhelmed any bottom-water and/or pore-water P derived from other sources or biogeochemical processes and exceeded saturation with respect to authigenic P precipitation. This is the first study to identify the predominance of the remineralization pathway over the remobilization (coupled Fe-P cycling) pathway in the Chesapeake Bay.
Therefore, these results are expected to have significant implications for the current understanding of P cycling and benthic-pelagic coupling in the bay, particularly on the source and pathway of P that sustains hypoxia and supports phytoplankton growth in the surface water.
Hydrogeologic setting and ground water flow beneath a section of Indian River Bay, Delaware
Krantz, David E.; Manheim, Frank T.; Bratton, John F.; Phelan, Daniel J.
2004-01-01
The small bays along the Atlantic coast of the Delmarva Peninsula (Delaware, Maryland, and Virginia) are a valuable natural resource, and an asset for commerce and recreation. These coastal bays also are vulnerable to eutrophication from the input of excess nutrients derived from agriculture and other human activities in the watersheds. Ground water discharge may be an appreciable source of fresh water and a transport pathway for nutrients entering the bays. This paper presents results from an investigation of the physical properties of the surficial aquifer and the processes associated with ground water flow beneath Indian River Bay, Delaware. A key aspect of the project was the deployment of a new technology, streaming horizontal resistivity, to map the subsurface distribution of fresh and saline ground water beneath the bay. The resistivity profiles showed complex patterns of ground water flow, modes of mixing, and submarine ground water discharge. Cores, gamma and electromagnetic-induction logs, and in situ ground water samples collected during a coring operation in Indian River Bay verified the interpretation of the resistivity profiles. The shore-parallel resistivity lines show subsurface zones of fresh ground water alternating with zones dominated by the flow of salt water from the estuary down into the aquifer. Advective flow produces plumes of fresh ground water 400 to 600 m wide and 20 m thick that may extend more than 1 km beneath the estuary. Zones of dispersive mixing between fresh and saline ground water develop on the upper, lower, and lateral boundaries of the plume. The plumes generally underlie small incised valleys that can be traced landward to streams draining the upland. The incised valleys are filled with 1 to 2 m of silt and peat that act as a semiconfining layer to restrict the downward flow of salt water from the estuary.
Active circulation of both the fresh and saline ground water masses beneath the bay is inferred from the geophysical results and supported by geochemical data.
Mikulich-Gilbertson, Susan K; Wagner, Brandie D; Grunwald, Gary K; Riggs, Paula D; Zerbe, Gary O
2018-01-01
Medical research is often designed to investigate changes in a collection of response variables that are measured repeatedly on the same subjects. The multivariate generalized linear mixed model (MGLMM) can be used to evaluate random coefficient associations (e.g. simple correlations, partial regression coefficients) among outcomes that may be non-normal and differently distributed by specifying a multivariate normal distribution for their random effects and then evaluating the latent relationship between them. Empirical Bayes predictors are readily available for each subject from any mixed model and are observable and hence plottable. Here, we evaluate whether second-stage association analyses of empirical Bayes predictors from a MGLMM provide a good approximation and visual representation of these latent association analyses, using medical examples and simulations. Additionally, we compare these results with association analyses of empirical Bayes predictors generated from separate mixed models for each outcome, a procedure that could circumvent computational problems that arise when the dimension of the joint covariance matrix of random effects is large and prohibits estimation of latent associations. As has been shown in other analytic contexts, the p-values for all second-stage coefficients that were determined by naively assuming normality of empirical Bayes predictors provide a good approximation to p-values determined via permutation analysis. Analyzing outcomes that are interrelated with separate models in the first stage and then associating the resulting empirical Bayes predictors in a second stage results in different mean and covariance parameter estimates from the maximum likelihood estimates generated by a MGLMM. The potential for erroneous inference from using results from these separate models increases as the magnitude of the association among the outcomes increases.
Thus if computable, scatterplots of the conditionally independent empirical Bayes predictors from a MGLMM are always preferable to scatterplots of empirical Bayes predictors generated by separate models, unless the true association between outcomes is zero.
NASA Astrophysics Data System (ADS)
Donatelli, Carmine; Ganju, Neil Kamal; Fagherazzi, Sergio; Leonardi, Nicoletta
2018-05-01
Seagrasses are marine flowering plants that strongly impact their physical and biological surroundings and are therefore frequently referred to as ecological engineers. The effect of seagrasses on coastal bay resilience and sediment transport dynamics is understudied. Here we use six historical maps of seagrass distribution in Barnegat Bay, USA, to investigate the role of these vegetated surfaces on the sediment storage capacity of shallow bays. Analyses are carried out by means of the Coupled-Ocean-Atmosphere-Wave-Sediment Transport (COAWST) numerical modeling framework. Results show that a decline in the extent of seagrass meadows reduces the sediment mass potentially stored within bay systems. The presence of seagrass reduces shear stress values across the entire bay, including unvegetated areas, and promotes sediment deposition on tidal flats. On the other hand, the presence of seagrasses decreases suspended sediment concentrations, which in turn reduces the delivery of sediment to marsh platforms. Results highlight the relevance of seagrasses for the long-term survival of coastal ecosystems, and the complex dynamics regulating the interaction between subtidal and intertidal landscapes.
Algorithms for database-dependent search of MS/MS data.
Matthiesen, Rune
2013-01-01
The frequently used bottom-up strategy for identification of proteins and their associated modifications nowadays typically generates thousands of MS/MS spectra that are normally matched automatically against a protein sequence database. Search engines that take as input MS/MS spectra and a protein sequence database are referred to as database-dependent search engines. Many programs, both commercial and freely available, exist for database-dependent search of MS/MS spectra, and most of the programs have excellent user documentation. The aim here is therefore to outline the algorithmic strategy behind different search engines rather than providing software user manuals. The process of database-dependent search can be divided into search strategy, peptide scoring, protein scoring, and finally protein inference. Most efforts in the literature have gone into comparing results from different software rather than discussing the underlying algorithms. Such practical comparisons can be cluttered by suboptimal implementations, and the observed differences are frequently caused by software parameter settings that have not been set properly to allow even comparison. In other words, an algorithmic idea can still be worth considering even if the software implementation has been demonstrated to be suboptimal. The aim in this chapter is therefore to split the algorithms for database-dependent searching of MS/MS data into the above steps so that the different algorithmic ideas become more transparent and comparable. Most search engines provide good implementations of the first three data analysis steps mentioned above, whereas the final step of protein inference is much less developed for most search engines and is in many cases performed by external software. The final part of this chapter illustrates how protein inference is built into the VEMS search engine and discusses a stand-alone program, SIR, for protein inference that can import a Mascot search result.
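A minimal version of the peptide-scoring step common to these search engines is a shared-peak count between observed and theoretical fragment masses. This is an illustrative sketch of the general idea, not Mascot's or VEMS's actual scoring function; the tolerance value and names are assumptions:

```python
def shared_peak_count(observed, theoretical, tol=0.5):
    # Count theoretical fragment masses that match at least one observed
    # peak within the mass tolerance tol (in the same mass units).
    matched = 0
    for t in theoretical:
        if any(abs(o - t) <= tol for o in observed):
            matched += 1
    return matched

# Toy spectrum vs. candidate peptide fragments: two of three fragments match.
score = shared_peak_count([100.0, 200.1, 300.0], [100.2, 250.0, 300.4], tol=0.5)
```

Real search engines build on this core by weighting peak intensities, modeling random-match probabilities, and converting raw counts into statistically calibrated scores.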
Superposition-model analysis of rare-earth doped BaY2F8
NASA Astrophysics Data System (ADS)
Magnani, N.; Amoretti, G.; Baraldi, A.; Capelletti, R.
The energy level schemes of four rare-earth dopants (Ce3+, Nd3+, Dy3+, and Er3+) in BaY2F8, as determined by optical absorption spectra, were fitted with a single-ion Hamiltonian and analysed within Newman's Superposition Model for the crystal field. A unified picture for the four dopants was obtained by assuming a distortion of the F- ligand cage around the RE site; within the framework of the Superposition Model, this distortion is found to have a marked anisotropic behaviour for heavy rare earths, while it turns into an isotropic expansion of the nearest-neighbour polyhedron for light rare earths. It is also inferred that the substituting ion may occupy an off-center position with respect to the original Y3+ site in the crystal.
NASA Astrophysics Data System (ADS)
Laesanpura, Agus; Dahrin, Darharta; Nurseptian, Ivan
2017-04-01
East Flores, part of the Nusa Tenggara island chain, belongs to a volcanic arc zone; active volcanoes surround the study area of about 60 × 50 km. It is located at latitude 8° 30' south and longitude 122° 45' east. Geologically, the rocks are mostly volcanic material dating from the Miocene onward. The intriguing questions are where the volcanic feeder and pyroclastic deposits are located, and how they disappear in the subsurface. Magnetic data were acquired on land at 500 m intervals, and more densely across the bay surrounded by volcanoes. Reduction to the pole was combined with forward modelling to support the interpretation. The two interpreted sections show that a magmatic body may be present at a depth of about 2 to 3 km. The observations show no significant weakening of the magnetic anomaly, even near the active volcano. We suggest that the thermal anomaly disturbs the magnetic data only near the surface, not at depth. Meanwhile, the reduced-to-pole section can distinguish the two groups of rock, assuming the layers are flat. The peak of the magmatic body is inferred near the existing volcano, with active demagnetization suggested by evidence of hot springs and an inferred fault structure.
Model weights and the foundations of multimodel inference
Link, W.A.; Barker, R.J.
2006-01-01
Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of default priors associated with AIC. We note, however, that both procedures are only approximations to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
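The AIC model weights discussed above follow a standard normalized-exponential formula (the same computation applies to BIC values to obtain approximate posterior model probabilities); the function name here is ours:

```python
import math

def akaike_weights(ics):
    # w_i = exp(-0.5 * (IC_i - IC_min)) / sum_j exp(-0.5 * (IC_j - IC_min));
    # subtracting the minimum first keeps the exponentials numerically stable.
    lo = min(ics)
    raw = [math.exp(-0.5 * (ic - lo)) for ic in ics]
    s = sum(raw)
    return [r / s for r in raw]

# Two candidate models differing by 2 AIC units: the better model gets ~73%.
w = akaike_weights([100.0, 102.0])
```

As the abstract notes, treating these weights as posterior model probabilities implicitly fixes a particular prior over models, which is part of what the Bayesian framing makes explicit.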
NASA Astrophysics Data System (ADS)
Dasher, D. H.; Lomax, T. J.; Bethe, A.; Jewett, S.; Hoberg, M.
2016-02-01
A regional probabilistic survey of 20 randomly selected stations, where water and sediments were sampled, was conducted over an area of Simpson Lagoon and Gwydyr Bay in the Beaufort Sea adjacent to Prudhoe Bay, Alaska, in 2014. Sampling parameters included water-column temperature, salinity, dissolved oxygen, chlorophyll a, and nutrients, and sediment macroinvertebrates, chemistry (i.e., trace metals and hydrocarbons), and grain size. The 2014 probabilistic survey design allows inferences to be made about environmental status, for instance the spatial or areal distribution of sediment trace metals within the design area sampled. Historically, since the 1970s, a number of monitoring studies have been conducted in this estuary area using a targeted rather than a regional probabilistic design. Targeted, non-random designs were utilized to assess specific points of interest and cannot be used to make inferences about distributions of environmental parameters. Because of differences in the environmental monitoring objectives of probabilistic and targeted designs, there has been limited assessment of whether combining the two approaches is beneficial. This study evaluates whether a combined approach using the 2014 probabilistic survey sediment trace metal and macroinvertebrate results and historical targeted monitoring data can provide a new perspective on better understanding the environmental status of these estuaries.
1993-06-01
Chronic Sublethal Effects of San Francisco Bay Sediments on Nereis (Neanthes) arenaceodentata; Full Life-Cycle Exposure to Bedded Sediments. Miscellaneous Paper D-93-2 (AD-A268 207), June 1993. US Army Corps of Engineers, Waterways Experiment Station, Long-Term Effects of Dredging Operations Program.
2015-03-01
entrance were evaluated on their ability to reduce potential impacts of waves and currents on wetlands. Study results indicated all three proposed...transport developed were used in the evaluation of proposed solutions. The preliminary modeling results helped to assess general sediment pattern...Corps of Engineers (USACE), Buffalo District, is conducting a study to evaluate shoreline protection measures for coastal wetlands at Braddock Bay
1975-05-01
Paul District, Corps of Engineers St. Paul, Minnesota ENVIRONMENTAL ASSESSMENT REPORT OPERATION AND MAINTENANCE ACTIVITIES GRAND TRAVERSE BAY HARBOR...drift is from north to south. The shoreline north of the harbor is covered with copper mine tailings transported by littoral currents and wave action...from the Gay Mine tailings deposits located about 4 miles north of the harbor. 1.830 Dredge Material Disposal - The U.S. Environmental Protection
The Baltimore Engineers and the Chesapeake Bay, 1961-1987
1988-01-01
supply and drought management study that will identify those measures required to optimize the use of existing water supplies in the Bay drainage ... drainage area of the Chesapeake, the Susquehanna accounts for 43% and the Potomac for 22% of this land area. The total average inflow of fresh water to...right) water supply and, in some areas, abatement of acid mine drainage . not allowed the Susquehanna to escape from serious water supply
A formal model of interpersonal inference
Moutoussis, Michael; Trujillo-Barreto, Nelson J.; El-Deredy, Wael; Dolan, Raymond J.; Friston, Karl J.
2014-01-01
Introduction: We propose that active Bayesian inference—a general framework for decision-making—can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regard to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: (1) Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to “mentalizing” in the psychological literature, is based upon the outcomes of interpersonal exchanges. (2) We show how some well-known social-psychological phenomena (e.g., self-serving biases) can be explained in terms of active interpersonal inference. (3) Mentalizing naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes-optimal framework for modeling intersubject variability in mentalizing during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalizing is distorted. PMID:24723872
Ling, Cheng; Hamada, Tsuyoshi; Gao, Jingyang; Zhao, Guoguang; Sun, Donghong; Shi, Weifeng
2016-01-01
MrBayes is a widespread phylogenetic inference tool harnessing empirical evolutionary models and Bayesian statistics. However, the computational cost of the likelihood estimation is very expensive, resulting in undesirably long execution times. Although a number of multi-threaded optimizations have been proposed to speed up MrBayes, there are bottlenecks that severely limit the GPU thread-level parallelism of likelihood estimations. This study proposes a high-performance and resource-efficient method for GPU-oriented parallelization of likelihood estimations. Instead of having to rely on empirical programming, the proposed novel decomposition storage model implements high-performance data transfers implicitly. In terms of performance improvement, a speedup factor of up to 178 can be achieved on the analysis of simulated datasets by four Tesla K40 cards. In comparison to the other publicly available GPU-oriented MrBayes variants, the tgMC3++ method (proposed herein) outperforms the tgMC3 (v1.0), nMC3 (v2.1.1) and oMC3 (v1.00) methods by speedup factors of up to 1.6, 1.9 and 2.9, respectively. Moreover, tgMC3++ supports more evolutionary models and gamma categories, which previous GPU-oriented methods fail to include in their analyses.
The blue mud shrimp, Upogebia pugettensis, the bay ghost shrimp, Neotrypaea californiensis, and eelgrass, Zostera marina are endemic ecosystem engineers that define the ecological structure and function of estuaries along the Pacific coast of the US as significantly as do marshes...
STORAGE AND RECIEVING, TRA-662. ELEVATIONS. LOW-BAY SECTION ON SOUTH SIDE ...
STORAGE AND RECIEVING, TRA-662. ELEVATIONS. LOW-BAY SECTION ON SOUTH SIDE WAS FLAMMABLE STORAGE AREA. HUMMEL HUMMEL & JONES 1038-MTR-ETR-662-A-3, 6/1960. INL INDEX NO. 532-0653-00-381-102036, REV. 3. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Mohammad-Djafari, Ali
2015-01-01
The main object of this tutorial article is first to review the main inference tools of the Bayesian approach, entropy, information theory and their corresponding geometries. The review focuses mainly on the ways these tools have been used in data, signal and image processing. After a short introduction to the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing, such as entropy in source separation, Fisher information in model order selection, different maximum-entropy-based methods in time-series spectral estimation and, finally, general linear inverse problems.
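The quantities this tutorial reviews, Shannon entropy and the Kullback-Leibler divergence, can be computed directly from their textbook definitions. The following minimal sketch (not from the article) handles discrete distributions given as probability lists:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i ln p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i ln(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]           # fair coin
q = [0.9, 0.1]           # biased coin
h = entropy(p)           # ln 2 for a fair coin
d = kl_divergence(p, q)  # nonnegative; zero only when p == q
```

The `if pi > 0` guards implement the usual convention 0 · ln 0 = 0.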
Video File - RS-25 Engine Test 2017-08-30
2017-08-30
NASA engineers closed a summer of hot fire testing Aug. 30 for flight controllers on RS-25 engines that will help power the new Space Launch System (SLS) rocket being built to carry astronauts to deep-space destinations, including Mars. The 500-second hot fire of an RS-25 engine flight controller unit on the A-1 Test Stand at Stennis Space Center near Bay St. Louis, Mississippi, marked another step toward the nation's return to human deep-space exploration missions.
Deep Borehole Instrumentation Along San Francisco Bay Bridges - 2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchings, L.; Kasameyer, P.; Long, L.
2001-05-01
This is a progress report on the Bay Bridges downhole network. Between 2 and 8 instruments have been spaced along the Dumbarton, San Mateo, Bay, and San Rafael bridges in San Francisco Bay, California. The instruments will provide multiple-use data that are important to geotechnical, structural engineering, and seismological studies. The holes are between 100 and 1000 ft deep and were drilled by Caltrans. There are twenty-one sensor packages at fifteen sites. The downhole instrument package contains a three-component HS-1 seismometer and three orthogonal Wilcox 731 accelerometers, and is capable of recording from a micro-g for local M = 1.0 earthquakes to 0.5 g strong ground motion from large Bay Area earthquakes. This report lists earthquakes and stations where recordings were obtained during the period February 29, 2000 to November 11, 2000. Also, preliminary results on noise analysis for up- and downhole recordings at Yerba Buena Island are presented.
Baldwin, W.E.; Morton, R.A.; Putney, T.R.; Katuna, M.P.; Harris, M.S.; Gayes, P.T.; Driscoll, N.W.; Denny, J.F.; Schwab, W.C.
2006-01-01
Several generations of the ancestral Pee Dee River system have been mapped beneath the South Carolina Grand Strand coastline and adjacent Long Bay inner shelf. Deep boreholes onshore and high-resolution seismic-reflection data offshore allow for reconstruction of these paleochannels, which formed during glacial lowstands, when the Pee Dee River system incised subaerially exposed coastal-plain and continental-shelf strata. Paleochannel groups, representing different generations of the system, decrease in age to the southwest, where the modern Pee Dee River merges with several coastal-plain tributaries at Winyah Bay, the southern terminus of Long Bay. Positions of the successive generational groups record a regional, southwestward migration of the river system that may have initiated during the late Pliocene. The migration was primarily driven by barrier-island deposition, resulting from the interaction of fluvial and shoreline processes during eustatic highstands. Structurally driven, subsurface paleotopography associated with the Mid-Carolina Platform High has also indirectly assisted in forcing this migration. These results provide a better understanding of the evolution of the region and help explain the lack of mobile sediment on the Long Bay inner shelf. Migration of the river system caused a profound change in sediment supply during the late Pleistocene. The abundant fluvial source that once fed sand-rich barrier islands was cut off and replaced with a limited source, supplied by erosion and reworking of former coastal deposits exposed at the shore and on the inner shelf.
Single board system for fuzzy inference
NASA Technical Reports Server (NTRS)
Symon, James R.; Watanabe, Hiroyuki
1991-01-01
The very large scale integration (VLSI) implementation of a fuzzy logic inference mechanism allows the use of rule-based control and decision making in demanding real-time applications. Researchers designed a full custom VLSI inference engine. The chip was fabricated using CMOS technology. The chip consists of 688,000 transistors of which 476,000 are used for RAM memory. The fuzzy logic inference engine board system incorporates the custom designed integrated circuit into a standard VMEbus environment. The Fuzzy Logic system uses Transistor-Transistor Logic (TTL) parts to provide the interface between the Fuzzy chip and a standard, double height VMEbus backplane, allowing the chip to perform application process control through the VMEbus host. High level C language functions hide details of the hardware system interface from the applications level programmer. The first version of the board was installed on a robot at Oak Ridge National Laboratory in January of 1990.
2D soil and engineering-seismic bedrock modeling of eastern part of Izmir inner bay/Turkey
NASA Astrophysics Data System (ADS)
Pamuk, Eren; Akgün, Mustafa; Özdağ, Özkan Cevdet; Gönenç, Tolga
2017-02-01
Soil-bedrock models serve as a basis for defining the coupled behaviour of earthquakes and soil. The medium referred to as bedrock is further classified into engineering and seismic bedrock on the basis of S-wave velocity (Vs): media with Vs < 760 m/s are classed as soil, those with Vs between 760 m/s and 3000 m/s as engineering bedrock, and those with Vs > 3000 m/s as seismic bedrock. The lateral topography of the interface between engineering and seismic bedrock influences how earthquake effects vary at the soil surface; therefore, 2D soil-bedrock models are needed to estimate the earthquake effects that could occur there. In this study, surface wave methods and the microgravity method were used to construct 2D soil-bedrock models for the eastern part of İzmir bay. In the first stage, velocity values were obtained from the surface wave surveys; density values were then calculated from these velocities with the help of empirical relations. 2D soil-bedrock models were constructed from both Vs and density variations by using these densities in the microgravity model. The models indicate that the soil is 300-400 m thick and composed of more than one layer, particularly in the parts closer to the bay, and that the soil thickness changes in the N-S direction. Geologically, the engineering bedrock in the study area is thought to consist of the Bornova melange and the seismic bedrock of the Menderes massif; the geophysical results also show that Neogene limestone and andesite units at depths between 200 and 400 m exhibit engineering bedrock characteristics.
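The velocity thresholds quoted in the abstract translate directly into a classification rule. A minimal sketch, using only the Vs cutoffs stated in the text:

```python
def classify_by_vs(vs_m_per_s):
    """Classify a medium by shear-wave velocity (Vs), using the thresholds
    from the abstract: soil below 760 m/s, engineering bedrock between
    760 and 3000 m/s, seismic bedrock above 3000 m/s."""
    if vs_m_per_s < 760:
        return "soil"
    elif vs_m_per_s < 3000:
        return "engineering bedrock"
    return "seismic bedrock"
```

In practice such a rule is applied point by point to a Vs profile to pick the depths of the two bedrock interfaces.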
Inference on the Strength of Balancing Selection for Epistatically Interacting Loci
Buzbas, Erkan Ozge; Joyce, Paul; Rosenberg, Noah A.
2011-01-01
Existing inference methods for estimating the strength of balancing selection in multi-locus genotypes rely on the assumption that there are no epistatic interactions between loci. Complex systems in which balancing selection is prevalent, such as sets of human immune system genes, are known to contain components that interact epistatically. Therefore, current methods may not produce reliable inference on the strength of selection at these loci. In this paper, we address this problem by presenting statistical methods that can account for epistatic interactions in making inference about balancing selection. A theoretical result due to Fearnhead (2006) is used to build a multi-locus Wright-Fisher model of balancing selection, allowing for epistatic interactions among loci. Antagonistic and synergistic types of interactions are examined. The joint posterior distribution of the selection and mutation parameters is sampled by Markov chain Monte Carlo methods, and the plausibility of models is assessed via Bayes factors. As a component of the inference process, an algorithm to generate multi-locus allele frequencies under balancing selection models with epistasis is also presented. Recent evidence on interactions among a set of human immune system genes is introduced as a motivating biological system for the epistatic model, and data on these genes are used to demonstrate the methods. PMID:21277883
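The abstract assesses model plausibility via Bayes factors. As an illustration of the idea only (not the authors' multi-locus MCMC computation), a Bayes factor can be evaluated in closed form for a simple binomial example comparing a point null against a uniform prior on the success probability:

```python
from math import comb

def bayes_factor_point_vs_uniform(k, n):
    """Bayes factor BF01 for k successes in n binomial trials,
    comparing H0: p = 0.5 against H1: p ~ Uniform(0, 1).
    Under H1 the marginal likelihood integrates to 1 / (n + 1)."""
    m0 = comb(n, k) * 0.5 ** n  # marginal likelihood under H0
    m1 = 1.0 / (n + 1)          # marginal likelihood under H1
    return m0 / m1

bf = bayes_factor_point_vs_uniform(10, 20)  # data split 50/50 favor H0
```

Values of BF01 above 1 favor the null; the same ratio-of-marginal-likelihoods logic underlies the Bayes factors used in the paper, only with marginal likelihoods estimated by Markov chain Monte Carlo rather than in closed form.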
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Jason E.; Costa, Michelle N.; Stevens, S.L.
A difficult problem that is currently growing rapidly, due to the sharp increase in the amount of high-throughput data available for many systems, is that of determining useful and informative causative influence networks. These networks can be used to predict behavior given observation of a small number of components, predict behavior at a future time point, or identify components that are critical to the functioning of the system under particular conditions. In these endeavors, incorporating observations of systems from a wide variety of viewpoints can be particularly beneficial, but this has often been undertaken with the objective of inferring networks that are generally applicable. The focus of the current work is to integrate both general observations and measurements taken for a particular pathology, that of ischemic stroke, to provide improved ability to produce useful predictions of system behavior. A number of hybrid approaches have recently been proposed for network generation in which the Gene Ontology is used to filter or enrich network links inferred from gene expression data through reverse engineering methods. These approaches have been shown to improve the biological plausibility of the inferred relationships, but still treat knowledge-based and machine-learning inferences as incommensurable inputs. In this paper, we explore how further improvements may be achieved through a full integration of network inference insights achieved through application of the Gene Ontology and reverse engineering methods, with specific reference to the construction of dynamic models of transcriptional regulatory networks.
We show that integrating approaches to network construction, one based on reverse-engineering from conditional transcriptional data, one based on reverse-engineering from in situ hybridization data, and another based on functional associations derived from the Gene Ontology, using probabilities can improve the results of clustering as evaluated by a predictive model of transcriptional expression levels.
Krypotos, Angelos-Miltiadis; Klugkist, Irene; Engelhard, Iris M.
2017-01-01
Threat conditioning procedures have allowed the experimental investigation of the pathogenesis of Post-Traumatic Stress Disorder. The findings of these procedures have also provided stable foundations for the development of relevant intervention programs (e.g. exposure therapy). Statistical inference of threat conditioning procedures is commonly based on p-values and Null Hypothesis Significance Testing (NHST). Nowadays, however, there is a growing concern about this statistical approach, as many scientists point to the various limitations of p-values and NHST. As an alternative, the use of Bayes factors and Bayesian hypothesis testing has been suggested. In this article, we apply this statistical approach to threat conditioning data. In order to enable the easy computation of Bayes factors for threat conditioning data we present a new R package named condir, which can be used either via the R console or via a Shiny application. This article provides both a non-technical introduction to Bayesian analysis for researchers using the threat conditioning paradigm, and the necessary tools for computing Bayes factors easily. PMID:29038683
Oxygenation variability in Mejillones Bay, off northern Chile, during the last two centuries
NASA Astrophysics Data System (ADS)
Díaz-Ochoa, J. A.; Pantoja, S.; de Lange, G. J.; Lange, C. B.; Sánchez, G. E.; Acuña, V. R.; Muñoz, P.; Vargas, G.
2011-01-01
The Peru-Chile Current ecosystem is characterized by high biological productivity and important fisheries. Although this system is likely to be severely affected by climate change, its response to current global warming is still uncertain. In this paper, we analyze 10- to 166-year-old sediments in two cores collected from Mejillones Bay, an anoxic sedimentary setting favorable for the preservation of proxies. Based on a 166-year chronology, we used proxies of bottom-water oxygenation (Mo, V, S, and the (lycopane + n-C35)/n-C31 ratio) and surface water productivity (biogenic opal, counts of diatom valves, biogenic Ba, organic carbon, and chlorins) to reconstruct environmental variations in Mejillones Bay. During the last two centuries, a shift took place in the coastal marine ecosystem of Bahia Mejillones at decadal scales. This shift was characterized by intense ENSO-like activity, large-scale fluctuations in biological export productivity and bottom water oxygenation, and increased eolian activity (inferred from Ti/Al and Zr/Al). This short-term variability was accompanied by a gradual increase of sulfidic conditions that has intensified since the early 1960s.
Huang, Tao; Yang, Lianjiao; Chu, Zhuding; Sun, Liguang; Yin, Xijie
2016-09-15
Emperor penguins (Aptenodytes forsteri) are sensitive to the Antarctic climate change because they breed on the fast sea ice. Studies of paleohistory for the emperor penguin are rare, due to the lack of archives on land. In this study, we obtained an emperor penguin ornithogenic sediment profile (PI) and performed geochronological, geochemical and stable isotope analyses on the sediments and feather remains. Two radiocarbon dates of penguin feathers in PI indicate that emperor penguins colonized Amanda Bay as early as CE 1540. By using the bio-elements (P, Se, Hg, Zn and Cd) in sediments and stable isotope values (δ(15)N and δ(13)C) in feathers, we inferred relative population size and dietary change of emperor penguins during the period of CE 1540-2008, respectively. An increase in population size with depleted N isotope ratios for emperor penguins on N island at Amanda Bay during the Little Ice Age (CE 1540-1866) was observed, suggesting that cold climate affected the penguin's breeding habitat, prey availability and thus their population and dietary composition. Copyright © 2016 Elsevier B.V. All rights reserved.
Finite-frequency traveltime tomography of San Francisco Bay region crustal velocity structure
Pollitz, F.F.
2007-01-01
Seismic velocity structure of the San Francisco Bay region crust is derived using measurements of finite-frequency traveltimes. A total of 57,801 relative traveltimes are measured by cross-correlation over the frequency range 0.5-1.5 Hz. From these are derived 4862 'summary' traveltimes, which are used to derive 3-D P-wave velocity structure over a 341 × 140 km² area from the surface to 25 km depth. The seismic tomography is based on sensitivity kernels calculated on a spherically symmetric reference model. Robust elements of the derived P-wave velocity structure are: a pronounced velocity contrast across the San Andreas fault in the south Bay region (west side faster); a moderate velocity contrast across the Hayward fault (west side faster); moderately low velocity crust around the Quien Sabe volcanic field and the Sacramento River delta; very low velocity crust around Lake Berryessa. These features are generally explicable with surface rock types being extrapolated to depths of ~10 km in the upper crust. Generally high mid-lower crust velocity and high inferred Poisson's ratio suggest a mafic lower crust. © Journal compilation © 2007 RAS.
Dupuy, Alton J.; Couvillion, Nolan P.
1979-01-01
From March 1977 to July 1978 the U.S. Geological Survey in cooperation with the U.S. Army Corps of Engineers conducted a series of elutriate studies to determine water quality in selected reaches of major navigable waterways of southern Louisiana. Samples were collected from the Mississippi River-Gulf Outlet areas; Mississippi River, South Pass; Baptiste Collette Bayou; Tiger Pass area; Bayou Long; Bayou Barataria and Barataria Bay Waterway area (gulf section); Bayou Segnette Waterway; Lake Pontchartrain near Tangipahoa River mouth; Bayou Grand Caillou; Bayou la Carpe at Houma; Houma Navigation Canal and Terrebonne Bay; Bayou Boeuf, Bayou Chene, and Bayou Black; Atchafalaya River Channel; Atchafalaya Bay; Old River Lock tailbay; Red River below mouth of Black River; Freshwater Canal; Mermentau River and Lake Arthur Mermentau River outlet; and Calcasieu Ship Channel. The studies were initiated at the request of the U.S. Army Corps of Engineers to evaluate possible environmental effects of proposed dredging activities in those waterways. The U.S. Army Corps of Engineers and U.S. Geological Survey collected 189 samples of native water and 172 samples of bottom (bed) material from 163 different sites. A total of 117 elutriates (mixtures of native water and bottom material) were prepared. The native water and elutriate samples were analyzed for selected metals, pesticides, nutrients, organics, and physical constituents. Particle-size determinations were made on bottom-material samples. (Kosco-USGS)
Fuzzy inference system for identification of geological stratigraphy off Prydz Bay, East Antarctica
NASA Astrophysics Data System (ADS)
Singh, Upendra K.
2011-12-01
The analysis of well-logging data plays a key role in the exploration and development of hydrocarbon reservoirs. Well-log parameters such as porosity, gamma ray, density, transit time and resistivity help classify strata and estimate the physical, electrical and acoustical properties of the subsurface lithology. Strong, conspicuous changes in some log parameters associated with a particular stratigraphic formation reflect its composition and physical properties and thus aid classification. However, some substrata show only moderate values in the respective log parameters, making the kind of strata difficult to identify from the standard variability ranges of the log parameters and visual inspection alone; the complexity increases further as more sensors are involved. An attempt is made to identify stratigraphy from well logs over the Prydz Bay basin, East Antarctica, using a fuzzy inference system. A model is built from a few data sets of known stratigraphy, and the model is then used to infer the lithology of a borehole from geophysical logs not used in the simulation. The fuzzy-based algorithm is trained, validated and tested on well-log data and finally identifies the formation lithology of a hydrocarbon reservoir system in the study area. The effectiveness of this technique is demonstrated by comparison of the results with actual lithologs and coring data of ODP Leg 188. The training performance equals 82.95%, while the prediction ability is 87.69%. The fuzzy results are very encouraging, and the model is able to decipher even thin seams and other strata from geophysical logs. The result identifies a significant sand formation in the depth range 316.0-341.0 m, where core recovery is incomplete.
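A fuzzy inference system of the kind described assigns each log reading a degree of membership in overlapping lithology classes instead of a hard cutoff. The sketch below is purely illustrative: the triangular membership function is a standard building block, but the gamma-ray fuzzy sets and their breakpoints are hypothetical, not the ones fitted in the study.

```python
def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at the peak b,
    falling back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy sets on a gamma-ray log (API units):
# "sand" peaks at low readings, "shale" at high readings.
def classify_gamma(gr):
    mu_sand = triangular_membership(gr, 0, 40, 80)
    mu_shale = triangular_membership(gr, 60, 120, 180)
    return "sand" if mu_sand >= mu_shale else "shale"
```

A full system would combine memberships from several logs (porosity, density, resistivity, ...) through fuzzy rules before defuzzifying to a lithology label.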
Gao, Xuelu; Zhuang, Wen; Chen, Chen-Tung Arthur; Zhang, Yong
2015-01-01
Historically, the Bohai Sea is one of the most important fishing grounds in China. Yet, surrounded by one of the biggest economic rims of China, its ecological functions have been declining rapidly over the past two decades under heavy anthropogenic impacts. The Laizhou Bay is the smallest of the three main bays in the Bohai Sea. Owing to its rich brine deposits, chemical industries using brine as a raw material are booming on the southern coast of the Laizhou Bay, on a scale that ranks as the largest in China. In order to monitor and assess the environmental quality, surface sediments were collected from the coastal waters of southwestern Laizhou Bay and the rivers it connects with during summer and autumn in 2012, and analyzed for heavy metals. Several widely adopted methods were used in the overall assessment of heavy metal pollution status and potential ecological risks in these sediments, and the data were analyzed to infer the main sources of the pollutants. The results showed that remarkably high concentrations of heavy metals were almost all recorded at a small number of riverine sites. Cr, Cu, Ni and Zn were the main environmental threat according to the sediment quality guidelines. The marine area was generally in good condition, with no or low risk from the studied metals, and adverse effects on biota could hardly occur. Natural sources dominated the concentrations and distributions of Cu, Ni, Pb and Zn in the marine area. Our results indicated that heavy metal pollution was not a main cause of the ecological degradation of the Laizhou Bay at present. PMID:25816338
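The abstract refers to "several widely adopted methods" for assessing heavy-metal pollution without naming them. One commonly used example is the Müller geoaccumulation index; the sketch below uses hypothetical concentration and background values, not data from the study.

```python
import math

def geoaccumulation_index(concentration, background):
    """Geoaccumulation index Igeo = log2(Cn / (1.5 * Bn)), where Cn is the
    measured metal concentration and Bn the geochemical background.
    The factor 1.5 allows for natural variability of the background."""
    return math.log2(concentration / (1.5 * background))

# Hypothetical values (mg/kg): Igeo <= 0 is conventionally "unpolluted".
igeo = geoaccumulation_index(90.0, 30.0)
```

Igeo values are typically binned into classes (e.g. 0-1 unpolluted to moderately polluted, and so on up to > 5 extremely polluted) to summarize sediment quality per metal and site.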
NASA Astrophysics Data System (ADS)
Peck, Victoria L.; Allen, Claire S.; Kender, Sev; McClymont, Erin L.; Hodgson, Dominic A.
2015-07-01
Recent intensification of wind-driven upwelling of warm upper circumpolar deep water (UCDW) has been linked to accelerated melting of West Antarctic ice shelves and glaciers. To better assess the long-term relationship between UCDW upwelling and the stability of the West Antarctic Ice Sheet, we present a multi-proxy reconstruction of surface and bottom water conditions in Marguerite Bay, West Antarctic Peninsula (WAP), through the Holocene. A combination of sedimentological, diatom and foraminiferal records are, for the first time, presented together to infer a decline in UCDW influence within Marguerite Bay through the early to mid Holocene and the dominance of cyclic forcing in the late Holocene. Extensive glacial melt, limited sea ice and enhanced primary productivity between 9.7 and 7.0 ka BP are considered to be most consistent with persistent incursions of UCDW through Marguerite Trough. From 7.0 ka BP sea ice seasons increased and productivity decreased, suggesting that UCDW influence within Marguerite Bay waned, coincident with the equatorward migration of the Southern Hemisphere Westerly Winds (SWW). UCDW influence continued through the mid Holocene, and by 4.2 ka BP lengthy sea ice seasons persisted within Marguerite Bay. Intermittent melting and reforming of this sea ice within the late Holocene may be indicative of episodic incursions of UCDW into Marguerite Bay during this period. The cyclical changes in the oceanography within Marguerite Bay during the late Holocene are consistent with enhanced sensitivity to ENSO forcing, as opposed to the SWW forcing that appears to have dominated the early to mid Holocene. Current measurements of the oceanography of the WAP continental shelf suggest that the system has now returned to the early Holocene-like oceanographic configuration reported here, which in both cases has been associated with rapid deglaciation.
Wood, Dustin A.; Bui, Thuy-Vy D.; Overton, Cory T.; Vandergast, Amy; Casazza, Michael L.; Hull, Joshua M.; Takekawa, John Y.
2016-01-01
Fragmentation and loss of natural habitat have important consequences for wild populations and can negatively affect long-term viability and resilience to environmental change. Salt marsh obligate species, such as those that occupy the San Francisco Bay Estuary in western North America, occupy already impaired habitats as a result of human development and modifications and are highly susceptible to increased habitat loss and fragmentation due to global climate change. We examined the genetic variation of the California Ridgway’s rail (Rallus obsoletus obsoletus), a state and federally endangered species that occurs within the fragmented salt marsh of the San Francisco Bay Estuary. We genotyped 107 rails across 11 microsatellite loci and a single mitochondrial gene to estimate genetic diversity and population structure among seven salt marsh fragments and assessed demographic connectivity by inferring patterns of gene flow and migration rates. We found pronounced genetic structuring among four geographically separate genetic clusters across the San Francisco Bay. Gene flow analyses supported a stepping stone model of gene flow from south-to-north. However, contemporary gene flow among the regional embayments was low. Genetic diversity among occupied salt marshes and genetic clusters was not significantly different. We detected low effective population sizes and significantly high relatedness among individuals within salt marshes. Preserving genetic diversity and connectivity throughout the San Francisco Bay may require attention to salt marsh restoration in the Central Bay where habitat is both most limited and most fragmented. Incorporating periodic genetic sampling into the management regime may help evaluate population trends and guide long-term management priorities.
Gao, Xuelu; Zhuang, Wen; Chen, Chen-Tung Arthur; Zhang, Yong
2015-01-01
The Bohai Sea has historically been one of the most important fishing grounds in China. Yet, surrounded by one of the biggest economic rims of China, its ecological functions have declined rapidly over the past two decades under heavy anthropogenic impacts. Laizhou Bay is the smallest of the three main bays in the Bohai Sea. Owing to rich brine deposits, chemical industries using brine as a raw material are booming along the southern coast of Laizhou Bay, on a scale that is the largest in China. In order to monitor and assess environmental quality, surface sediments were collected from the coastal waters of southwestern Laizhou Bay and the rivers it connects with during summer and autumn of 2012, and analyzed for heavy metals. Several widely adopted methods were used in the overall assessment of heavy metal pollution status and potential ecological risks in these sediments, and the data were analyzed to infer the main sources of the pollutants. The results showed that remarkably high concentrations of heavy metals were almost all recorded at a small number of riverine sites. Cr, Cu, Ni, and Zn were the main environmental threat according to the sediment quality guidelines. The marine area was generally in good condition, with no or low risk from the studied metals, and adverse effects on biota were unlikely. Natural sources dominated the concentrations and distributions of Cu, Ni, Pb, and Zn in the marine area. Our results indicated that heavy metal pollution was not a main cause of the ecological degradation of Laizhou Bay at present.
A&M. TAN607. Sections for second phase expansion: engine maintenance, machine, ...
A&M. TAN-607. Sections for second phase expansion: engine maintenance, machine, and welding shops; high bay assembly shop, chemical cleaning room (decontamination). Details of sliding door hoods. Approved by INEEL Classification Office for public release. Ralph M. Parsons 1299-5-ANP/GE-3-607-A 109. Date: August 1956. INEEL index code no. 034-0607-00-693-107169 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID
13. Detail of double-door opening between machine shop section of ...
13. Detail of double-door opening between machine shop section of roundhouse and engine house section of roundhouse. Engine house visible through rectangular opening. At one time the opening extended to the intrados of the arch above the doorway, allowing railroad engines to fit inside the machine shop. View to east. - Duluth & Iron Range Rail Road Company Shops, Roundhouse, Southwest of downtown Two Harbors, northwest of Agate Bay, Two Harbors, Lake County, MN
Bayes and blickets: Effects of knowledge on causal induction in children and adults
Griffiths, Thomas L.; Sobel, David M.; Tenenbaum, Joshua B.; Gopnik, Alison
2011-01-01
People are adept at inferring novel causal relations, even from only a few observations. Prior knowledge about the probability of encountering causal relations of various types and the nature of the mechanisms relating causes and effects plays a crucial role in these inferences. We test a formal account of how this knowledge can be used and acquired, based on analyzing causal induction as Bayesian inference. Five studies explored the predictions of this account with adults and 4-year-olds, using tasks in which participants learned about the causal properties of a set of objects. The studies varied the two factors that our Bayesian approach predicted should be relevant to causal induction: the prior probability with which causal relations exist, and the assumption of a deterministic or a probabilistic relation between cause and effect. Adults’ judgments (Experiments 1, 2, and 4) were in close correspondence with the quantitative predictions of the model, and children’s judgments (Experiments 3 and 5) agreed qualitatively with this account. PMID:21972897
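The core computation of such a Bayesian account can be sketched in a few lines of Python (a hedged illustration, not the authors' code: the two-object setup, the blicket prior, and the noisy-OR activation rule are assumptions chosen to mirror the two factors the abstract names):

```python
from itertools import product

def posterior_blickets(trials, p_blicket=0.3, transmission=1.0):
    """Posterior over which of objects 'A' and 'B' are blickets.

    trials: list of (objects_placed, activated) pairs, e.g. (('A',), True).
    p_blicket: prior probability that any given object is a blicket.
    transmission: probability a blicket activates the detector
                  (1.0 = deterministic relation, <1.0 = probabilistic).
    """
    names = ['A', 'B']
    post = {}
    for h in product([False, True], repeat=len(names)):
        blickets = {n for n, is_b in zip(names, h) if is_b}
        # Prior: each object is independently a blicket with prob p_blicket.
        p = 1.0
        for is_b in h:
            p *= p_blicket if is_b else (1 - p_blicket)
        # Likelihood: noisy-OR -- the detector fires with prob 1-(1-t)^k
        # when k blickets are placed on it.
        for placed, activated in trials:
            k = sum(1 for obj in placed if obj in blickets)
            p_act = 1 - (1 - transmission) ** k
            p *= p_act if activated else (1 - p_act)
        post[tuple(sorted(blickets))] = p
    z = sum(post.values())
    return {h: p / z for h, p in post.items()}

# A and B together activate the detector; B alone does not.
# Under a deterministic detector, only "A is a blicket" survives.
result = posterior_blickets([(('A', 'B'), True), (('B',), False)])
```

Lowering `transmission` below 1.0 spreads posterior mass across hypotheses, which is the qualitative shift between the deterministic and probabilistic conditions the studies manipulate.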
Data analysis using scale-space filtering and Bayesian probabilistic reasoning
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak; Kutulakos, Kiriakos; Robinson, Peter
1991-01-01
This paper describes a program for analysis of output curves from a Differential Thermal Analyzer (DTA). The program first extracts probabilistic qualitative features from a DTA curve of a soil sample, and then uses Bayesian probabilistic reasoning to infer the minerals in the soil. The qualifier module employs a simple and efficient extension of scale-space filtering suitable for handling DTA data. We have observed that points can vanish from contours in the scale-space image when filtering operations are not highly accurate. To handle the problem of vanishing points, perceptual organization heuristics are used to group the points into lines. Next, these lines are grouped into contours by using additional heuristics. Probabilities are associated with these contours using domain-specific correlations. A Bayes tree classifier processes probabilistic features to infer the presence of different minerals in the soil. Experiments show that the algorithm that uses domain-specific correlation to infer qualitative features outperforms a domain-independent algorithm that does not.
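The final classification step, combining probabilistic features into a posterior over minerals, amounts to a Bayesian update; a minimal naive-Bayes sketch with made-up mineral names, feature names, and probabilities (the actual program derives its features from scale-space contours):

```python
def bayes_classify(priors, feature_probs, observed):
    """Naive-Bayes posterior over minerals from boolean qualitative features.

    priors: dict mineral -> prior probability.
    feature_probs: dict mineral -> {feature: P(feature present | mineral)}.
    observed: dict feature -> bool (was the feature extracted from the curve?).
    """
    post = {}
    for mineral, prior in priors.items():
        p = prior
        for feat, present in observed.items():
            q = feature_probs[mineral][feat]
            p *= q if present else (1 - q)
        post[mineral] = p
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

# Hypothetical numbers: an endothermic peak near 580 C is common for
# mineral_1 but rare for mineral_2.
priors = {'mineral_1': 0.5, 'mineral_2': 0.5}
feature_probs = {'mineral_1': {'peak_near_580C': 0.9},
                 'mineral_2': {'peak_near_580C': 0.1}}
result = bayes_classify(priors, feature_probs, {'peak_near_580C': True})
```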
Campbell, Kieran R; Yau, Christopher
2017-03-15
Modeling bifurcations in single-cell transcriptomics data has become an increasingly popular field of research. Several methods have been proposed to infer bifurcation structure from such data, but all rely on heuristic non-probabilistic inference. Here we propose the first generative, fully probabilistic model for such inference, based on a Bayesian hierarchical mixture of factor analyzers. Our model exhibits competitive performance on large datasets despite implementing full Markov chain Monte Carlo sampling, and its unique hierarchical prior structure enables automatic determination of genes driving the bifurcation process. We additionally propose an Empirical Bayes-like extension that deals with the high levels of zero-inflation in single-cell RNA-seq data and quantify when such models are useful. We apply our model to both real and simulated single-cell gene expression data and compare the results to existing pseudotime methods. Finally, we discuss both the merits and weaknesses of such a unified, probabilistic approach in the context of practical bioinformatics analyses.
91. ARAIII. GCRE reactor building (ARA608) at 48 percent completion. ...
91. ARA-III. GCRE reactor building (ARA-608) at 48 percent completion. Camera faces west end of building; shows roll-up door. High-bay section at right of view. Petro-chem heater stack extends above roof of low-bay section at left. Excavation for 13.8 kV electrical conduit in foreground. January 20, 1959. INEEL photo no. 59-322. Photographer: Jack L. Anderson. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID
1993-09-01
...developing oocytes. In effect, gametogenesis and subsequent reproductive success are largely dependent on the energy and metabolic substrates assimilated... Miscellaneous Paper D-93-4, September 1993, US Army Corps of Engineers, Waterways Experiment Station (AD-A269 901): Long-Term Effects of Dredging Operations Program. Chronic Sublethal Effects of San Francisco Bay Sediments on Nereis (Neanthes) arenaceodentata; Effect of Food Ration on Sediment...
Wu, Z. Y.; Saito, Yoshiki; Zhao, D. N.; Zhou, J. Q.; Cao, Z. Y.; Li, S. J.; Shang, J. H.; Liang, Y. Y.
2016-01-01
Estuaries have been sites of intensive human activities during the past century. Tracing the evolution of subaqueous topography in estuaries on a decadal timescale enables us to understand the effects of human activities on estuaries. Bathymetric data from 1955 to 2010 show that land reclamation decreased the subaqueous area of Lingding Bay, in the Pearl River estuary, by ~170 km² and decreased its water volume by 615 × 10⁶ m³, representing a net decrease of 11.2 × 10⁶ m³ per year and indicating the deposition of approximately 14.5 Mt/yr of sediment in Lingding Bay during that period. Whereas Lingding Bay was mainly governed by natural processes with slight net deposition before 1980, subsequent dredging and large port engineering projects changed the subaqueous topography of the bay by shallowing its shoals and deepening its troughs. Between 2012 and 2013, continuous dredging and a surge of sand excavation resulted in local changes in water depth of ±5 m/yr, far exceeding the magnitude of natural topographic evolution in Lingding Bay. Reclamation, dredging, and navigation-channel projects removed 8.4 Mt/yr of sediment from Lingding Bay, representing 29% of the sediment input to the bay, and these activities have increased recently. PMID:27886227
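The per-year figures in the abstract are mutually consistent, as a short arithmetic check shows (a sketch assuming the full 1955-2010 interval; the implied bulk density is our back-calculation, not a number reported by the study):

```python
# Reported: net water-volume decrease of 615 million m^3 between 1955 and 2010.
volume_loss_m3 = 615e6
years = 2010 - 1955  # 55-year bathymetric record

annual_m3 = volume_loss_m3 / years
print(round(annual_m3 / 1e6, 1))  # 11.2 -- matches the quoted 11.2 million m^3/yr

# The quoted 14.5 Mt/yr of deposition then implies a dry bulk density of
implied_density = 14.5e6 / annual_m3  # tonnes per m^3
print(round(implied_density, 1))      # 1.3 -- plausible for estuarine sediment
```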
Library Service to Industry at USC: The Industrial Associates of the School of Engineering.
ERIC Educational Resources Information Center
Frohmberg, Katherine A.
Special libraries in Southern California and the San Francisco Bay Area that were members of the University of Southern California (USC) School of Engineering Industrial Associates program were surveyed on their use of the USC program and other similar programs. The questionnaire was designed to discover the attitudes and needs of the Industrial…
Statistics for nuclear engineers and scientists. Part 1. Basic statistical inference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beggs, W.J.
1981-02-01
This report is intended for the use of engineers and scientists working in the nuclear industry, especially at the Bettis Atomic Power Laboratory. It serves as the basis for several Bettis in-house statistics courses. The objectives of the report are to introduce the reader to the language and concepts of statistics and to provide a basic set of techniques to apply to problems of the collection and analysis of data. Part 1 covers subjects of basic inference. The subjects include: descriptive statistics; probability; simple inference for normally distributed populations, and for non-normal populations as well; comparison of two populations; the analysis of variance; quality control procedures; and linear regression analysis.
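As an illustration of the "comparison of two populations" topic, Welch's two-sample t statistic can be computed from first principles (a generic sketch, not material from the report itself):

```python
import math
from statistics import mean, stdev

def welch_t(sample_a, sample_b):
    """Welch's t statistic and approximate degrees of freedom for testing
    whether two populations have equal means, without assuming equal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = stdev(sample_a) ** 2, stdev(sample_b) ** 2
    se2 = va / na + vb / nb  # squared standard error of the mean difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom.
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

t, df = welch_t([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
print(round(t, 2), round(df, 1))  # -1.0 8.0
```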
Witter, Robert C.; Knudsen, Keith L.; Sowers, Janet M.; Wentworth, Carl M.; Koehler, Richard D.; Randolph, Carolyn E.; Brooks, Suzanna K.; Gans, Kathleen D.
2006-01-01
This report presents a map and database of Quaternary deposits and liquefaction susceptibility for the urban core of the San Francisco Bay region. It supersedes the equivalent area of U.S. Geological Survey Open-File Report 00-444 (Knudsen and others, 2000), which covers the larger 9-county San Francisco Bay region. The report consists of (1) a spatial database, (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map and liquefaction interpretation (part 3), and (4) a text introducing the report and describing the database (part 1). All parts of the report are digital; part 1 describes the database and digital files and how to obtain them by downloading across the internet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a consistent detailed treatment of the central part of the 9-county region in which much of the mapping of Open-File Report 00-444 was either at smaller (less detailed) scale or represented only preliminary revision of earlier work. Like Open-File Report 00-444, the current mapping uses geomorphic expression, pedogenic soils, inferred depositional environments, and geologic age to define and distinguish the map units. Further scrutiny of the factors controlling liquefaction susceptibility has led to some changes relative to Open-File Report 00-444: particularly the reclassification of San Francisco Bay mud (Qhbm) to have only MODERATE susceptibility and the rating of artificial fills according to the Quaternary map units inferred to underlie them (other than dams - adf).
The two colored maps provide a regional summary of the new mapping at a scale of 1:200,000, a scale that is sufficient to show the general distribution and relationships of the map units but not to distinguish the more detailed elements that are present in the database. The report is the product of cooperative work by the National Earthquake Hazards Reduction Program (NEHRP) and National Cooperative Geologic Mapping Program of the U.S. Geological Survey, William Lettis & Associates, Inc. (WLA), and the California Geological Survey. An earlier version was submitted to the U.S. Geological Survey by WLA as a final report for a NEHRP grant (Witter and others, 2005). The mapping has been carried out by WLA geologists under contract to the NEHRP Earthquake Program (Grant 99-HQ-GR-0095) and by the California Geological Survey.
An Ada inference engine for expert systems
NASA Technical Reports Server (NTRS)
Lavallee, David B.
1986-01-01
The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production rules. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and creating a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.
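The forward-chaining half of such an engine fits in a few lines; this sketch is in Python rather than Ada, with an illustrative rule format and a naive closed-world reading of NOT:

```python
def forward_chain(rules, facts):
    """Repeatedly fire rules until no new conclusions appear.

    rules: list of (conclusion, op, antecedents) with op in {'AND', 'OR', 'NOT'};
           'NOT' takes a single antecedent and is evaluated closed-world,
           i.e. true when the antecedent is not a currently known fact.
    facts: set of initially true propositions.
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conclusion, op, antecedents in rules:
            if conclusion in facts:
                continue
            if op == 'AND':
                fire = all(a in facts for a in antecedents)
            elif op == 'OR':
                fire = any(a in facts for a in antecedents)
            else:  # 'NOT'
                fire = antecedents[0] not in facts
            if fire:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain(
    [('c', 'AND', ['a', 'b']), ('d', 'OR', ['c', 'x'])],
    {'a', 'b'})
print(sorted(derived))  # ['a', 'b', 'c', 'd']
```

A production engine would index rules by antecedent rather than rescanning the whole rule list on each pass, which is one source of the firing-rate concerns the abstract mentions.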
Evaluation of dredged material proposed for ocean disposal from Gravesend Bay Anchorage, New York
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrows, E.S.; Gruendell, B.D.
1996-09-01
The Gravesend Bay Anchorage was one of seven waterways that the US Army Corps of Engineers-New York District (USACE-NYD) requested the Battelle Marine Sciences Laboratory (MSL) to sample and evaluate for dredging and disposal in February 1994. Sediment samples were submitted for physical and chemical analyses to provide baseline sediment chemistry data on the Gravesend Bay Anchorage. Individual sediment core samples collected at the Gravesend Bay Anchorage were analyzed for grain size, moisture content, and total organic carbon (TOC). Two samples, one of composited sediment cores representing the southeast corner of the anchorage (COMP GR), and one sediment core representing the northeast corner of the anchorage (Station GR-10), were analyzed for bulk density, specific gravity, metals, chlorinated pesticides, polychlorinated biphenyl (PCB) congeners, polynuclear aromatic hydrocarbons (PAH), and 1,4-dichlorobenzene.
NASA’s Stennis Space Center Conducts RS-25 Engine Test
2017-03-24
On March 23, NASA conducted a test of an RS-25 engine at the agency’s Stennis Space Center in Bay St. Louis, Mississippi. Four RS-25 engines will help power NASA’s Space Launch System (SLS) rocket to space. During this test, engineers evaluated the engine’s new controller, or “brain”, which communicates with the SLS vehicle. Once the test data are certified, the engine controller will be removed and installed on one of the four flight engines that will help power the first integrated flight of SLS and the Orion spacecraft.
Hein, James R.; Mizell, Kira; Barnard, Patrick L.; Barnard, P.L.; Jaffee, B.E.; Schoellhamer, D.H.
2013-01-01
The mineralogical compositions of 119 samples collected from throughout the San Francisco Bay coastal system, including bayfloor and seafloor, area beaches, cliff outcrops, and major drainages, were determined using X-ray diffraction (XRD). Comparison of the mineral concentrations and application of statistical cluster analysis of XRD spectra allowed for the determination of provenances and transport pathways. The use of XRD mineral identifications provides semi-quantitative compositions needed for comparisons of beach and offshore sands with potential cliff and river sources, but the innovative cluster analysis of XRD diffraction spectra provides a unique visualization of how groups of samples within the San Francisco Bay coastal system are related, so that sand-sized sediment transport pathways can be inferred. The main vector for sediment transport as defined by the XRD analysis is from San Francisco Bay to the outer coast, where the sand then accumulates on the ebb tidal delta and also moves alongshore. This mineralogical link defines a critical pathway because large volumes of sediment have been removed from the Bay over the last century via channel dredging, aggregate mining, and borrow pit mining, with comparable volumes of erosion from the ebb tidal delta over the same period, in addition to high rates of shoreline retreat along the adjacent, open-coast beaches. Therefore, while previously only a temporal relationship was established, the transport pathway defined by mineralogical and geochemical tracers supports the link between anthropogenic activities in the Bay and widespread erosion outside the Bay. The XRD results also establish the regional and local importance of sediment derived from cliff erosion, as well as both proximal and distal fluvial sources. This research is an important contribution to a broader provenance study aimed at identifying the driving forces for widespread geomorphic change in a heavily urbanized coastal-estuarine system.
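The grouping step can be approximated with a correlation distance between spectra and a greedy single-linkage pass (toy spectra and threshold; the study's actual method was a full statistical cluster analysis of XRD diffraction spectra):

```python
def correlation_distance(x, y):
    """1 - Pearson correlation between two equal-length spectra."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return 1 - cov / (sx * sy)

def cluster(spectra, threshold=0.2):
    """Greedy single-linkage grouping: a sample joins the first group containing
    a member closer than `threshold`, otherwise it starts a new group."""
    groups = []
    for name, spec in spectra.items():
        for g in groups:
            if any(correlation_distance(spec, spectra[m]) < threshold for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

# Toy spectra: the 'beach' and 'ebb_delta' samples co-vary; 'cliff' does not.
groups = cluster({'beach': [1, 2, 3, 4],
                  'ebb_delta': [1.1, 2.1, 2.9, 4.2],
                  'cliff': [4, 1, 3, 2]})
print(groups)  # [['beach', 'ebb_delta'], ['cliff']]
```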
The Art of Artificial Intelligence. 1. Themes and Case Studies of Knowledge Engineering
1977-08-01
...in scientific and medical inference illuminate the art of knowledge engineering and its parent science, Artificial Intelligence. ... The knowledge engineer practices the art of bringing the principles and tools of AI research to bear on difficult application problems requiring...
Impact of petrophysical uncertainty on Bayesian hydrogeophysical inversion and model selection
NASA Astrophysics Data System (ADS)
Brunetti, Carlotta; Linde, Niklas
2018-01-01
Quantitative hydrogeophysical studies rely heavily on petrophysical relationships that link geophysical properties to hydrogeological properties and state variables. Coupled inversion studies are frequently based on the questionable assumption that these relationships are perfect (i.e., no scatter). Using synthetic examples and crosshole ground-penetrating radar (GPR) data from the South Oyster Bacterial Transport Site in Virginia, USA, we investigate the impact of spatially-correlated petrophysical uncertainty on inferred posterior porosity and hydraulic conductivity distributions and on Bayes factors used in Bayesian model selection. Our study shows that accounting for petrophysical uncertainty in the inversion (I) decreases bias of the inferred variance of hydrogeological subsurface properties, (II) provides more realistic uncertainty assessment and (III) reduces the overconfidence in the ability of geophysical data to falsify conceptual hydrogeological models.
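The sensitivity of model evidence to assumed relationships can be seen in a one-parameter toy example: the same binomial data yield a different marginal likelihood, and hence a different Bayes factor, under two different priors (purely illustrative; nothing here is specific to GPR or petrophysics):

```python
import math

def marginal_likelihood(k, n, prior_density, grid=2000):
    """Evidence p(data | model): the binomial likelihood of k successes in n
    trials, averaged over the model's prior on the success probability,
    by midpoint numerical integration on [0, 1]."""
    total = 0.0
    for i in range(grid):
        theta = (i + 0.5) / grid
        total += math.comb(n, k) * theta**k * (1 - theta)**(n - k) * prior_density(theta)
    return total / grid

k, n = 7, 10
m_uniform = marginal_likelihood(k, n, lambda t: 1.0)              # flat prior
m_beta22 = marginal_likelihood(k, n, lambda t: 6 * t * (1 - t))   # Beta(2,2) prior
bayes_factor = m_uniform / m_beta22
print(bayes_factor < 1)  # True: the same data disfavor the flat-prior model,
                         # purely because of the assumed prior
```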
Wang, Wei; Xia, Minxuan; Chen, Jie; Deng, Fenni; Yuan, Rui; Zhang, Xiaopei; Shen, Fafu
2016-12-01
The data presented in this paper support the research article "Genome-Wide Analysis of Superoxide Dismutase Gene Family in Gossypium raimondii and G. arboreum" [1]. In this data article, we present a phylogenetic tree showing a dichotomy with two clusters of SODs inferred by the Bayesian method of MrBayes (version 3.2.4), "Bayesian phylogenetic inference under mixed models" [2]; Ramachandran plots of G. raimondii and G. arboreum SODs; the protein sequences used to generate the 3D structures of the proteins and the template accessions via the SWISS-MODEL server, "SWISS-MODEL: modelling protein tertiary and quaternary structure using evolutionary information" [3]; and motif sequences of SODs identified by InterProScan (version 4.8) with the Pfam database, "Pfam: the protein families database" [4].
1984-09-01
...principal coordinating mechanism for the Study. Since its establishment, the Advisory Group advised the District Engineer regarding study policy and... standards. Finfish in the vicinity of Tilghman Island include those species typical for Bay waters having a salinity of 9-14 parts per thousand (ppt)... usually reaches its highest point during the year in January and February, reflecting the area's dependence on farming and fisheries. INSTITUTIONAL...
1993-09-01
...carbohydrates from the worm's internal musculature and transfer them to developing oocytes. In effect, the energy and metabolic... Miscellaneous Paper D-93-5, September 1993, US Army Corps of Engineers, Waterways Experiment Station (AD-A269 836): Long-Term Effects of Dredging Operations Program. Chronic Sublethal Effects of San Francisco Bay Sediments on Nereis (Neanthes) arenaceodentata; Interpretative Guidance for a...
Commencement Bay Cumulative Impact Study: Historic Review of Special Aquatic Sites
1991-05-04
...is generally defined as a geographic region of south Puget Sound in Washington State extending from Brown's Point to Point Defiance... amount of sediment load. (Figure 1: Study Area Map, Commencement Bay Cumulative Impacts Study, Puget Sound.) ...the Puget Sound Environmental Atlas was produced under funding from the Seattle District Corps of Engineers, EPA, and the Puget Sound Water Quality...
1994-06-01
..."sediment-associated toxicant in the lower Fox River and Green Bay, Wisconsin," Environ. Toxicol. Chem. 9, 313-322. Burton, G. A., Jr., Stemmer, B. L...
Retail Building Guide for Entrance Energy Efficiency Measures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, J.; Kung, F.
2012-03-01
This booklet is based on the findings of an infiltration analysis for supermarkets and large retail buildings without refrigerated cases. It enables retail building managers and engineers to calculate the energy-savings potential of vestibule additions in supermarkets and of bay-door operation changes in large retail stores without refrigerated cases. Retail managers can use these initial estimates to decide whether to engage vestibule vendors or contractors for pricing or site-specific analyses, or whether to test bay-door operation changes in pilot stores.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ploskey, Gene R.; Weiland, Mark A.; Faber, Derrek M.
This report describes a 2008 acoustic telemetry survival study conducted by the Pacific Northwest National Laboratory for the Portland District of the U.S. Army Corps of Engineers. The study estimated the survival of juvenile Chinook salmon and steelhead passing Bonneville Dam (BON) and its spillway. Of particular interest was the relative survival of smolts detected passing through end spill bays 1-3 and 16-18, which had deep flow deflectors immediately downstream of spill gates, versus survival of smolts passing middle spill bays 4-15, which had shallow flow deflectors.
Universal Darwinism As a Process of Bayesian Inference.
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
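The claimed equivalence can be demonstrated numerically: one generation of the discrete replicator update (frequencies reweighted by relative fitness) is algebraically identical to Bayes' rule with fitness in the role of likelihood (a sketch; the type names and fitness values are invented):

```python
def bayes_update(prior, likelihood):
    """Posterior proportional to prior times likelihood -- the same arithmetic
    as one generation of the discrete replicator equation, with fitness
    playing the role of likelihood."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Population frequencies updated by relative fitness...
freqs = {'type_A': 0.5, 'type_B': 0.5}
fitness = {'type_A': 1.2, 'type_B': 0.8}
# ...give the same numbers as a Bayesian update with fitness as likelihood.
post = bayes_update(freqs, fitness)
print(round(post['type_A'], 3), round(post['type_B'], 3))  # 0.6 0.4
```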
A Variational Bayes Genomic-Enabled Prediction Model with Genotype × Environment Interaction
Montesinos-López, Osval A.; Montesinos-López, Abelardo; Crossa, José; Montesinos-López, José Cricelio; Luna-Vázquez, Francisco Javier; Salinas-Ruiz, Josafhat; Herrera-Morales, José R.; Buenrostro-Mariscal, Raymundo
2017-01-01
There are Bayesian and non-Bayesian genomic models that take into account G×E interactions. However, the computational cost of implementing Bayesian models is high, and becomes almost impossible when the number of genotypes, environments, and traits is very large, while, in non-Bayesian models, there are often important and unsolved convergence problems. The variational Bayes method is popular in machine learning, and, by approximating the probability distributions through optimization, it tends to be faster than Markov Chain Monte Carlo methods. For this reason, in this paper, we propose a new genomic variational Bayes version of the Bayesian genomic model with G×E using half-t priors on each standard deviation (SD) term to guarantee highly noninformative priors and posterior inferences that are not sensitive to the choice of hyper-parameters. We show the complete theoretical derivation of the full conditional and the variational posterior distributions, and their implementations. We used eight experimental genomic maize and wheat data sets to illustrate the new proposed variational Bayes approximation, and compared its predictions and implementation time with a standard Bayesian genomic model with G×E. Results indicated that prediction accuracies are slightly higher in the standard Bayesian model with G×E than in its variational counterpart, but, in terms of computation time, the variational Bayes genomic model with G×E is, in general, 10 times faster than the conventional Bayesian genomic model with G×E. For this reason, the proposed model may be a useful tool for researchers who need to predict and select genotypes in several environments. PMID:28391241
Remote sensing of particle backscattering in Chesapeake Bay: a 6-year SeaWiFS retrospective view
Zawada, D.G.; Hu, C.; Clayton, T.; Chen, Z.; Brock, J.C.; Muller-Karger, F. E.
2007-01-01
Traditional field techniques to monitor water quality in large estuaries, such as boat-based surveys and autonomous moored sensors, generally provide limited spatial coverage. Satellite imagery potentially can be used to address both of these limitations. Here, we show that satellite-based observations are useful for inferring total-suspended-solids (TSS) concentrations in estuarine areas. A spectra-matching optimization algorithm was used to estimate the particle backscattering coefficient at 400 nm, bbp(400), in Chesapeake Bay from Sea-viewing Wide Field-of-view Sensor (SeaWiFS) satellite imagery. These estimated values of bbp(400) were compared to in situ measurements of TSS for the study period of September 1997–December 2003. Contemporaneous SeaWiFS bbp(400) values and TSS concentrations were positively correlated (N = 340, r² = 0.4), and bbp(400) values served as a reasonable first-order approximation for synoptically mapping TSS. Overall, large-scale patterns of SeaWiFS bbp(400) appeared to be consistent with expectations based on field observations and historical reports of TSS. Monthly averages indicated that SeaWiFS bbp(400) was typically largest in winter (>0.049 m⁻¹, November–February) and smallest in summer (June–August), regardless of the amount of riverine discharge to the bay. The study period also included Hurricanes Floyd and Isabel, which caused large-scale turbidity events and changes in the water quality of the bay. These results demonstrate that this technique can provide frequent synoptic assessments of suspended-solids concentrations in Chesapeake Bay and other coastal regions.
Lithospheric Architecture Beneath Hudson Bay
NASA Astrophysics Data System (ADS)
Porritt, R. W.; Miller, M. S.; Darbyshire, F. A.
2015-12-01
Hudson Bay overlies some of the thickest Precambrian lithosphere on Earth, whose internal structures contain important clues to the earliest workings of plate formation. The terminal collision, the Trans-Hudson Orogen, brought together the Western Churchill craton to the northwest and the Superior craton to the southeast. These two Archean cratons, along with the Paleo-Proterozoic Trans-Hudson internides, form the core of the North American craton. We use S to P converted wave imaging and absolute shear velocity information from a joint inversion of P to S receiver functions, new ambient noise derived phase velocities, and teleseismic phase velocities to investigate this region and determine both the thickness of the lithosphere and the presence of internal discontinuities. The lithosphere under central Hudson Bay approaches 350 km in thickness but is thinner (200-250 km) around the periphery of the Bay. Furthermore, the amplitude of the lithosphere-asthenosphere boundary (LAB) conversion from the S receiver functions is unusually large for a craton, suggesting a large thermal contrast across the LAB, which we interpret as direct evidence of the thermal insulation effect of continents on the asthenosphere. Within the lithosphere, midlithospheric discontinuities, significantly shallower than the base of the lithosphere, are often imaged, suggesting the mechanisms that form these layers are common. Lacking time-history information, we infer that these discontinuities reflect reactivation of formation structures during deformation of the craton.
CytoBayesJ: software tools for Bayesian analysis of cytogenetic radiation dosimetry data.
Ainsbury, Elizabeth A; Vinnikov, Volodymyr; Puig, Pedro; Maznyk, Nataliya; Rothkamm, Kai; Lloyd, David C
2013-08-30
A number of authors have suggested that a Bayesian approach may be most appropriate for analysis of cytogenetic radiation dosimetry data. In the Bayesian framework, probability of an event is described in terms of previous expectations and uncertainty. Previously existing, or prior, information is used in combination with experimental results to infer probabilities or the likelihood that a hypothesis is true. It has been shown that the Bayesian approach increases both the accuracy and quality assurance of radiation dose estimates. New software entitled CytoBayesJ has been developed with the aim of bringing Bayesian analysis to cytogenetic biodosimetry laboratory practice. CytoBayesJ takes a number of Bayesian or 'Bayesian like' methods that have been proposed in the literature and presents them to the user in the form of simple user-friendly tools, including testing for the most appropriate model for distribution of chromosome aberrations and calculations of posterior probability distributions. The individual tools are described in detail and relevant examples of the use of the methods and the corresponding CytoBayesJ software tools are given. In this way, the suitability of the Bayesian approach to biological radiation dosimetry is highlighted and its wider application encouraged by providing a user-friendly software interface and manual in English and Russian. Copyright © 2013 Elsevier B.V. All rights reserved.
General Purpose Probabilistic Programming Platform with Effective Stochastic Inference
2018-04-01
2.2 Venture; 2.3 BayesDB; 2.4 Picture; 2.5 MetaProb; 3.0 Methods, Assumptions, and Procedures; 4.0 Results and Discussion; 4.1 ... The methods section outlines the research approach. The results and discussion section gives representative quantitative and qualitative results ... modeling via CrossCat, a probabilistic method that emulates many of the judgment calls ordinarily made by a human data analyst. This AI assistance
Autonomous entropy-based intelligent experimental design
NASA Astrophysics Data System (ADS)
Malakar, Nabin Kumar
2011-07-01
The aim of this thesis is to explore the application of probability and information theory in experimental design, and to do so in a way that combines what we know about inference and inquiry in a comprehensive and consistent manner. Present day scientific frontiers involve data collection at an ever-increasing rate. This requires that we find a way to collect the most relevant data in an automated fashion. By following the logic of the scientific method, we couple an inference engine with an inquiry engine to automate the iterative process of scientific learning. The inference engine involves Bayesian machine learning techniques to estimate model parameters based upon both prior information and previously collected data, while the inquiry engine implements data-driven exploration. By choosing an experiment whose distribution of expected results has the maximum entropy, the inquiry engine selects the experiment that maximizes the expected information gain. The coupled inference and inquiry engines constitute an autonomous learning method for scientific exploration. We apply it to a robotic arm to demonstrate the efficacy of the method. Optimizing inquiry involves searching for an experiment that promises, on average, to be maximally informative. If the set of potential experiments is described by many parameters, the search involves a high-dimensional entropy space. In such cases, a brute force search method will be slow and computationally expensive. We develop an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment. This helps to reduce the number of computations necessary to find the optimal experiment. We also extended the method of maximizing entropy, and developed a method of maximizing joint entropy so that it could be used as a principle of collaboration between two robots. 
This is a major achievement of the thesis, as it enables information-based collaboration between two robotic units working toward a shared goal in an automated fashion.
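The entropy-based selection step described above can be sketched in a few lines. The threshold-probing model, the candidate probe positions, and the uniform posterior samples below are illustrative assumptions, not the thesis's robotic-arm setup:

```python
# Sketch of entropy-based experiment selection: given posterior samples of an
# unknown threshold theta, pick the probe position x whose predicted binary
# outcome y = 1[x > theta] has maximum entropy, i.e. the most informative
# next measurement.  Model and candidates are hypothetical.
import math
import random

random.seed(1)
posterior_samples = [random.uniform(0.0, 1.0) for _ in range(2000)]  # q(theta)

def outcome_entropy(x, samples):
    """Entropy (bits) of the predicted outcome y = 1[x > theta] under q."""
    p = sum(1 for theta in samples if x > theta) / len(samples)
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
best_x = max(candidates, key=lambda x: outcome_entropy(x, posterior_samples))
```

With a uniform posterior, the most informative probe is the one whose outcome is closest to a coin flip, i.e. the candidate nearest the posterior median.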
Variable neighborhood search for reverse engineering of gene regulatory networks.
Nicholson, Charles; Goodwin, Leslie; Clark, Corey
2017-01-01
A new search heuristic, Divided Neighborhood Exploration Search, designed to be used with inference algorithms such as Bayesian networks to improve the reverse engineering of gene regulatory networks, is presented. The approach systematically moves through the search space to find topologies representative of gene regulatory networks that are more likely to explain microarray data. Empirical testing demonstrates that the novel method is superior to the widely employed greedy search techniques in both the quality of the inferred networks and computational time. Copyright © 2016 Elsevier Inc. All rights reserved.
Development of a Spacecraft Materials Selector Expert System
NASA Technical Reports Server (NTRS)
Pippin, G.; Kauffman, W. (Technical Monitor)
2002-01-01
This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.
Signal Processing Expert Code (SPEC)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ames, H.S.
1985-12-01
The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that an expert system can be used to control existing codes.
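A backward chaining inference engine of the kind described can be sketched very compactly: a goal is proved if it is a known fact or if all subgoals of some rule for it can themselves be proved. The rules about SIG commands below are hypothetical stand-ins, not the actual SPEC rule base:

```python
# Minimal backward-chaining inference engine.  RULES maps each goal to a list
# of alternative subgoal conjunctions; the rule content is hypothetical.
RULES = {
    "run_frequency_analysis": [["data_loaded", "window_selected"]],
    "window_selected": [["user_gave_window"], ["use_default_window"]],
}

def prove(goal, facts, rules=RULES):
    """Backward chaining: goal holds if it is a fact, or if every subgoal of
    some rule whose head is the goal can itself be proved."""
    if goal in facts:
        return True
    for subgoals in rules.get(goal, []):
        if all(prove(g, facts, rules) for g in subgoals):
            return True
    return False

facts = {"data_loaded", "use_default_window"}
```

Here `prove("run_frequency_analysis", facts)` succeeds by chaining backward through the second alternative for `window_selected`.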
78 FR 35749 - Airworthiness Directives; The Boeing Company Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-14
... contamination of the strut forward dry bay, which could result in hydrogen embrittlement of the titanium forward engine mount bulkhead fittings ...
NASA Concludes Summer of RS-25 Testing
2017-08-30
NASA engineers closed a summer of hot fire testing Aug. 30 for flight controllers on RS-25 engines that will help power the new Space Launch System (SLS) rocket being built to carry astronauts to deep-space destinations, including Mars. The 500-second hot fire of an RS-25 engine flight controller unit on the A-1 Test Stand at Stennis Space Center near Bay St. Louis, Mississippi, marked another step toward the nation's return to human deep-space exploration missions.
Fitzpatrick, Faith A.; Garrison, Paul J.; Fitzgerald, Sharon A.; Elder, John F.
2003-01-01
Sediment cores were collected from Musky Bay, Lac Courte Oreilles, and from surrounding areas in 1999 and 2001 to determine whether the water quality of Musky Bay has declined during the last 100 years or more as a result of human activity, specifically cottage development and cranberry farming. Selected cores were analyzed for sedimentation rates, nutrients, minor and trace elements, biogenic silica, diatom assemblages, and pollen over the past several decades. Two cranberry bogs constructed along Musky Bay in 1939 and the early 1950s were substantially expanded between 1950–62 and between 1980–98. Cottage development on Musky Bay has occurred at a steady rate since about 1930, although currently housing density on Musky Bay is one-third to one-half the housing density surrounding three other Lac Courte Oreilles bays. Sedimentation rates were reconstructed for a core from Musky Bay by use of three lead radioisotope models and the cesium-137 profile. The historical average mass and linear sedimentation rates for Musky Bay are 0.023 grams per square centimeter per year and 0.84 centimeters per year, respectively, for the period of about 1936–90. There is also limited evidence that sedimentation rates may have increased after the mid-1990s. Historical changes in input of organic carbon, nitrogen, phosphorus, and sulfur to Musky Bay could not be directly identified from concentration profiles of these elements because of the potential for postdepositional migration and recycling. Minor- and trace-element profiles from the Musky Bay core possibly reflect historical changes in the input of clastic material over time, as well as potential changes in atmospheric deposition inputs. The input of clastic material to the bay increased slightly after European settlement and possibly in the 1930s through 1950s.
Concentrations of copper in the Musky Bay core increased steadily through the early to mid-1900s until about 1980 and appear to reflect inputs from atmospheric deposition. Aluminum-normalized concentrations of calcium, copper, nickel, and zinc increased in the Musky Bay core in the mid-1990s. However, concentrations of these elements in surficial sediment from Musky Bay were similar to concentrations in other Lac Courte Oreilles bays, nearby lakes, and soils and were below probable effects concentrations for aquatic life. Biogenic-silica, diatom-community, and pollen profiles indicate that Musky Bay has become more eutrophic since about 1940 with the onset of cottage development and cranberry farming. The water quality of the bay has especially degraded during the last 25 years with increased growth of aquatic plants and the onset of a floating algal mat during the last decade. Biogenic silica data indicate that diatom production has consistently increased since the 1930s. Diatom assemblage profiles indicate a shift from low-nutrient species to higher-nutrient species during the 1940s and that aquatic plants reached their present density and/or composition during the 1970s. The diatom Fragilaria capucina (indicative of algal mat) greatly increased during the mid-1990s. Pollen data indicate that milfoil, which often becomes more common with elevated nutrients, became more widespread after 1920. The pollen data also indicate that wild rice was present in the eastern end of Musky Bay during the late 1800s and the early 1900s but disappeared after about 1920, probably because of water-level changes more so than eutrophication.
Multiple sources for late-Holocene tsunamis at Discovery Bay, Washington State, USA
Williams, H.F.L.; Hutchinson, I.; Nelson, A.R.
2005-01-01
Nine muddy sand beds interrupt a 2500-yr-old sequence of peat deposits beneath a tidal marsh at the head of Discovery Bay on the south shore of the Strait of Juan de Fuca, Washington. An inferred tsunami origin for the sand beds is assessed by means of six criteria. Although all the sand beds contain marine diatoms and almost all the beds display internal stratification, the areal extent of the oldest beds is too limited to confirm their origin as tsunami deposits. The ages of four beds overlap with known late-Holocene tsunamis generated by plate-boundary earthquakes at the Cascadia subduction zone. Diatom assemblages in peat deposits bracketing these four beds do not indicate concurrent change in elevation at Discovery Bay. Diatoms in the peat bracketing a tsunami bed deposited about 1000 cal. yr BP indicate a few decimeters of submergence, suggesting deformation on a nearby upper-plate fault. Other beds may mark tsunamis caused by more distant upper-plate earthquakes or local submarine landslides triggered by earthquake shaking. Tsunamis from both subduction zone and upper-plate sources pose a significant hazard to shoreline areas in this region.
Recording ground motions where people live
NASA Astrophysics Data System (ADS)
Cranswick, E.; Gardner, B.; Hammond, S.; Banfill, R.
The 1989 Loma Prieta, Calif., earthquake caused spectacular damage to structures up to 100 km away in the San Francisco Bay sedimentary basin, including the Cypress Street viaduct overpass, the Bay Bridge, and buildings in the San Francisco Marina district. Although the few mainshock ground motions recorded in the northern San Francisco Bay area were “significantly larger … than would be expected from the pre-existing data set,” none were recorded at the sites of these damaged structures [Hanks and Krawinkler, 1991].Loma Prieta aftershocks produced order-of-magnitude variations of ground motions related to sedimentary basin response over distances of 1-2 km and less [Cranswick et al., 1990]. In densely populated neighborhoods, these distances can encompass the residences of thousands of people, but it is very unlikely that these neighborhoods are monitored by even one seismograph. In the last decade, the complexity of computer models used to simulate high-frequency ground motions has increased by several orders of magnitude [e.g., Frankel and Vidale, 1992], but the number of seismograph stations—hence, the spatial density of the sampling of ground motion data—has remained relatively unchanged. Seismologists must therefore infer the nature of the ground motions in the great unknown regions between observation points.
An empirical Bayes approach to network recovery using external knowledge.
Kpogbezan, Gino B; van der Vaart, Aad W; van Wieringen, Wessel N; Leday, Gwenaël G R; van de Wiel, Mark A
2017-09-01
Reconstruction of a high-dimensional network may benefit substantially from the inclusion of prior knowledge on the network topology. In the case of gene interaction networks, such knowledge may come, for instance, from pathway repositories like KEGG, or be inferred from data of a pilot study. The Bayesian framework provides a natural means of including such prior knowledge. Based on a Bayesian Simultaneous Equation Model, we develop an appealing Empirical Bayes (EB) procedure that automatically assesses the agreement of the used prior knowledge with the data at hand. We use a variational Bayes method for posterior density approximation and compare its accuracy with that of a Gibbs sampling strategy. Our method is computationally fast and can outperform known competitors. In a simulation study, we show that accurate prior data can greatly improve the reconstruction of the network, but need not harm the reconstruction if wrong. We demonstrate the benefits of the method in an analysis of gene expression data from GEO. In particular, the edges of the recovered network have superior reproducibility (compared to that of competitors) over resampled versions of the data. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Empirical Bayes Estimation of Coalescence Times from Nucleotide Sequence Data.
King, Leandra; Wakeley, John
2016-09-01
We demonstrate the advantages of using information at many unlinked loci to better calibrate estimates of the time to the most recent common ancestor (TMRCA) at a given locus. To this end, we apply a simple empirical Bayes method to estimate the TMRCA. This method is asymptotically optimal, in the sense that the estimator converges to the true value when the number of unlinked loci for which we have information is large, and it has the advantage of not making any assumptions about demographic history. The algorithm works as follows: we first split the sample at each locus into inferred left and right clades to obtain many estimates of the TMRCA, which we can average to obtain an initial estimate of the TMRCA. We then use nucleotide sequence data from other unlinked loci to form an empirical distribution that we can use to improve this initial estimate. Copyright © 2016 by the Genetics Society of America.
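The two-stage idea above — per-locus initial estimates, then an empirical distribution built from the other loci used as a prior — can be sketched under an assumed simple Poisson mutation model with made-up numbers. This illustrates generic empirical Bayes shrinkage, not the authors' clade-splitting algorithm:

```python
# Empirical Bayes sketch: TMRCA estimates from other unlinked loci form an
# empirical prior; the focal locus's Poisson likelihood reweights that prior.
# The mutation model, theta, and all numbers are illustrative assumptions.
import math

theta = 10.0                     # expected mutations per unit of TMRCA
other_locus_tmrcas = [0.6, 0.8, 1.0, 1.1, 0.9, 0.7, 1.2, 1.0]  # initial estimates

k_focal = 4                      # mutations observed at the focal locus
raw_estimate = k_focal / theta   # method-of-moments estimate: 0.4

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Empirical prior = one support point per other-locus estimate; the EB
# posterior mean reweights those points by the focal-locus likelihood.
weights = [poisson_pmf(k_focal, theta * t) for t in other_locus_tmrcas]
eb_estimate = sum(w * t for w, t in zip(weights, other_locus_tmrcas)) / sum(weights)
```

The EB estimate is pulled from the noisy raw estimate toward the bulk of the empirical distribution, which is the calibration benefit the abstract describes.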
MTR BUILDING, TRA603. DETAILED VIEW OF NORTHWEST CORNERS OF MTR ...
MTR BUILDING, TRA-603. DETAILED VIEW OF NORTHWEST CORNERS OF MTR HIGH-BAY AND SECOND/THIRD STORY SECTIONS. NOTE SHAPE OF PANEL ABOVE WINDOW OVER "TRA-603" BUILDING NUMBERS. THIS IS A "STANDARD PANEL." INL NEGATIVE NUMBER HD46-42-2. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Code of Federal Regulations, 2010 CFR
2010-07-01
... within the regulated navigation area and: (i) Sustained winds are greater than 25 knots but less than 40 knots, ensure the main engines are ready to provide full power in five minutes or less; and (ii) Sustained winds are 40 knots or over, ensure that the main engines are on line to immediately provide...
1995-04-17
KENNEDY SPACE CENTER, FLA. - A Space Shuttle Main Engine (SSME) hoist prepares to lift the first Block 1 engine to be installed in an orbiter into the number one position on Discovery while the spaceplane is being prepared for the STS-70 mission in the high bay of Orbiter Processing Facility 2. The new engine, SSME No. 2036, features a new high-pressure liquid oxygen turbopump, a two-duct powerhead, a baffleless main injector, a single-coil heat exchanger and start sequence modifications. The other two main engines to be used during the liftoff of the STS-70 mission are of the existing Phase II design.
Sambo, Francesco; de Oca, Marco A Montes; Di Camillo, Barbara; Toffolo, Gianna; Stützle, Thomas
2012-01-01
Reverse engineering is the problem of inferring the structure of a network of interactions between biological variables from a set of observations. In this paper, we propose an optimization algorithm, called MORE, for the reverse engineering of biological networks from time series data. The model inferred by MORE is a sparse system of nonlinear differential equations, complex enough to realistically describe the dynamics of a biological system. MORE tackles separately the discrete component of the problem, the determination of the biological network topology, and the continuous component of the problem, the strength of the interactions. This approach allows us both to enforce system sparsity, by globally constraining the number of edges, and to integrate a priori information about the structure of the underlying interaction network. Experimental results on simulated and real-world networks show that the mixed discrete/continuous optimization approach of MORE significantly outperforms standard continuous optimization and that MORE is competitive with the state of the art in terms of accuracy of the inferred networks.
Real-time earthquake monitoring using a search engine method.
Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong
2014-12-04
When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
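The template-matching core of such a search engine can be sketched as below. The toy waveform "database", the sine-wave synthetics, and the exhaustive scoring loop are illustrative assumptions — the actual system relies on fast indexing over a very large seismogram database rather than an exact scan:

```python
# Sketch of the search-engine idea: precompute a database of synthetic
# waveforms keyed by source parameters, then match an observed waveform by a
# similarity score and return the stored parameters.  All waveforms and
# parameters here are toy stand-ins.
import math

def correlate(a, b):
    """Normalized cross-correlation at zero lag."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def synth(phase):
    """Toy 'synthetic seismogram' for a given focal parameter."""
    return [math.sin(0.3 * i + phase) for i in range(200)]

database = {phase: synth(phase) for phase in (0.0, 1.0, 2.0)}  # params -> waveform

def search(observed):
    """Return the source parameter whose template best matches the data."""
    return max(database, key=lambda p: correlate(observed, database[p]))

best = search(synth(1.0))
```

Replacing this exact scan with an approximate nearest-neighbor index is what makes the real system thousands of times faster than exhaustive search.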
Creative brains: designing in the real world†
Goel, Vinod
2014-01-01
The process of designing artifacts is a creative activity. It is proposed that, at the cognitive level, one key to understanding design creativity is to understand the array of symbol systems designers utilize. These symbol systems range from being vague, imprecise, abstract, ambiguous, and indeterminate (like conceptual sketches), to being very precise, concrete, unambiguous, and determinate (like contract documents). The former types of symbol systems support associative processes that facilitate lateral (or divergent) transformations that broaden the problem space, while the latter types of symbol systems support inference processes facilitating vertical (or convergent) transformations that deepen the problem space. The process of artifact design requires the judicious application of both lateral and vertical transformations. This leads to a dual mechanism model of design problem-solving comprising an associative engine and an inference engine. It is further claimed that this dual mechanism model is supported by an interesting hemispheric dissociation in human prefrontal cortex. The associative engine and neural structures that support imprecise, ambiguous, abstract, indeterminate representations are lateralized in the right prefrontal cortex, while the inference engine and neural structures that support precise, unambiguous, determinate representations are lateralized in the left prefrontal cortex. At the brain level, successful design of artifacts requires a delicate balance between the two hemispheres of prefrontal cortex. PMID:24817846
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Xiankui; Wu, Jichun; Wang, Dong, E-mail: wangdong@nju.edu.cn
Coastal areas have great significance for human habitation and for economic and social development worldwide. With the rapid increase of pressures from human activities and climate change, the safety of groundwater resources is threatened by seawater intrusion in coastal areas. The Laizhou Bay area is one of the most seriously seawater-intruded areas in China; seawater intrusion was first recognized there in the mid-1970s. This study assessed the pollution risk of a groundwater source field in the western Laizhou Bay area by inferring the probability distribution of groundwater Cl− concentration. The numerical model of the seawater intrusion process is built using SEAWAT4. The parameter uncertainty of this model is evaluated by Markov Chain Monte Carlo (MCMC) simulation, with DREAM(ZS) used as the sampling algorithm. Then, the predictive distribution of Cl− concentration at the groundwater source field is inferred using the samples of model parameters obtained from MCMC. After that, the pollution risk of the groundwater source field is assessed by the predictive quantiles of Cl− concentration. The results of model calibration and verification demonstrate that the DREAM(ZS)-based MCMC is efficient and reliable for estimating model parameters under current observations. At the 95% confidence level, the groundwater source point will not be polluted by seawater intrusion in the next five years (2015–2019). In addition, the 2.5% and 97.5% predictive quantiles show that the Cl− concentration of the groundwater source field always varies between 175 mg/l and 200 mg/l. - Highlights: • The parameter uncertainty of the seawater intrusion model is evaluated by MCMC. • The groundwater source field won't be polluted by seawater intrusion in the next 5 years. • The pollution risk is assessed by the predictive quantiles of Cl− concentration.
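The final step — turning MCMC parameter samples into predictive quantiles for a risk verdict — is generic and can be sketched as follows. The Gaussian stand-in draws, the linear forward model, and the threshold are illustrative assumptions, not the calibrated SEAWAT4 model:

```python
# Sketch of risk assessment via predictive quantiles: push each posterior
# parameter draw (stand-ins for MCMC output) through a forward model, then
# read off the 2.5% and 97.5% quantiles of the predicted concentration.
import random
import statistics

random.seed(42)
posterior_draws = [random.gauss(1.0, 0.05) for _ in range(5000)]  # e.g. from MCMC

def predict_concentration(param):
    """Toy forward model mapping a parameter draw to a Cl- concentration (mg/l)."""
    return 185.0 * param

predictions = sorted(predict_concentration(p) for p in posterior_draws)
cuts = statistics.quantiles(predictions, n=40)  # 39 cut points
q_low, q_high = cuts[0], cuts[-1]              # ~2.5% and ~97.5% quantiles

threshold = 250.0                 # hypothetical Cl- safety limit
safe = q_high < threshold         # risk verdict at the 95% level
```

Comparing the upper predictive quantile against a contamination threshold is exactly how the 95%-level "will not be polluted" conclusion is reached.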
1990-09-01
turbid estuarine habitats such as Mobile Bay are very tolerant of moderately high concentrations of suspended sediments and thin layers of sediment... [Figure 3: distribution of sediment types in Mobile Bay, from Isphording and Lamb, 1980] PART III: DREDGING EQUIPMENT AND OPERATIONAL TECHNIQUES ... increase in ambient turbidity was noted. Water samples were collected at surface, middepth, and bottom. The sampling boats proceeded across their
Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake
Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.
2006-01-01
Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M ≥ 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.
Responses of water environment to tidal flat reduction in Xiangshan Bay: Part I hydrodynamics
NASA Astrophysics Data System (ADS)
Li, Li; Guan, Weibing; Hu, Jianyu; Cheng, Peng; Wang, Xiao Hua
2018-06-01
Xiangshan Bay consists of a deep tidal channel and three shallow inlets. A large-scale tidal flat has been utilized through coastal construction. To ascertain the cumulative influence of these engineering projects upon the tidal dynamics of the channel-inlets system, this study uses FVCOM to investigate the tides and flow asymmetries of the bay, and numerically simulates the long-term variations of tidal dynamics caused by the loss of tidal flats. It was found that the reduction of tidal flat areas from 1963 to 2010 slightly dampened M2 tidal amplitudes (0.1 m, ∼6%) and advanced their phases by reducing shoaling effects, while amplifying M4 tidal amplitudes (0.09 m, ∼27%) and advancing their phases by reducing bottom friction, in the inner bay. Consequently, the ebb dominance was dampened, as indicated by a reduced absolute value of elevation skewness (∼20%) in the bay. The tides and tidal asymmetry were affected by the locations, areas, and slopes of the tidal flats through changes in tidal prism, shoaling effect, and bottom friction, which in turn affected tidal duration asymmetry in the bay. Tides and tidal asymmetry were more sensitive to the tidal flat at the head of the bay than to that on the side bank. Reduced or increased tidal flat slopes around the Tie inlet dampened the ebb dominance. The tidal flats played a role in dissipating the M4 tide rather than generating it, while advection played only a secondary role in generating the M4 tide. Reclamation of the full-length tidal flats would trigger a reversal from ebb to flood dominance in the bay. This study should be applicable to similar narrow bays worldwide.
Automated Interpretation of LIBS Spectra using a Fuzzy Logic Inference Engine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeremy J. Hatch; Timothy R. McJunkin; Cynthia Hanson
2012-02-01
Automated interpretation of laser-induced breakdown spectroscopy (LIBS) data is necessary due to the plethora of spectra that can be acquired in a relatively short time. However, traditional chemometric and artificial neural network methods that have been employed are not always transparent to a skilled user. A fuzzy logic approach to data interpretation has now been adapted to LIBS spectral interpretation. A fuzzy logic inference engine (FLIE) was used to differentiate between various copper containing and stainless steel alloys as well as unknowns. Results using FLIE indicate a high degree of confidence in spectral assignment.
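The membership-function machinery behind such a fuzzy classifier can be sketched compactly. The emission-line intensity ratio, the class breakpoints, and the two-class rule base below are illustrative assumptions, not the actual FLIE rules:

```python
# Sketch of a fuzzy-logic classifier in the spirit of FLIE: triangular
# membership functions turn a (hypothetical) Cu/Fe emission-line intensity
# ratio into degrees of membership, and the class with the highest degree
# wins, yielding a built-in confidence value.
def triangular(x, left, peak, right):
    """Triangular membership function on [left, right] peaking at `peak`."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def classify(cu_fe_ratio):
    """Fuzzy degrees of membership for each alloy class, plus the verdict."""
    degrees = {
        "stainless steel": triangular(cu_fe_ratio, -0.5, 0.0, 0.6),
        "copper alloy": triangular(cu_fe_ratio, 0.4, 1.0, 1.6),
    }
    label = max(degrees, key=degrees.get)
    return label, degrees[label]          # class and its confidence

label, confidence = classify(0.9)
```

Unlike a black-box neural network, every verdict is traceable to an explicit membership degree, which is the transparency advantage the abstract highlights.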
Inference Engine in an Intelligent Ship Course-Keeping System
2017-01-01
The article presents an original design of an expert system whose function is to automatically stabilize a ship's course. The focus is on the inference engine, a mechanism that consists of two functional components. One is responsible for the construction of state-space regions, implemented on the basis of suitably processed signals recorded by sensors at the input and output of the controlled object. The other component is responsible for generating a control decision based on the knowledge obtained in the first module. The computing experiments described herein demonstrate the effective and correct operation of the proposed system. PMID:29317859
Pollution and Climate Effects on Tree-Ring Nitrogen Isotopes
NASA Astrophysics Data System (ADS)
Savard, M. M.; Bégin, C.; Marion, J.; Smirnoff, A.
2009-04-01
BACKGROUND Monitoring of nitrous oxide concentrations only started during the last 30 years in North America, but anthropogenic atmospheric nitrogen has been emitted in significant quantities over the last 150 years. Can geochemical characteristics of tree rings be used to infer past changes in the nitrogen cycle of temperate regions? To address this question we use nitrogen stable isotopes in 125-year-long ring series from beech specimens (Fagus grandifolia) of the Georgian Bay Islands National Park (eastern Ontario), and from pine (Pinus strobus) and beech trees of the Morgan Arboretum near Montreal (western Quebec). To evaluate the reliability of the N stable isotopes in wood treated for removal of soluble materials, we tested both tree species from the Montreal area. The tree-to-tree reproducibility was excellent for both pine and beech, the isotopic trends were strongly concordant, and they were not influenced by the heartwood-sapwood transition zone. The coherence of the changes in the isotopic series observed for the two species suggests that their tree-ring N isotopic values can serve as an environmental indicator. RESULTS AND INTERPRETATION In Montreal and Georgian Bay, the N isotopes show strong and similar parallel agreement with the climatic parameters (Gleichläufigkeit test): the short-term isotopic fluctuations correlate directly with summer precipitation and inversely with summer and spring temperature. A long-term decreasing isotopic trend in Montreal indicates progressive changes in soil chemistry after 1951. A pedochemical change is also inferred for the Georgian Bay site on the basis of a positive N isotopic trend initiated after 1971. At both sites, the long-term 15N series correlate with a proxy for NOx emissions (Pearson correlation), and carbon-isotope ring series suggest that the same trees have been impacted by phytotoxic pollutants (Savard et al., 2009a).
We propose that the contrasting long-term nitrogen-isotope changes of Montreal and Georgian Bay reflect deposition of NOx emissions from cars and coal-fired power plants, with higher proportions from coal burning in Georgian Bay (Savard et al., 2009b). This interpretation is plausible because recent monitoring indicates that NOx emissions from coal-fired power plants play an important role in the annual N budget in Ontario, whereas they are negligible on the Quebec side. CONCLUSION Interpretations of long tree-ring N isotopic series in terms of effects generated by airborne N species have been advocated previously. Here we further propose that the contrasting isotopic trends obtained for wood samples from the two regions reflect different regional anthropogenic N deposition combined with variations in climatic conditions. This research suggests that nitrogen tree-ring series may record both regional climatic conditions and anthropogenic perturbations of the N cycle. REFERENCES Savard, M.M., Bégin, C., Marion, J., Aznar, J.-C., Smirnoff, A., 2009a. Changes of air quality in an urban region as inferred from tree-ring width and stable isotopes. Chapter 9 in "Relating Atmospheric Source Apportionment to Vegetation Effects: Establishing Cause Effect Relationships" (A. Legge, ed.). Elsevier, Amsterdam; doi: 10.1016/S1474-8177(08)00209x. Savard, M.M., Bégin, C., Smirnoff, A., Marion, J., Rioux-Paquette, E., 2009b. Tree-ring nitrogen isotopes reflect climatic effects and anthropogenic NOx emissions. Env. Sci. Tech. (doi: 10.1021/es802437k).
7. VIEW OF E5 WORK STATION AND MANIPULATOR ARMS WITHIN ...
7. VIEW OF E-5 WORK STATION AND MANIPULATOR ARMS WITHIN THE SOUTHEAST CORNER OF THE HOT BAY. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-02
... Management Branch, telephone 617-223-8355, email [email protected] or Mr. Luke Dlhopolsky, Civil Engineering Unit, Environmental Protection Specialist, telephone 401-736-1743, email [email protected
Dingler, J.R.; Carlson, D.V.; Sallenger, A.H.
1987-01-01
In April 1985, sand samples were collected from many of the beaches on Tutuila Island, American Samoa, and in July 1985, three bays were surveyed using side-scan sonar and shallow seismic profiling. During that second trip, scuba divers collected sand samples from the surveyed areas. Dingler and others (1986) describes the study; this report presents the grain-size and composition data for the onshore and offshore sand samples. Locations of the onshore samples are plotted on the map of the island, which is reproduced from Normark and others (1985); locations of most of the offshore samples and side-scan sonar interpretations made during the study are plotted on enlargements (A and B, respectively) of Fagaitua and Nua-seetaga Bays. Lam Yuen (1981), U.S. Army Corps of Engineers (1980), and Sea Engineering Services Inc. (1980) provide additional information pertaining to the island's beaches.
NASA Astrophysics Data System (ADS)
Williams, Shaun; Zhang, Tianran; Chagué, Catherine; Williams, James; Goff, James; Lane, Emily M.; Bind, Jochen; Qasim, Ilyas; Thomas, Kristie-Lee; Mueller, Christof; Hampton, Sam; Borella, Josh
2018-07-01
The 14 November 2016 Kaikōura Tsunami inundated Little Pigeon Bay in Banks Peninsula, New Zealand, and left a distinct sedimentary deposit on the ground and within the cottage near the shore. Sedimentary (grain size) and geochemical (electrical conductivity and X-Ray Fluorescence) analyses on samples collected over successive field campaigns are used to characterize the deposits. The sediment distribution observed in the cottage, in combination with flow direction indicators, suggests that the sediment and debris laid down within the building were predominantly the result of a single wave that had been channeled up the stream bed rather than arriving directly from offshore. Salinity data indicated that the maximum tsunami-wetted and/or seawater-sprayed area extended 12.5 m farther inland than the maximum inundation distance inferred from the debris line observed a few days after the event. In addition, the salinity signature was short-lived. An overall inland waning of tsunami energy was indicated by the mean grain size and portable X-Ray Fluorescence elemental results. ITRAX data collected from three cores along an inland transect indicated a distinct elevated elemental signature at the surfaces of the cores, with an associated increase in magnetic susceptibility. Comparable signatures were also identified within subsurface stratigraphic sequences, and likely represent older tsunamis known to have inundated this bay as well as adjacent bays in Banks Peninsula. The sedimentary and geochemical signatures of the 2016 Kaikōura Tsunami at Little Pigeon Bay provide a modern benchmark that can be used to identify older tsunami deposits in the Banks Peninsula region.
Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B
2006-08-01
Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans, increasing both confidence and objectivity, would be beneficial. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
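The flavor of such an update can be illustrated with a simple normal-normal empirical Bayes shrinkage. This is a schematic stand-in, not the authors' method (which handles differing marker maps and QTL effects), and all numbers are hypothetical.

```python
import numpy as np

def empirical_bayes_update(z_own, se_own, z_other):
    """Normal-normal empirical Bayes: treat linkage Z scores from other
    scans near the same location as draws from N(mu, tau^2), estimate
    (mu, tau^2) by moments, and shrink our own score toward mu."""
    z_other = np.asarray(z_other, dtype=float)
    mu = z_other.mean()
    tau2 = max(z_other.var(ddof=1), 1e-9)   # between-study heterogeneity
    w = (1 / tau2) / (1 / tau2 + 1 / se_own**2)
    post_mean = w * mu + (1 - w) * z_own
    post_se = (1 / (1 / tau2 + 1 / se_own**2)) ** 0.5
    return post_mean, post_se

# Our scan gives Z = 3.2 (SE 1.0); three other scans gave weaker signals.
post_mean, post_se = empirical_bayes_update(3.2, 1.0, [2.1, 2.6, 1.8])
print(post_mean, post_se)
```

The updated statistic sits between our own signal and the consensus of the other scans, and its standard error is smaller than either source alone, mirroring the abstract's claim of narrower intervals than any single study.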
Klouch, Z K; Caradec, F; Plus, M; Hernández-Fariñas, T; Pineau-Guillou, L; Chapelle, A; Schmitt, S; Quéré, J; Guillou, L; Siano, R
2016-12-01
Within the framework of research aimed at using genetic methods to evaluate harmful species distribution and their impact on coastal ecosystems, a portion of the ITS1rDNA of Alexandrium minutum was amplified by real-time PCR from DNA extracts of superficial (1-3 cm) sediments of 30 subtidal and intertidal stations of the Bay of Brest (Brittany, France), during the winters of 2013 and 2015. Cell germinations and rDNA amplifications of A. minutum were obtained for sediments of all sampled stations, demonstrating that the whole bay is currently contaminated by this toxic species. Coherent estimations of ITS1rDNA copy numbers were obtained for the two sampling cruises, supporting the hypothesis of regular accumulation of A. minutum resting stages in the south-eastern, more confined embayments of the study area, where fine-muddy sediments are also more abundant. Higher ITS1rDNA copy numbers were detected in sediments of areas where blooms have been seasonally detected since 2012. This result suggests that specific genetic material estimations in superficial sediments of the bay may be a proxy for the cyst banks of A. minutum. Particle-trajectory simulations with a Lagrangian physical model showed that blooms occurring in the south-eastern part of the bay are disconnected from those of the north-eastern zone. The heterogeneous distribution of A. minutum inferred from both water and sediment suggests the existence of potential barriers to the dispersal of this species in the Bay of Brest and encourages finer analyses at the population level for this species within semi-enclosed coastal ecosystems. Copyright © 2016 Elsevier B.V. All rights reserved.
Knowledge Engineering Aspects of Affective Bi-Modal Educational Applications
NASA Astrophysics Data System (ADS)
Alepis, Efthymios; Virvou, Maria; Kabassi, Katerina
This paper analyses the knowledge and software engineering aspects of educational applications that provide affective bi-modal human-computer interaction. For this purpose, a system that provides affective interaction based on evidence from two different modes has been developed. More specifically, the system's inferences about students' emotions are based on user input evidence from the keyboard and the microphone. Evidence from these two modes is combined by a user modelling component that incorporates user stereotypes as well as a multi criteria decision making theory. The mechanism that integrates the inferences from the two modes has been based on the results of two empirical studies that were conducted in the context of knowledge engineering of the system. The evaluation of the developed system showed significant improvements in the recognition of the emotional states of users.
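A minimal sketch of combining per-mode evidence follows; a plain weighted average stands in for the paper's multi-criteria decision making theory, and the weights and per-emotion probabilities are made up for illustration.

```python
def combine_modes(p_keyboard, p_audio, w=(0.6, 0.4)):
    """Combine per-emotion evidence from two input modes with fixed
    criterion weights (a stand-in for a multi-criteria decision theory;
    stereotype-based adjustments are omitted here)."""
    scores = {e: w[0] * p_keyboard[e] + w[1] * p_audio[e] for e in p_keyboard}
    return max(scores, key=scores.get), scores

# Hypothetical evidence: keyboard behavior suggests frustration,
# the audio channel weakly suggests a neutral state.
label, scores = combine_modes({"frustration": 0.7, "neutral": 0.3},
                              {"frustration": 0.4, "neutral": 0.6})
print(label, scores)
```

In the actual system the weights would come from the empirical studies and user stereotypes rather than being fixed constants.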
113. ARAI Hot cell (ARA626) Building wall sections and details ...
113. ARA-I Hot cell (ARA-626) Building wall sections and details of radio chemistry lab. Shows high-bay roof over hot cells and isolation rooms below grade storage pit for fuel elements. Norman Engineering Company: 961-area/SF-626-A-4. Date: January 1959. Ineel index code no. 068-0626-00-613-102724. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID
Baleen whale ecology and feeding habitat use in the Bay of Fundy, Canada
NASA Astrophysics Data System (ADS)
Davies, K. T. A.; Brown, M.; Taggart, C. T.
2016-02-01
The Bay of Fundy on the east coast of Canada contains a rich supply of zooplankton and fish that provide food for diverse baleen whales. Endangered North Atlantic right whales and other large baleen whales have been monitored in the Bay of Fundy at least weekly during every summer since the 1980s. Over the most recent years, significant declines in sightings and residency of the right whales have been observed in this habitat, hypothetically indicative of a substantial, multi-year reduction in food supply. Whether concurrent changes in other baleen whales and, by inference, their respective food supplies have also occurred is unknown. This study quantifies changes in baleen whale ecology in the Bay over three decades, with a focus on comparing (1) recent declines in right whale sightings to long-term historical trends, and (2) variation in right whale sightings to that of other large baleen whale species that share similar zooplankton food sources (e.g., sei whales) or rely on different food sources (e.g., fin and humpback whales). First, survey effort is reconstructed as survey track-length and as time-on-effort, and the space-time variations in effort are quantified. Then, variation in whale density indices and residency among surveys and survey-years is examined through effort-corrected sightings of all baleen whale species, as well as effort-corrected sightings of photo-identified right whale individuals. Results are then interpreted within the context of the oceanographic and food supply variation in the habitat.
Learning and inference using complex generative models in a spatial localization task.
Bejjanki, Vikranth R; Knill, David C; Aslin, Richard N
2016-01-01
A large body of research has established that, under relatively simple task conditions, human observers integrate uncertain sensory information with learned prior knowledge in an approximately Bayes-optimal manner. However, in many natural tasks, observers must perform this sensory-plus-prior integration when the underlying generative model of the environment consists of multiple causes. Here we ask if the Bayes-optimal integration seen with simple tasks also applies to such natural tasks when the generative model is more complex, or whether observers rely instead on a less efficient set of heuristics that approximate ideal performance. Participants localized a "hidden" target whose position on a touch screen was sampled from a location-contingent bimodal generative model with different variances around each mode. Over repeated exposure to this task, participants learned the a priori locations of the target (i.e., the bimodal generative model), and integrated this learned knowledge with uncertain sensory information on a trial-by-trial basis in a manner consistent with the predictions of Bayes-optimal behavior. In particular, participants rapidly learned the locations of the two modes of the generative model, but the relative variances of the modes were learned much more slowly. Taken together, our results suggest that human performance in a more complex localization task, which requires the integration of sensory information with learned knowledge of a bimodal generative model, is consistent with the predictions of Bayes-optimal behavior, but involves a much longer time-course than in simpler tasks.
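The Bayes-optimal computation that participants approximate can be written down directly for a Gaussian-mixture prior and a Gaussian sensory cue: update each mixture component conjugately, then reweight components by how well they explain the cue. All numbers below are hypothetical.

```python
import numpy as np

def posterior_estimate(x_cue, sigma_cue, modes, sigmas, weights):
    """Bayes-optimal location estimate given a noisy cue and a bimodal
    Gaussian-mixture prior: conjugate update per component, reweighted
    by each component's marginal likelihood for the cue."""
    modes, sigmas, weights = map(np.asarray, (modes, sigmas, weights))
    # Per-component posterior (standard Gaussian conjugate update)
    post_var = 1 / (1 / sigmas**2 + 1 / sigma_cue**2)
    post_mean = post_var * (modes / sigmas**2 + x_cue / sigma_cue**2)
    # Component responsibilities: prior weight x marginal likelihood of cue
    marg_var = sigmas**2 + sigma_cue**2
    like = np.exp(-0.5 * (x_cue - modes)**2 / marg_var) / np.sqrt(marg_var)
    resp = weights * like
    resp /= resp.sum()
    return float(resp @ post_mean)

# Cue near the left mode: the estimate is pulled toward that mode.
est = posterior_estimate(x_cue=-1.8, sigma_cue=1.0,
                         modes=[-2.0, 2.0], sigmas=[0.5, 0.5],
                         weights=[0.5, 0.5])
print(est)
```

Because the component variances enter both the update and the responsibilities, learning them matters, which is consistent with the finding that the relative variances of the modes were the slowest aspect of the generative model to be learned.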
NASA Astrophysics Data System (ADS)
Vreeland, Nicholas Paul
According to some theories, subglacial deformation of sediment is the process of sediment transport most responsible for drumlin formation. If so, strain indicators in the sediment should yield deformation patterns that are systematically related to drumlin morphology. Clast fabrics have been used most commonly to make inferences about strain patterns in drumlins, but with a wide range of sometimes divergent interpretations. These divergent interpretations reflect, in part, a lack of experimental control on the relationship between the state of strain and the resulting fabrics. Herein, fabrics determined from the anisotropy of magnetic susceptibility (AMS) of till within selected drumlins of the Green Bay Lobe are used to study the role of bed deformation in drumlin formation. AMS fabrics are a proxy for fabrics formed by non-equant, silt-sized magnetite grains. Unlike past fabric studies of drumlins, laboratory deformation experiments conducted with this till provide a quantitative foundation for inferring strain magnitude, shearing direction, and shear-plane orientations from fabrics determined from the principal directions of magnetic susceptibility (k1, k2, and k3). Intact till samples were collected from transects in seven drumlins in Dane, Dodge, Jefferson, Waupaca, and Waushara counties of south-central Wisconsin, both by exploiting five existing outcrops and by collecting and sub-sampling 42 cores of 89 mm diameter. Overall, ∼2800 samples were collected for AMS analysis, and 112 AMS fabrics were computed. Much of the till sampled (84% of fabrics) has k1 fabric strengths weaker than the lower 95% confidence limit for till (S1 < 0.82) sheared to moderate strains (∼10), suggesting the till has been deformed but to strains too small to indicate that bed deformation was the principal till transport mechanism. Three of five drumlins studied have k1 fabric orientations that are not symmetrically disposed about the local flow direction indicated by the drumlins. Rather, these fabrics are oriented 7-25° to the southeast of the drumlin orientations, consistent with reinterpreted microfabric data collected from nearby drumlins (Evenson, 1971). Furthermore, in all drumlins, orientations of shear planes inferred from principal susceptibilities deviate markedly from the local surface slopes of drumlins, with a 23.8° average difference between the poles to inferred shear planes and to local slopes. We infer that the drumlin fabric was set by basal till deformation during glacier flow to the southeast prior to drumlin formation and that drumlinization did not significantly reset the fabric. Thus, these drumlins are inferred to have been formed by differential erosion of a pre-existing till layer by processes unrelated to bed deformation.
ERTS program of the US Army Corps of Engineers. [water resources
NASA Technical Reports Server (NTRS)
Jarman, J. W.
1974-01-01
The Army Corps of Engineers research and development efforts associated with the ERTS Program are confined to applications supporting the investigation, design, construction, operation, and maintenance of water resource projects. Problems investigated covered: (1) resource inventory; (2) environmental impact; (3) pollution monitoring; (4) water circulation; (5) sediment transport; (6) data collection systems; (7) engineering; and (8) model verification. These problem areas were investigated in relation to bays, reservoirs, lakes, rivers, coasts, and regions. ERTS-1 imagery has been extremely valuable in developing techniques and is now being used in everyday applications.
Uncertainty quantification of measured quantities for an HCCI engine: composition or temperatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petitpas, Guillaume; Whitesides, Russell
UQHCCI_1 computes the measurement uncertainties of an HCCI engine test bench using the pressure trace and the estimated uncertainties of the measured quantities as inputs, and propagates them through Bayesian inference and a mixing model.
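The idea of propagating measurement uncertainties can be illustrated with forward Monte Carlo sampling, a simpler scheme than the Bayesian inference UQHCCI_1 performs; the surrogate model and uncertainty magnitudes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ignition_metric(p_intake, t_intake):
    """Stand-in for the engine model: any deterministic function of the
    measured quantities (here a made-up smooth surrogate)."""
    return 0.8 * p_intake + 0.002 * t_intake

# Propagate measurement uncertainties by sampling the inputs
p = rng.normal(1.0, 0.02, 10_000)    # intake pressure [bar], hypothetical
t = rng.normal(420.0, 5.0, 10_000)   # intake temperature [K], hypothetical
out = ignition_metric(p, t)
print(out.mean(), out.std())
```

The spread of the output samples is the propagated uncertainty; a Bayesian treatment would additionally condition these input distributions on the observed pressure trace.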
STS-26 Discovery, OV-103, SSME (2019) installed in position number one at KSC
1988-01-10
S88-29076 (10 Jan 1988) --- KSC employees work together to carefully guide a 7,000-pound main engine into the number one position in Discovery's aft compartment. Because of the engine's weight and size, special handling equipment is needed to perform the installation. Discovery is currently being prepared for the upcoming STS-26 mission in bay 1 of the Orbiter Processing Facility. This engine, 2019, arrived at KSC on Jan. 6 and was installed Jan. 10. The other two engines are scheduled to be installed later this month. The shuttle's three main liquid-fueled engines provide the main propulsion for the orbiter vehicle. The cluster of three engines operates in parallel with the solid rocket boosters during the initial ascent.
2009-11-05
CAPE CANAVERAL, Fla. – Pratt & Whitney Rocketdyne technicians install a space shuttle main engine on space shuttle Endeavour in Orbiter Processing Facility Bay 2 at NASA's Kennedy Space Center in Florida. The engine will fly on the shuttle's STS-130 mission to the International Space Station. Even though this engine weighs one-seventh as much as a locomotive engine, its high-pressure fuel pump alone delivers as much horsepower as 28 locomotives, while its high-pressure oxidizer pump delivers the equivalent horsepower of an additional 11 locomotives. The maximum equivalent horsepower developed by the shuttle's three main engines is more than 37 million horsepower. Endeavour is targeted to launch Feb. 4, 2010. Photo credit: NASA/Jim Grossmann
2009-11-05
CAPE CANAVERAL, Fla. – A Pratt & Whitney Rocketdyne technician carefully maneuvers a space shuttle main engine into position on space shuttle Endeavour in Orbiter Processing Facility Bay 2 at NASA's Kennedy Space Center in Florida. The engine will fly on the shuttle's STS-130 mission to the International Space Station. Even though this engine weighs one-seventh as much as a locomotive engine, its high-pressure fuel pump alone delivers as much horsepower as 28 locomotives, while its high-pressure oxidizer pump delivers the equivalent horsepower of an additional 11 locomotives. The maximum equivalent horsepower developed by the shuttle's three main engines is more than 37 million horsepower. Endeavour is targeted to launch Feb. 4, 2010. Photo credit: NASA/Jim Grossmann
Gálvez, Jorge A; Pappas, Janine M; Ahumada, Luis; Martin, John N; Simpao, Allan F; Rehman, Mohamed A; Witmer, Char
2017-10-01
Venous thromboembolism (VTE) is a potentially life-threatening condition that includes both deep vein thrombosis (DVT) and pulmonary embolism. We sought to improve the detection and reporting of children with a new diagnosis of VTE by applying natural language processing (NLP) tools to radiologists' reports. We validated the performance of an NLP tool, Reveal NLP (Health Fidelity Inc, San Mateo, CA), and an inference rules engine in identifying reports with deep venous thrombosis, using a curated set of ultrasound reports. We then configured the NLP tool to scan all available radiology reports on a daily basis for studies that met criteria for VTE between July 1, 2015, and March 31, 2016. The NLP tool and inference rules engine correctly identified 140 out of 144 reports with positive DVT findings and 98 out of 106 negative reports in the validation set. The tool's sensitivity was 97.2% (95% CI 93-99.2%) and its specificity was 92.5% (95% CI 85.7-96.7%). Subsequently, the NLP tool and inference rules engine processed 6373 radiology reports from 3371 hospital encounters, identifying 178 positive reports and 3193 negative reports with a sensitivity of 82.9% (95% CI 74.8-89.2) and a specificity of 97.5% (95% CI 96.9-98). The system functions well as a safety net to screen patients for hospital-acquired VTE (HA-VTE) on a daily basis and offers value as an automated, redundant system. To our knowledge, this is the first pediatric study to apply NLP technology in a prospective manner for HA-VTE identification.
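The reported sensitivity and specificity follow directly from the validation-set counts in the abstract (140 of 144 positives and 98 of 106 negatives correctly identified):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Validation-set counts from the abstract
sens, spec = sens_spec(tp=140, fn=4, tn=98, fp=8)
print(round(100 * sens, 1), round(100 * spec, 1))  # 97.2 92.5
```

The same arithmetic applied to the prospective counts is what yields the lower prospective sensitivity, the usual pattern when moving from a curated validation set to live data.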
8. VIEW OF E3 WORK STATION WITH MANIPULATOR ARMS IN ...
8. VIEW OF E-3 WORK STATION WITH MANIPULATOR ARMS IN EAST OPERATING GALLERY LOOKING INTO THE HOT BAY. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
NASA Astrophysics Data System (ADS)
Panmei, Champoungam; Naidu, Pothuri Divakar; Naik, Sushant Suresh
2018-06-01
Oceanographic processes in the Bay of Bengal (BoB) are strongly impacted by the south-westerly and north-easterly winds of the Indian monsoon system during the summer and winter, respectively. Variations in calcium carbonate (CaCO3) content and magnetic susceptibility (MS), along with Ba, Ti, and Al, were reconstructed for the past 80 kyr using a sediment core (MD 161/28) from the northern BoB in order to understand the changes in calcium carbonate deposition and MS signals associated with the Indian monsoon system. Our records indicate that monsoon-induced dilution through river discharge from different sediment provenances is the main controlling factor of the CaCO3 variations at the core location. Generally lower CaCO3 content was documented during interglacial periods with a stronger southwest monsoon (SWM) (Marine Isotope Stages (MIS) 5a & 1, except MIS 3), and higher CaCO3 content during glacial periods with a weaker SWM (MIS 4 & 2). High MS values correspond to MIS 4 & 2, periods of weakened SWM and strengthened northeast monsoon (NEM), due to enhanced sediment supply from the Peninsular Indian regions, whereas lower MS values correspond to MIS 5, 3 & 1, periods of strengthened SWM and weakened NEM, when sediment was derived through the Ganges-Brahmaputra from the Himalayan region. Thus, our records indicate coupling of the discharges of major rivers into the BoB with SWM and NEM strength, which has implications for linkages with other climatic variations such as the East Asian monsoon and Northern Hemisphere climate.
Equivalent statistics and data interpretation.
Francis, Gregory
2017-08-01
Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
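Part of the claimed equivalence is easy to verify directly: for a two-sample t test with known sample sizes, the t statistic and Cohen's d are related by an invertible mapping, so each carries exactly the same information (and the p value and a JZS Bayes factor are likewise functions of t, n1, and n2). The numbers below are arbitrary.

```python
import math

def cohen_d_from_t(t, n1, n2):
    """Two-sample t statistic -> Cohen's d; a bijection given n1 and n2."""
    return t * math.sqrt(1 / n1 + 1 / n2)

def t_from_cohen_d(d, n1, n2):
    """Inverse mapping: Cohen's d -> t statistic."""
    return d / math.sqrt(1 / n1 + 1 / n2)

t, n1, n2 = 2.5, 30, 30
d = cohen_d_from_t(t, n1, n2)
# Round trip: d determines t exactly, so d (with its CI), the p value,
# and a JZS Bayes factor computed from (t, n1, n2) all encode the same
# evidence, just on different interpretive scales.
print(d, t_from_cohen_d(d, n1, n2))
```

This is the paper's point in miniature: the statistics differ in interpretation, not in the information they extract from the data.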
NASA Astrophysics Data System (ADS)
Ndu, Obibobi Kamtochukwu
To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
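The embedding-then-prediction pipeline can be sketched with classical MDS and a simple kriging predictor. Note the dissertation uses non-metric MDS on expert-elicited rank dissimilarities; everything below (the dissimilarity matrix, the reliabilities, the kernel and its length scale) is hypothetical and for illustration only.

```python
import numpy as np

def classical_mds(D, dim=2):
    """Embed a dissimilarity matrix D into `dim` coordinates (classical MDS;
    non-metric MDS would instead iteratively fit only the rank order)."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D**2) @ J                  # double-centered Gram matrix
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:dim]            # largest eigenvalues first
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

def krige(X, y, x_new, length=1.0, noise=1e-6):
    """Kriging / GP prediction with a squared-exponential covariance."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :])**2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = k(X, X) + noise * np.eye(len(X))
    return (k(x_new[None, :], X) @ np.linalg.solve(K, y))[0]

# Hypothetical dissimilarities among 4 precedent systems, plus their
# known reliabilities; predict for a concept near system 2 in the space.
D = np.array([[0, 1, 2, 3],
              [1, 0, 1, 2],
              [2, 1, 0, 1],
              [3, 2, 1, 0]], float)
X = classical_mds(D, dim=1)
y = np.array([0.95, 0.93, 0.90, 0.85])
print(krige(X, y, X[1]))
```

The prediction for a concept collocated with a precedent system recovers that system's reliability; a concept placed between systems would borrow from its neighbors in proportion to embedded proximity, which is the "degree of relevance" adjustment described above.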
Liu, Yanfeng; Li, Jianghua; Du, Guocheng; Chen, Jian; Liu, Long
By combining advanced omics technology and computational modeling, systems biologists have identified and inferred thousands of regulatory events and system-wide interactions of the bacterium Bacillus subtilis, which is commonly used both in the laboratory and in industry. This dissection of the multiple layers of regulatory networks and their interactions has provided invaluable information for unraveling regulatory mechanisms and guiding metabolic engineering. In this review, we discuss recent advances in the systems biology and metabolic engineering of B. subtilis and highlight current gaps in our understanding of global metabolism and global pathway engineering in this organism. We also propose future perspectives in the systems biology of B. subtilis and suggest ways that this approach can be used to guide metabolic engineering. Specifically, although hundreds of regulatory events have been identified or inferred via systems biology approaches, systematic investigation of the functionality of these events in vivo has lagged, thereby preventing the elucidation of regulatory mechanisms and further rational pathway engineering. In metabolic engineering, ignoring the engineering of multilayer regulation hinders metabolic flux redistribution. Post-translational engineering, allosteric engineering, and dynamic pathway analyses and control will also contribute to the modulation and control of the metabolism of engineered B. subtilis, ultimately producing the desired cellular traits. We hope this review will aid metabolic engineers in making full use of available systems biology datasets and approaches for the design and perfection of microbial cell factories through global metabolism optimization. Copyright © 2016 Elsevier Inc. All rights reserved.
An expert system environment for the Generic VHSIC Spaceborne Computer (GVSC)
NASA Astrophysics Data System (ADS)
Cockerham, Ann; Labhart, Jay; Rowe, Michael; Skinner, James
The authors describe a Phase II Phillips Laboratory Small Business Innovative Research (SBIR) program being performed to implement a flexible and general-purpose inference environment for embedded space and avionics applications. This inference environment is being developed in Ada and takes special advantage of the target architecture, the GVSC. The GVSC implements the MIL-STD-1750A ISA and contains enhancements to allow access of up to 8 MBytes of memory. The inference environment makes use of the Merit Enhanced Traversal Engine (METE) algorithm, which employs the latest inference and knowledge representation strategies to optimize both run-time speed and memory utilization.
Continuity equation for probability as a requirement of inference over paths
NASA Astrophysics Data System (ADS)
González, Diego; Díaz, Daniela; Davis, Sergio
2016-09-01
Local conservation of probability, expressed as the continuity equation, is a central feature of non-equilibrium statistical mechanics. In the existing literature, the continuity equation is always motivated by heuristic arguments with no derivation from first principles. In this work we show that the continuity equation is a logical consequence of the laws of probability and the application of the formalism of inference over paths for dynamical systems. That is, the simple postulate that a system moves continuously through time following paths implies the continuity equation. The translation from the language of dynamical paths to the usual representation in terms of probability densities of states is performed by means of an identity derived from Bayes' theorem. The formalism presented here is valid independently of the nature of the system studied: it is applicable to physical systems and also to more abstract dynamics such as financial indicators and population dynamics in ecology, among others.
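A minimal sketch of the argument, in notation that loosely follows the abstract (the angle brackets denote an expectation over paths X(t); symbols are illustrative, not necessarily the authors'):

```latex
% Density as a path expectation of a delta function:
\rho(x,t) \;=\; \big\langle \delta\big(x - X(t)\big) \big\rangle .
% Differentiate in time and pull the derivative through the path average,
% using \nabla_{X(t)}\delta(x - X(t)) = -\nabla_x \delta(x - X(t)):
\frac{\partial \rho}{\partial t}
  = \big\langle \dot{X}(t)\cdot \nabla_{X(t)}\,\delta\big(x - X(t)\big) \big\rangle
  = -\,\nabla_x \cdot \big\langle \dot{X}(t)\,\delta\big(x - X(t)\big) \big\rangle .
% Bayes' theorem identifies the conditional average as a velocity field,
% v(x,t) = \langle \dot{X}(t) \mid X(t) = x \rangle, giving the continuity equation:
\frac{\partial \rho}{\partial t} + \nabla_x \cdot \big(\rho(x,t)\, v(x,t)\big) = 0 .
```

Continuity of the paths is what licenses differentiating under the expectation; without it, the middle step fails.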
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy best explains decision behavior. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be compared directly using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
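A minimal sketch of what a probabilistic take-the-best strategy of the kind described might look like: cues are searched in validity order, the first discriminating cue decides, and the implied choice is executed with an error probability that grows with the cue's search rank. The cue values, error probabilities, and function names are illustrative, not the authors' implementation.

```python
import random

def ttb_probabilistic(cues_a, cues_b, error_probs, rng):
    """Probabilistic take-the-best (illustrative sketch).

    Cues are listed in descending validity; the first discriminating cue
    decides, but the implied choice is flipped with a rank-dependent
    error probability (increasing in search order)."""
    for rank, (a, b) in enumerate(zip(cues_a, cues_b)):
        if a != b:                       # first discriminating cue
            favored = "A" if a > b else "B"
            if rng.random() < error_probs[rank]:
                return "B" if favored == "A" else "A"   # execution error
            return favored
    return rng.choice(["A", "B"])        # guess if no cue discriminates

rng = random.Random(0)
errs = [0.05, 0.10, 0.20]               # error probabilities, increasing in rank
choices = [ttb_probabilistic([1, 0, 1], [0, 0, 1], errs, rng) for _ in range(1000)]
share_a = choices.count("A") / 1000     # the first cue favors A; expect about 0.95
```

The deterministic TTB of the earlier literature is the special case where every entry of `error_probs` is zero.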
Annealed Importance Sampling for Neural Mass Models
Penny, Will; Sengupta, Biswa
2016-01-01
Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606
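A toy annealed importance sampling estimator of a model's log-evidence, with plain random-walk Metropolis transitions standing in for the paper's Langevin Monte Carlo proposals. The prior/likelihood pair and all settings are illustrative.

```python
import math, random

def log_prior(x):      # standard normal prior (normalized)
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def log_like(x):       # unnormalized likelihood term; target is prior * like
    return -0.5 * (x - 1.0) ** 2

def ais_log_evidence(n_chains=400, n_temps=30, seed=1):
    """AIS estimate of log Z = log of the integral of prior * like.

    Each chain starts at a prior draw, accumulates importance weight
    increments over a linear temperature ladder, and takes one Metropolis
    step per temperature targeting prior * like**beta."""
    rng = random.Random(seed)
    betas = [t / (n_temps - 1) for t in range(n_temps)]
    log_ws = []
    for _ in range(n_chains):
        x = rng.gauss(0.0, 1.0)                  # sample from the prior
        log_w = 0.0
        for b_prev, b in zip(betas, betas[1:]):
            log_w += (b - b_prev) * log_like(x)  # weight increment
            prop = x + rng.gauss(0.0, 0.8)       # random-walk proposal
            log_a = (log_prior(prop) + b * log_like(prop)
                     - log_prior(x) - b * log_like(x))
            if math.log(rng.random()) < log_a:
                x = prop
        log_ws.append(log_w)
    m = max(log_ws)                              # log-mean-exp of the weights
    return m + math.log(sum(math.exp(w - m) for w in log_ws) / len(log_ws))

log_z_hat = ais_log_evidence()
```

For this Gaussian toy pair the evidence is available in closed form, log Z = log(e^(-1/4)/sqrt(2)), roughly -0.60, so the estimator can be checked directly.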
ETR COOLING TOWER. PUMP HOUSE (TRA645) IN SHADOW OF TOWER ...
ETR COOLING TOWER. PUMP HOUSE (TRA-645) IN SHADOW OF TOWER ON LEFT. AT LEFT OF VIEW, HIGH-BAY BUILDING IS ETR. ONE STORY ATTACHMENT IS ETR ELECTRICAL BUILDING. STACK AT RIGHT IS ETR STACK; MTR STACK IS TOWARD LEFT. CAMERA FACING NORTHEAST. INL NEGATIVE NO. 56-3799. Jack L. Anderson, 11/26/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
1980-09-01
[Abstract fragment; OCR-garbled.] Recoverable content: the study examined 21 maps (scale 1:6,000) of representative zones thought in the past to have shifted in response to climatic changes; surviving items of a numbered list include ...of changes in the environment associated with the road, 3) documentation of flora and vegetation along the 577-km-long transect, and 4) methodologies for... [truncated]. Report keywords: Alaska; grasses; roads; climate; permafrost; soil erosion; drainage; pipelines; vegetation; environmental engineering; restoration; erosion control; revegetation.
1988-06-27
[Abstract fragment; OCR-garbled.] Recoverable content: keywords include optical artificial intelligence; optical inference engines; optical logic; optical information processing. Surviving text notes that such problems are common, arising in areas such as expert systems and other artificial intelligence systems; that in recent years the computer science language PROLOG has become prominent; and that optical processors should in principle be well suited for artificial intelligence applications and symbolic logic processing.
Computational statistics using the Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general, parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference workflow. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
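The BIE's two marginal-likelihood algorithms are its own; as a generic illustration of how tempered sampling can yield a marginal likelihood, here is a thermodynamic-integration sketch on a conjugate toy model. The model, temperature ladder, and all settings are illustrative, not the BIE's.

```python
import math, random

def log_like(x, y=1.0):
    """Gaussian likelihood of one observation y given parameter x."""
    return -0.5 * math.log(2 * math.pi) - 0.5 * (y - x) ** 2

def tempered_expectation(beta, n=3000, seed=2):
    """Estimate E_beta[log L] under p(x) proportional to prior(x) * like(x)**beta,
    with a standard-normal prior, via random-walk Metropolis."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n):
        prop = x + rng.gauss(0.0, 1.0)
        log_a = ((-0.5 * prop ** 2 + beta * log_like(prop))
                 - (-0.5 * x ** 2 + beta * log_like(x)))
        if math.log(rng.random()) < log_a:
            x = prop
        total += log_like(x)
    return total / n

# Thermodynamic integration: log Z is the integral over beta in [0, 1]
# of E_beta[log L]; approximate it with the trapezoid rule.
betas = [i / 10 for i in range(11)]
vals = [tempered_expectation(b) for b in betas]
log_z = sum((b1 - b0) * (v0 + v1) / 2
            for b0, b1, v0, v1 in zip(betas, betas[1:], vals, vals[1:]))
```

For this conjugate toy (prior N(0,1), one observation y = 1 with unit noise) the evidence is N(1; 0, 2), so log Z is about -1.52, which the estimate can be checked against.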
NASA Astrophysics Data System (ADS)
WU, Z. Y.; Saito, Y.; Milliman, J. D.; Zhao, D.; Zhou, J.
2015-12-01
Estuaries have been the site of intensive human activities. During the past century, decreasing fluvial water and sediment discharge, increasing land reclamation, a changing climate, and rising sea level have had an ever-increasing impact on river deltas, particularly those bordering Southeast Asia. Using six stages of navigational and bathymetric chart data from 1906 to 2013 and two years (2012, 2013) of single-beam bathymetric data, together with more than 50 years of fluvial discharge data, we document the impact of human activities on the Pearl River Delta and its estuary at Lingding Bay. Between 1906 and 2010, land reclamation decreased the bay's water area by ~300 km2 (>17%), mostly at the expense of the shrinking intertidal and shallow subtidal mudflats. Before 1980, the estuary was mainly governed by natural processes with slight net deposition, whereas after 1980 dredging in the estuary and large port engineering projects changed the estuarine topography by shallowing the shoals and deepening the troughs. From 1955 to 2010, the water volume of Lingding Bay decreased by 536 × 10^6 m3, a net decrease of 9.7 × 10^6 m3 per year, which indicates that approximately 9.7 Mt/yr of sediment was deposited in Lingding Bay during that period. In 2012 and 2013, large-scale human activities within Lingding Bay included continued dredging plus a surge of sand excavation that changed local water depths by ±5 m/yr, far exceeding the range of natural topographic evolution in the estuary. The impacts of various human activities have significantly changed submarine topography in Lingding Bay of the complex Pearl River Estuary. With continuing economic expansion in the Pearl River Delta, Lingding Bay should continue to shrink in both area and water volume.
6. OVERALL VIEW OF THE FRONT AND THE TOWER, LOOKING ...
6. OVERALL VIEW OF THE FRONT AND THE TOWER, LOOKING WEST FROM THE ACTIVE PIER OF BAY SHIP AND YACHT COMPANY. COAST GUARD CUTTER SHERMAN AT RIGHT. - United Engineering Company Shipyard, Crane, 2900 Main Street, Alameda, Alameda County, CA
5. View toward the northeast at the inside elevation of ...
5. View toward the northeast at the inside elevation of the eastern segment of the north roundhouse. Bay numbers for stalls 58 through 66 are evident. - Central Railroad of New Jersey, Engine Terminal, Jersey City, Hudson County, NJ
NASA Marches on with Test of RS-25 Engine for New Space Launch System
2016-07-29
NASA engineers conducted a successful developmental test of RS-25 rocket engine No. 0528 July 29, 2016, to collect critical performance data for the most powerful rocket in the world – the Space Launch System (SLS). The engine roared to life for a full 650-second test on the A-1 Test Stand at NASA’s Stennis Space Center, near Bay St. Louis, Mississippi, marking another step forward in development of the SLS, which will launch humans deeper into space than ever before, including on the journey to Mars. Four RS-25 engines, joined with a pair of solid rocket boosters, will power the SLS core stage at launch. The RS-25 engines used on the first four SLS flights are former space shuttle main engines, modified to operate at a higher performance level and with a new engine controller, which allows communication between the vehicle and engine.
Biological communities in San Francisco Bay track large-scale climate forcing over the North Pacific
NASA Astrophysics Data System (ADS)
Cloern, James E.; Hieb, Kathryn A.; Jacobson, Teresa; Sansó, Bruno; Di Lorenzo, Emanuele; Stacey, Mark T.; Largier, John L.; Meiring, Wendy; Peterson, William T.; Powell, Thomas M.; Winder, Monika; Jassby, Alan D.
2010-11-01
Long-term observations show that fish and plankton populations in the ocean fluctuate in synchrony with large-scale climate patterns, but similar evidence is lacking for estuaries because of shorter observational records. Marine fish and invertebrates have been sampled in San Francisco Bay since 1980 and exhibit large, unexplained population changes including record-high abundances of common species after 1999. Our analysis shows that populations of demersal fish, crabs and shrimp covary with the Pacific Decadal Oscillation (PDO) and North Pacific Gyre Oscillation (NPGO), both of which reversed signs in 1999. A time series model forced by the atmospheric driver of NPGO accounts for two-thirds of the variability in the first principal component of species abundances, and generalized linear models forced by PDO and NPGO account for most of the annual variability of individual species. We infer that synchronous shifts in climate patterns and community variability in San Francisco Bay are related to changes in oceanic wind forcing that modify coastal currents, upwelling intensity, surface temperature, and their influence on recruitment of marine species that utilize estuaries as nursery habitat. Ecological forecasts of estuarine responses to climate change must therefore consider how altered patterns of atmospheric forcing across ocean basins influence coastal oceanography as well as watershed hydrology.
Temporal variability in shell mound formation at Albatross Bay, northern Australia
Petchey, Fiona; Allely, Kasey; Shiner, Justin I.; Bailey, Geoffrey
2017-01-01
We report the results of 212 radiocarbon determinations from the archaeological excavation of 70 shell mound deposits in the Wathayn region of Albatross Bay, Australia. This is an intensive study of a closely co-located group of mounds within a geographically restricted area in a wider region where many more shell mounds have been reported. Valves from the bivalve Tegillarca granosa (Linnaeus, 1758) were dated. The dates obtained are used to calculate rates of accumulation for the shell mound deposits. These demonstrate highly variable rates of accumulation both within and between mounds. We assess these results in relation to likely mechanisms of shell deposition and show that rates of deposition are affected by time-dependent processes both during the accumulation of shell deposits and during their subsequent deformation. This complicates the interpretation of the rates at which shell mound deposits appear to have accumulated. At Wathayn, there is little temporal or spatial consistency in the rates at which mounds accumulated. Comparisons between the Wathayn results and those obtained from shell deposits elsewhere, both in the wider Albatross Bay region and worldwide, suggest the need for caution when deriving behavioural inferences from shell mound deposition rates, and the need for more comprehensive sampling of individual mounds and groups of mounds. PMID:28854234
Joukhadar, Zeina; Patterson, W.P.; Todd, T.N.; Smith, G.R.
2002-01-01
The population of Coregonus artedi in the St. Marys River, between lakes Superior and Huron, was sampled and otoliths were analyzed for oxygen isotopic composition to determine whether the fish are residents in the St. Marys River and its warm bays or migrants to and from cold Lake Huron. Otoliths were extracted, sectioned, and growth ring-specific samples of calcium carbonate were milled to obtain samples for determination of oxygen isotope ratios (δ18O values). The δ18O values of calcium carbonate (CaCO3) in accretionary structures such as otoliths allow calculation of growth temperatures of the fish, because of differential fractionation of oxygen isotopes at different temperatures. Growth temperatures of 10 St. Marys River lake herring were compared with lake and catch data as well as growth temperatures of lake herring collected from Lake Huron and other ciscoes from the Great Lakes. Results of this analysis indicate that these fish remained in the bays of the St. Marys River for their entire life history. After their second year they grew at average temperatures between 11 °C and 13 °C, consistent with temperature in the warmer bays of the St. Marys River and 6 °C higher than expected for growth of this species in Lake Huron.
Bayesian Parameter Inference and Model Selection by Population Annealing in Systems Biology
Murakami, Yohei
2014-01-01
Parameter inference and model selection are very important for mathematical modeling in systems biology. Bayesian statistics can be used to conduct both parameter inference and model selection. In particular, the approximate Bayesian computation framework is often used for parameter inference and model selection in systems biology. However, Monte Carlo methods need to be used to compute Bayesian posterior distributions. In addition, the posterior distributions of parameters are sometimes almost uniform or very similar to their prior distributions. In such cases, it is difficult to choose one specific parameter value with high credibility as the representative value of the distribution. To overcome these problems, we introduced one of the population Monte Carlo algorithms, population annealing. Although population annealing is usually used in statistical mechanics, we showed that it can be used to compute Bayesian posterior distributions in the approximate Bayesian computation framework. To deal with the non-identifiability of representative parameter values, we proposed running the simulations with a parameter ensemble sampled from the posterior distribution, named the "posterior parameter ensemble". We showed that population annealing is an efficient and convenient algorithm for generating a posterior parameter ensemble. We also showed that simulations with the posterior parameter ensemble can not only reproduce the data used for parameter inference but also capture and predict data that were not used for parameter inference. Lastly, we introduced the marginal likelihood in the approximate Bayesian computation framework for Bayesian model selection. We showed that population annealing enables us to compute the marginal likelihood in the approximate Bayesian computation framework and conduct model selection based on the Bayes factor. PMID:25089832
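An annealing-flavoured ABC sketch: a parameter population is filtered through a shrinking tolerance schedule, yielding a "posterior parameter ensemble" in the abstract's sense. This is a simplified illustration, not the paper's population annealing algorithm; the toy model, schedule, and perturbation kernel are all assumptions.

```python
import random

def simulate(theta, rng):
    """Toy forward model: an observation distributed as N(theta, 1)."""
    return theta + rng.gauss(0.0, 1.0)

def abc_population_annealing(y_obs=2.0, n=500,
                             eps_schedule=(3.0, 1.5, 0.8, 0.4), seed=3):
    """ABC with a decreasing tolerance schedule (annealing-flavoured sketch).

    At each stage, keep parameters whose simulated data fall within eps of
    the observation, then resample to constant population size with a small
    Gaussian perturbation kernel (support constraints ignored for brevity)."""
    rng = random.Random(seed)
    pop = [rng.uniform(-5.0, 5.0) for _ in range(n)]     # draw from the prior
    for eps in eps_schedule:
        survivors = [th for th in pop if abs(simulate(th, rng) - y_obs) < eps]
        pop = [rng.choice(survivors) + rng.gauss(0.0, 0.3) for _ in range(n)]
    return pop                                           # posterior parameter ensemble

ensemble = abc_population_annealing()
post_mean = sum(ensemble) / len(ensemble)
```

Downstream simulations would then be run with every member of `ensemble` rather than a single representative value, which is the paper's remedy for flat or non-identified posteriors.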
Automated Classification of Pathology Reports.
Oleynik, Michel; Finger, Marcelo; Patrão, Diogo F C
2015-01-01
This work develops an automated classifier of pathology reports that infers the topography and morphology classes of a tumor using codes from the International Classification of Diseases for Oncology (ICD-O). Data from 94,980 patients of the A.C. Camargo Cancer Center were used for training and validation of Naive Bayes classifiers, evaluated by the F1-score. F1-scores greater than 74% in the topographic group and 61% in the morphologic group are reported. Our work provides a successful baseline for future research on the classification of medical documents written in Portuguese and in other domains.
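A minimal multinomial Naive Bayes classifier with add-one smoothing, the model family the abstract names. The tokens and ICD-O-like labels below are invented for illustration, not drawn from the paper's corpus.

```python
import math
from collections import Counter, defaultdict

# Toy training data standing in for pathology-report tokens (illustrative).
train = [
    ("adenocarcinoma gastric antrum", "C16"),
    ("gastric mucosa adenocarcinoma", "C16"),
    ("melanoma skin trunk", "C44"),
    ("skin lesion melanoma", "C44"),
]

class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
vocab = set()
for text, label in train:
    for w in text.split():
        word_counts[label][w] += 1
        vocab.add(w)

def predict(text):
    """Multinomial Naive Bayes with add-one (Laplace) smoothing."""
    best, best_lp = None, -math.inf
    for label, n_docs in class_counts.items():
        lp = math.log(n_docs / len(train))          # log prior
        total = sum(word_counts[label].values())
        for w in text.split():
            lp += math.log((word_counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

topography = predict("antrum adenocarcinoma")       # classifies as "C16"
```

A real system of this kind would tokenize full Portuguese report text and evaluate with the F1-score over many ICD-O codes; the mechanics of the classifier are the same.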
Bayesian inference based on dual generalized order statistics from the exponentiated Weibull model
NASA Astrophysics Data System (ADS)
Al Sobhi, Mashail M.
2015-02-01
Bayesian estimates of the two parameters and the reliability function of the exponentiated Weibull model are obtained based on dual generalized order statistics (DGOS). Bayesian prediction bounds for future DGOS from the exponentiated Weibull model are also obtained. Symmetric and asymmetric loss functions are considered for the Bayesian computations. Markov chain Monte Carlo (MCMC) methods are used for computing the Bayes estimates and prediction bounds. The results have been specialized to the lower record values. Comparisons are made between Bayesian and maximum likelihood estimators via Monte Carlo simulation.
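A sketch of the kind of MCMC computation the abstract describes, for the exponentiated Weibull with scale fixed at 1: random-walk Metropolis over the two shape parameters under a flat positivity-constrained prior, with the posterior mean as the Bayes estimate under squared-error (symmetric) loss. An asymmetric loss such as LINEX would simply weight the same posterior samples differently. All settings are illustrative, and plain random samples stand in for the paper's dual generalized order statistics.

```python
import math, random

def ew_logpdf(x, a, b):
    """Exponentiated Weibull log-density, scale 1:
    f(x) = a*b*x**(b-1)*exp(-x**b)*(1-exp(-x**b))**(a-1)."""
    if x <= 0:
        return -math.inf
    z = x ** b
    return (math.log(a) + math.log(b) + (b - 1) * math.log(x)
            - z + (a - 1) * math.log(1.0 - math.exp(-z)))

rng = random.Random(4)
a_true, b_true = 2.0, 1.5
# Inverse-CDF sampling from F(x) = (1 - exp(-x**b))**a.
data = [(-math.log(1.0 - rng.random() ** (1.0 / a_true))) ** (1.0 / b_true)
        for _ in range(200)]

def log_post(a, b):
    """Log posterior under a flat prior restricted to a > 0, b > 0."""
    if a <= 0 or b <= 0:
        return -math.inf
    return sum(ew_logpdf(x, a, b) for x in data)

# Random-walk Metropolis over (a, b); discard burn-in, keep the rest.
a, b = 1.0, 1.0
lp = log_post(a, b)
samples = []
for i in range(4000):
    pa, pb = a + rng.gauss(0, 0.15), b + rng.gauss(0, 0.15)
    lp_prop = log_post(pa, pb)
    if math.log(rng.random()) < lp_prop - lp:
        a, b, lp = pa, pb, lp_prop
    if i >= 1000:
        samples.append((a, b))

a_hat = sum(s[0] for s in samples) / len(samples)   # Bayes estimate of a
b_hat = sum(s[1] for s in samples) / len(samples)   # Bayes estimate of b
```

The same chain of posterior samples also yields the reliability function estimate, by evaluating 1 - F(x) at each sampled (a, b) and averaging.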
Long time-series of turbid coastal water using AVHRR: An example from Florida Bay, USA
Stumpf, R.P.; Frayer, M.L.
1997-01-01
The AVHRR can provide information on the reflectance of turbid case II water, permitting examination of large estuaries and plumes from major rivers. The AVHRR has been onboard several NOAA satellites, with afternoon overpasses since 1981, offering a long time-series for examining changes in coastal water. We are using AVHRR data starting in December 1989 to examine water clarity in Florida Bay, which has undergone a decline since the late 1980s. The processing involves obtaining a nominal reflectance for red light with standard corrections, including those for Rayleigh and aerosol path radiances. Established relationships between reflectance and the water properties being measured in the Bay provide estimates of diffuse attenuation and light limitation for phytoplankton and seagrass productivity studies. Processing also includes monthly averages of reflectance and attenuation. The AVHRR data set describes spatial and temporal patterns, including resuspension of bottom sediments in the winter, and changes in water clarity. The AVHRR also indicates that Florida Bay has much higher reflectivity relative to attenuation than other southeastern US estuaries. © 2005 SPIE - The International Society for Optical Engineering.
1998-07-06
James W. Tibble (pointing at engine), an Engine Systems/Ground Support Equipment team manager for Rocketdyne, discusses the operation of a Space Shuttle Main Engine with Robert B. Sieck, director of Shuttle Processing; U.S. Congressman Dave Weldon; and KSC Center Director Roy D. Bridges Jr. Following the ribbon cutting ceremony for KSC's new 34,600-square-foot Space Shuttle Main Engine Processing Facility (SSMEPF), KSC employees and media explored the facility. A major addition to the existing Orbiter Processing Facility Bay 3, the SSMEPF replaces the Shuttle Main Engine Shop located in the Vehicle Assembly Building (VAB). The decision to move the shop out of the VAB was prompted by safety considerations and recent engine processing improvements. The first three main engines to be processed in the new facility will fly on Shuttle Endeavour's STS-88 mission in December 1998
The SSMEPF opens with a ribbon-cutting ceremony
NASA Technical Reports Server (NTRS)
1998-01-01
Wisdom of crowds for robust gene network inference
Marbach, Daniel; Costello, James C.; Küffner, Robert; Vega, Nicci; Prill, Robert J.; Camacho, Diogo M.; Allison, Kyle R.; Kellis, Manolis; Collins, James J.; Stolovitzky, Gustavo
2012-01-01
Reconstructing gene regulatory networks from high-throughput data is a long-standing problem. Through the DREAM project (Dialogue on Reverse Engineering Assessment and Methods), we performed a comprehensive blind assessment of over thirty network inference methods on Escherichia coli, Staphylococcus aureus, Saccharomyces cerevisiae, and in silico microarray data. We characterize the performance, data requirements, and inherent biases of different inference approaches, offering guidelines for both algorithm application and development. We observe that no single inference method performs optimally across all datasets. In contrast, integration of predictions from multiple inference methods shows robust and high performance across diverse datasets. We thereby construct high-confidence networks for E. coli and S. aureus, each comprising ~1700 transcriptional interactions at an estimated precision of 50%. We experimentally tested 53 novel interactions in E. coli, of which 23 (43%) were supported. Our results establish community-based methods as a powerful and robust tool for the inference of transcriptional gene regulatory networks. PMID:22796662
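One of the simplest "wisdom of crowds" integrations is rank averaging of edge predictions across methods (a Borda-style consensus). The edges, method names, and scores below are invented for illustration; the DREAM integration is more elaborate, but this captures the core idea.

```python
# Three hypothetical inference methods each score candidate regulatory edges.
edges = ["TF1->geneA", "TF1->geneB", "TF2->geneA", "TF2->geneC"]
method_scores = {
    "m1": {"TF1->geneA": 0.9, "TF1->geneB": 0.2, "TF2->geneA": 0.7, "TF2->geneC": 0.4},
    "m2": {"TF1->geneA": 0.6, "TF1->geneB": 0.8, "TF2->geneA": 0.5, "TF2->geneC": 0.1},
    "m3": {"TF1->geneA": 0.8, "TF1->geneB": 0.3, "TF2->geneA": 0.9, "TF2->geneC": 0.2},
}

def rank_of(scores):
    """Map each edge to its rank within one method (1 = highest score)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {edge: r + 1 for r, edge in enumerate(ordered)}

# Community prediction: average each edge's rank across all methods.
avg_rank = {e: sum(rank_of(s)[e] for s in method_scores.values()) / len(method_scores)
            for e in edges}
consensus = sorted(edges, key=avg_rank.get)   # best (lowest average rank) first
```

Averaging ranks rather than raw scores sidesteps the fact that different methods emit scores on incomparable scales, which is one reason the community prediction is robust to any single method's biases.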
NASA Astrophysics Data System (ADS)
Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.
2016-01-01
Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.
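The contrast the abstract draws between a point estimate and a posterior density can be illustrated by marginalizing a nuisance parameter by Monte Carlo. The toy forward model, prior, and all numbers below are assumptions for illustration, not the paper's LII model or its maximum-entropy priors.

```python
import random

# Toy setup: the quantity of interest (an SVF-like scale factor) depends on a
# nuisance parameter E (standing in for the soot absorption function).
rng = random.Random(6)
signal_obs = 2.0          # measured incandescence-like signal (illustrative)
sigma_meas = 0.1          # measurement noise standard deviation

def svf_given(E):
    """Simple forward-model inversion for a fixed nuisance value."""
    return signal_obs / E

# Standard fixed-parameter analysis: a single point estimate.
point_estimate = svf_given(1.0)

# Bayesian analysis: draw E from its prior, add measurement noise, and
# collect the implied posterior ensemble of the quantity of interest.
posterior = []
for _ in range(20000):
    E = rng.gauss(1.0, 0.1)              # prior on the nuisance parameter
    if E <= 0:
        continue                         # discard unphysical draws
    s = signal_obs + rng.gauss(0.0, sigma_meas)
    posterior.append(s / E)

mean_svf = sum(posterior) / len(posterior)
spread = (sum((v - mean_svf) ** 2 for v in posterior) / len(posterior)) ** 0.5
```

The point estimate and the posterior mean nearly coincide here, but only the posterior `spread` tells an engineer how much the nuisance-parameter uncertainty actually matters, which is the assessment the abstract argues regulators and diagnosticians need.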
Code of Federal Regulations, 2010 CFR
2010-07-01
... ruling to the District Engineer whose decision shall be final. A clearance by the dispatcher for a vessel..., jetties, piers, fences, buildings, trees, telephone lines, lighting structures, or any other property of...
33 CFR 334.1190 - Hood Canal and Dabob Bay, Wash.; naval non-explosive torpedo testing area.
Code of Federal Regulations, 2010 CFR
2010-07-01
... CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the area. (iii) No vessel shall anchor in the area except between the shore and the 10-fathom depth...
33 CFR 334.1190 - Hood Canal and Dabob Bay, Wash.; naval non-explosive torpedo testing area.
Code of Federal Regulations, 2011 CFR
2011-07-01
... CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the area. (iii) No vessel shall anchor in the area except between the shore and the 10-fathom depth...
Affective cognition: Exploring lay theories of emotion.
Ong, Desmond C; Zaki, Jamil; Goodman, Noah D
2015-10-01
Humans skillfully reason about others' emotions, a phenomenon we term affective cognition. Despite its importance, few formal, quantitative theories have described the mechanisms supporting this phenomenon. We propose that affective cognition involves applying domain-general reasoning processes to domain-specific content knowledge. Observers' knowledge about emotions is represented in rich and coherent lay theories, which comprise consistent relationships between situations, emotions, and behaviors. Observers utilize this knowledge in deciphering social agents' behavior and signals (e.g., facial expressions), in a manner similar to rational inference in other domains. We construct a computational model of a lay theory of emotion, drawing on tools from Bayesian statistics, and test this model across four experiments in which observers drew inferences about others' emotions in a simple gambling paradigm. This work makes two main contributions. First, the model accurately captures observers' flexible but consistent reasoning about the ways that events and others' emotional responses to those events relate to each other. Second, our work models the problem of emotional cue integration – reasoning about others' emotions from multiple emotional cues – as rational inference via Bayes' rule, and we show that this model tightly tracks human observers' empirical judgments. Our results reveal a deep structural relationship between affective cognition and other forms of inference, and suggest wide-ranging applications to basic psychological theory and psychiatry. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
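Emotional cue integration via Bayes' rule can be sketched directly, under the usual assumption that the cues are conditionally independent given the emotion. All probability tables below are invented for illustration; the paper's lay-theory model is far richer.

```python
# Prior over emotions and likelihoods of two cues (facial expression and
# gamble outcome) given each emotion; all values illustrative.
priors = {"happy": 0.5, "sad": 0.5}
p_face = {("smile", "happy"): 0.8, ("smile", "sad"): 0.2,
          ("frown", "happy"): 0.2, ("frown", "sad"): 0.8}
p_outcome = {("win", "happy"): 0.7, ("win", "sad"): 0.3,
             ("loss", "happy"): 0.3, ("loss", "sad"): 0.7}

def posterior(face, outcome):
    """Bayes' rule with conditionally independent cues:
    P(emotion | face, outcome) is proportional to
    P(emotion) * P(face | emotion) * P(outcome | emotion)."""
    unnorm = {e: priors[e] * p_face[(face, e)] * p_outcome[(outcome, e)]
              for e in priors}
    z = sum(unnorm.values())
    return {e: v / z for e, v in unnorm.items()}

post = posterior("smile", "win")   # both cues point toward "happy"
```

When the cues conflict (a smile after a loss), the posterior lands between what either cue alone would imply, which is the graded integration behavior the model is tested against.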
Engineers test STS-37 CETA electrical hand pedal cart in JSC MAIL Bldg 9A
NASA Technical Reports Server (NTRS)
1990-01-01
McDonnell Douglas engineers Noland Talley (left) and Gary Peters (center) and ILC-Dover engineer Richard Smallcombe prepare the test setup for the evaluation of the crew and equipment translation aid (CETA) electrical hand pedal cart in JSC's Mockup and Integration Laboratory (MAIL) Bldg 9A. Peters, wearing extravehicular mobility unit (EMU) boots and positioned in a portable foot restraint (PFR), is suspended above the CETA cart and track via a harness to simulate weightlessness. CETA will be tested in orbit in the payload bay of Atlantis, Orbiter Vehicle (OV) 104, during STS-37.
Bodkin, James L.; Kloecker, Kimberly A.; Esslinger, George G.; Monson, Daniel H.; DeGroot, J.D.
2001-01-01
Following translocations to the outer coast of Southeast Alaska in 1965, sea otters have been expanding their range and increasing in abundance. We began conducting surveys for sea otters in Cross Sound, Icy Strait and Glacier Bay, Alaska in 1994, following initial reports of their presence in Glacier Bay in 1993. Since 1995, the number of sea otters in Glacier Bay proper has increased from about 5 to more than 500. Between 1993 and 1997 sea otters were apparently only occasional visitors to Glacier Bay, but in 1998 long-term residence was established, as indicated by the presence of adult females and their dependent pups. Sea otter distribution is limited to the Lower Bay, south of Sandy Cove, and is not continuous within that area. Concentrations occur in the vicinity of Sita Reef and Boulder Island and between Pt. Carolus and Rush Pt. on the west side of the Bay (Figure 1). We describe the diet of sea otters in Glacier Bay and south Icy Strait through visual observations of prey during >4,000 successful forage dives. In 2,399 successful foraging dives observed in Glacier Bay proper, diet consisted of 40% clam, 21% urchins, 18% mussel, 4% crab, 5% other and 12% unidentified. Most prey recovered by sea otters are commercially, socially, or ecologically important species. Clam species are primarily Saxidomus gigantea, Protothaca staminea, and Serripes groenlandicus. Urchins are primarily Strongylocentrotus droebachiensis, while both mussels, Modiolus modiolus and Mytilus trossulus, are taken. Crabs include species of Cancer, Chionoecetes, Paralithodes, and Telmessus. Although we characterize diet at broad geographic scales, we found diet to vary between sites separated by as little as several hundred meters.
Dietary variation among and within sites can reflect differences in prey availability and individual choice. We estimated species composition, density, biomass, and sizes of intertidal clams at 59 sites in Glacier Bay, 14 sites in Idaho Inlet, 12 sites in Port Althorp and 2 sites in Dundas Bay. There is no direct evidence of otter foraging at any of our clam sampling sites except at Port Althorp, where sea otters have been present for >20 years and regularly forage intertidally. There is some indication of intertidal foraging in Idaho Inlet, based on reduced mean size of preferred clam species. Sea otters have been present in Idaho Inlet for at least 12 years. We sampled 48 systematically selected sites to allow inference throughout Glacier Bay intertidal areas and 12 preferred-habitat intertidal sites to estimate maximum clam densities in the Bay. We also sampled 14 and 12 random sites in Idaho Inlet and Port Althorp, respectively, to provide contrast between sites with and without sea otters. Densities and biomass of intertidal clams were greater in the Lower Bay than in either the East or West Arms. Mean densities (#/0.25 m2) of all species of clams > 10.0 mm total length were 96.5 at preferred sites, 32.8 in the Lower Bay, 12.2 in the East Arm, 6.6 in the West Arm, 11.32 at Port Althorp and 27.1 at Idaho Inlet. Clam densities were lower in the Upper Arms of Glacier Bay, compared to the Lower Bay, and were similar to densities at Port Althorp. In the Lower Bay, clam densities were nearly twice as high at preferred clam sites compared to those systematically sampled. Species of Macoma were the numerically dominant intertidal clams at most sites in Glacier Bay, while Protothaca staminea was dominant at Idaho Inlet and Port Althorp. Biomass (g/0.25 m2) was higher in the Lower Bay (23.5) than in either Arm (2.1 and 0.91) and higher at preferred sites (73.4) than at systematically selected sites in Glacier Bay. Biomass estimates were 5.2 at Port Althorp and 9.7 at Idaho Inlet.
Biomass estimates were dominated by species of Saxidomus, Protothaca and Mya in Glacier Bay and by Protothaca and Saxidomus at Idaho Inlet and Port Althorp. We suspect differences in density and biomass relate to habitat differences between areas within Glacier Bay.
An Annotated Bibliography of CERC Coastal Ecology Research.
1980-06-01
the Atlantic and gulf coasts of the United States. The experimentation has been directed toward the use of sand fences and dune grasses to catch and...Pismo Clams," MP 8-75, U.S. Army, Corps of Engineers, Coastal Engineering Research Center, Fort Belvoir, Va., Sept. 1975, NTIS AD No. A016 948. Three...aspects of the ecology of Pismo clams were investigated in Monterey Bay, California: distribution, reproduction cycle, and age and growth. Pismo clam
Welding As Science: Applying Basic Engineering Principles to the Discipline
NASA Technical Reports Server (NTRS)
Nunes, A. C., Jr.
2010-01-01
This Technical Memorandum provides sample problems illustrating ways in which basic engineering science has been applied to the discipline of welding. Perhaps inferences may be drawn regarding optimal approaches to particular welding problems, as well as for the optimal education for welding engineers. Perhaps also some readers may be attracted to the science(s) of welding and may make worthwhile contributions to the discipline.
ERIC Educational Resources Information Center
Rajasenan, D.
2014-01-01
The major problem of the engineering entrance examination is the exclusion of certain sections of society along social, economic, regional and gender dimensions. This has seldom been analyzed with a view toward policy correction. To lessen this problem, a minor policy shift was made in 2011, giving a 50-50 weighting to academic marks and…
The Heuristic Value of p in Inductive Statistical Inference
Krueger, Joachim I.; Heck, Patrick R.
2017-01-01
Many statistical methods yield the probability of the observed data – or data more extreme – under the assumption that a particular hypothesis is true. This probability is commonly known as ‘the’ p-value. (Null Hypothesis) Significance Testing ([NH]ST) is the most prominent of these methods. The p-value has been subjected to much speculation, analysis, and criticism. We explore how well the p-value predicts what researchers presumably seek: the probability of the hypothesis being true given the evidence, and the probability of reproducing significant results. We also explore the effect of sample size on inferential accuracy, bias, and error. In a series of simulation experiments, we find that the p-value performs quite well as a heuristic cue in inductive inference, although there are identifiable limits to its usefulness. We conclude that despite its general usefulness, the p-value cannot bear the full burden of inductive inference; it is but one of several heuristic cues available to the data analyst. Depending on the inferential challenge at hand, investigators may supplement their reports with effect size estimates, Bayes factors, or other suitable statistics, to communicate what they think the data say. PMID:28649206
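The kind of simulation experiment the abstract describes can be sketched in a few lines. This is a toy setup of my own, not the authors' code: a one-sided z-test with known unit variance, a 50/50 prior over H0 and H1, and a fixed effect size under H1 are all assumptions made for illustration.

```python
import math
import random

# Estimate P(H1 | p < .05): how often is the alternative actually true
# when a study comes out "significant"? (Assumed toy setup, see lead-in.)
def p_value(sample_mean, n):
    # One-sided z-test p-value with known sigma = 1.
    z = sample_mean * math.sqrt(n)
    return 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))

random.seed(1)
n, effect, reps = 20, 0.5, 20000
significant = h1_and_significant = 0
for _ in range(reps):
    h1 = random.random() < 0.5          # 50/50 prior over hypotheses
    mu = effect if h1 else 0.0
    sample_mean = sum(random.gauss(mu, 1) for _ in range(n)) / n
    if p_value(sample_mean, n) < 0.05:
        significant += 1
        h1_and_significant += h1
print(round(h1_and_significant / significant, 2))  # estimated P(H1 | p < .05)
```

With these assumed settings the hit rate is well above the 5% false-positive floor, which is the sense in which p can serve as a heuristic cue.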
Stan : A Probabilistic Programming Language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpenter, Bob; Gelman, Andrew; Hoffman, Matthew D.
Stan is a probabilistic programming language for specifying statistical models. A Stan program imperatively defines a log probability function over parameters conditioned on specified data and constants. As of version 2.14.0, Stan provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler, an adaptive form of Hamiltonian Monte Carlo sampling. Penalized maximum likelihood estimates are calculated using optimization methods such as the limited memory Broyden-Fletcher-Goldfarb-Shanno algorithm. Stan is also a platform for computing log densities and their gradients and Hessians, which can be used in alternative algorithms such as variational Bayes, expectation propagation, and marginal inference using approximate integration. To this end, Stan is set up so that the densities, gradients, and Hessians, along with intermediate quantities of the algorithm such as acceptance probabilities, are easily accessible. Stan can also be called from the command line using the cmdstan package, through R using the rstan package, and through Python using the pystan package. All three interfaces support sampling and optimization-based inference with diagnostics and posterior analysis. rstan and pystan also provide access to log probabilities, gradients, Hessians, parameter transforms, and specialized plotting.
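The core object the abstract describes, a log probability function over parameters conditioned on data plus its gradient for Hamiltonian Monte Carlo, can be illustrated without Stan itself. This is a hand-rolled sketch of an assumed toy model (normal likelihood, normal prior on the mean), not Stan code:

```python
# What a Stan program defines, in miniature: log p(mu | y) up to a
# constant, for the assumed model y_i ~ Normal(mu, 1), mu ~ Normal(0, 10).
def log_prob(mu, y):
    lp = -0.5 * (mu / 10.0) ** 2                      # log prior
    lp += sum(-0.5 * (yi - mu) ** 2 for yi in y)      # log likelihood
    return lp

def grad_log_prob(mu, y):
    # Analytic gradient -- the quantity HMC/NUTS evaluates at every step.
    return -mu / 10.0 ** 2 + sum(yi - mu for yi in y)

y = [1.2, 0.8, 1.5]
# Sanity-check the analytic gradient against a central finite difference.
eps = 1e-6
fd = (log_prob(1.0 + eps, y) - log_prob(1.0 - eps, y)) / (2 * eps)
print(abs(fd - grad_log_prob(1.0, y)) < 1e-6)
```

Stan automates exactly this bookkeeping (via automatic differentiation) for arbitrarily complicated models, which is why the abstract emphasizes access to densities, gradients, and Hessians.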
Seismic stability of the Duwamish River Delta, Seattle, Washington
Kayen, Robert E.; Barnhardt, Walter A.
2007-01-01
The delta front of the Duwamish River valley near Elliott Bay and Harbor Island is founded on young Holocene deposits shaped by sea-level rise, episodic volcanism, and seismicity. These river-mouth deposits are highly susceptible to seismic soil liquefaction and are potentially prone to submarine landsliding and disintegrative flow failure. A highly developed commercial-industrial corridor, extending from the City of Kent to the Elliott Bay/Harbor Island marine terminal facilities, is founded on the young Holocene deposits of the Duwamish River valley. The deposits of this Holocene delta have been shaped not only by relative sea-level rise but also by episodic volcanism and seismicity. Ground-penetrating radar (GPR), cores, in situ testing, and outcrops are being used to examine the delta stratigraphy and to infer how these deposits will respond to future volcanic eruptions and earthquakes in the region. A geotechnical investigation of these river-mouth deposits indicates high initial liquefaction susceptibility during earthquakes, and possibly the potential for unlimited-strain disintegrative flow failure of the delta front.
Ambulance snatching: how vulnerable are we?
Alves, Donald W; Bissell, Richard A
2003-08-01
Out of concern that ambulances might be targeted for hijack for terrorism purposes, we observed security-related behaviors of a cross-section of ambulance crews and their vehicles in Emergency Department ambulance bays. We sent observers to a convenience sample of trauma and suburban Emergency Department ambulance entrances in several states. We observed 151 total ambulance arrivals. Overall, the average time present was 21.5 min, 23.2% of units were left with the engine running, 26.5% were left open, 90.1% were left unattended, 84.1% were unlocked, and 16.6% had a non-crew visitor in the ambulance bay. Several issues were identified demonstrating potential "attractiveness" to individuals who may wish to disrupt Emergency Medical Services or steal an emergency vehicle. We are concerned that this is the case at the majority of ambulance bays in our country. Emergency services agencies should take steps to train their personnel to secure the ambulance.
Code of Federal Regulations, 2010 CFR
2010-07-01
... lock is available, a green light, semaphore or flag will be displayed; when not available, a red light... booms and piling must be obtained by written permit from the District Engineer. (8) The building...
105. ARA-III. Interior view of ARA-608 high-bay pit in 1983 ...
105. ARA-III. Interior view of ARA-608 high-bay pit in 1983 modified to contain high-temperature, high-pressure autoclave and furnace test area. Ineel photo no. 81-109. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID
Code of Federal Regulations, 2011 CFR
2011-07-01
... Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the exclusion of watercraft is required in the interest of safety or for accomplishment of the mission...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the exclusion of watercraft is required in the interest of safety or for accomplishment of the mission...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the exclusion of watercraft is required in the interest of safety or for accomplishment of the mission...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the exclusion of watercraft is required in the interest of safety or for accomplishment of the mission...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE DANGER ZONE AND RESTRICTED AREA... the exclusion of watercraft is required in the interest of safety or for accomplishment of the mission...
Narimani, Zahra; Beigy, Hamid; Ahmad, Ashar; Masoudi-Nejad, Ali; Fröhlich, Holger
2017-01-01
Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by their computational complexity and thus make large scale inference infeasible. This is specifically true if a Bayesian framework is applied in order to deal with the unavoidable uncertainty about the correct model. We devise a novel Bayesian network reverse engineering approach using ordinary differential equations with the ability to include non-linearity. Besides modeling arbitrary, possibly combinatorial and time dependent perturbations with unknown targets, one of our main contributions is the use of Expectation Propagation, an algorithm for approximate Bayesian inference over large scale network structures in short computation time. We further explore the possibility of integrating prior knowledge into network inference. We evaluate the proposed model on DREAM4 and DREAM8 data and find it competitive against several state-of-the-art existing network inference methods.
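To make the ODE-based reverse-engineering idea concrete, here is a deliberately simplified sketch: a linear two-gene system with an invented interaction matrix, recovered by least squares on finite differences of the time series. The paper's actual method is Bayesian, uses Expectation Propagation, and handles non-linearity and perturbations; none of that is reproduced here.

```python
# Toy reverse engineering of dx/dt = A x from time-series data.
A_true = [[-0.5, 0.3],
          [0.2, -0.4]]          # assumed ground-truth interaction matrix
dt, steps = 0.01, 200
x = [1.0, 0.5]
traj = [x[:]]
for _ in range(steps):          # Euler-simulate the "measured" series
    x = [x[i] + dt * sum(A_true[i][j] * x[j] for j in range(2))
         for i in range(2)]
    traj.append(x[:])

def estimate_row(i):
    # Fit row i of A: (x[t+1]-x[t])/dt ~= sum_j A[i][j] x[j][t],
    # solved via the 2x2 normal equations (X^T X) a = X^T y.
    XtX = [[0.0, 0.0], [0.0, 0.0]]
    Xty = [0.0, 0.0]
    for t in range(steps):
        y = (traj[t + 1][i] - traj[t][i]) / dt
        for p in range(2):
            Xty[p] += traj[t][p] * y
            for q in range(2):
                XtX[p][q] += traj[t][p] * traj[t][q]
    det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
    return [(XtX[1][1] * Xty[0] - XtX[0][1] * Xty[1]) / det,
            (XtX[0][0] * Xty[1] - XtX[1][0] * Xty[0]) / det]

A_hat = [estimate_row(0), estimate_row(1)]
print(A_hat)  # close to A_true
```

With noiseless data the fit is essentially exact; the Bayesian machinery in the paper is what makes the same idea workable under noise, uncertainty, and scale.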
A 3-Component Mixture of Rayleigh Distributions: Properties and Estimation in Bayesian Framework
Aslam, Muhammad; Tahir, Muhammad; Hussain, Zawar; Al-Zahrani, Bander
2015-01-01
To study lifetimes of certain engineering processes, a lifetime model which can accommodate the nature of such processes is desired. Mixture models of underlying lifetime distributions are intuitively more appropriate and appealing for modeling the heterogeneous nature of the process than simple models. This paper studies a 3-component mixture of Rayleigh distributions in a Bayesian perspective. The censored sampling environment is considered due to its popularity in reliability theory and survival analysis. The expressions for the Bayes estimators and their posterior risks are derived under different scenarios. In the case that little or no prior information is available, elicitation of hyperparameters is given. To examine, numerically, the performance of the Bayes estimators using non-informative and informative priors under different loss functions, we have simulated their statistical properties for different sample sizes and test termination times. In addition, to highlight the practical significance, an illustrative example based on real-life engineering data is also given. PMID:25993475
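For intuition about the model class, here is a minimal sketch of the sampling side: inverse-CDF draws from a 3-component Rayleigh mixture, followed by a moment-based check of each component's scale. The weights and scales are invented, censoring is omitted, and this frequentist check merely stands in for the paper's Bayes estimators.

```python
import math
import random

random.seed(7)
weights = [0.5, 0.3, 0.2]   # assumed mixing proportions
sigmas = [1.0, 2.0, 4.0]    # assumed component scale parameters

def draw():
    # Pick a component by its weight, then use the Rayleigh inverse CDF:
    # x = sigma * sqrt(-2 ln U).
    u = random.random()
    k = 0 if u < 0.5 else (1 if u < 0.8 else 2)
    return k, sigmas[k] * math.sqrt(-2.0 * math.log(random.random()))

samples = [draw() for _ in range(30000)]
est = []
for k in range(3):
    xs = [x for (c, x) in samples if c == k]
    # Standard scale estimator: sigma_hat^2 = sum(x^2) / (2 n).
    est.append(math.sqrt(sum(v * v for v in xs) / (2 * len(xs))))
print([round(e, 2) for e in est])  # each close to the true sigma
```

In the Bayesian treatment of the paper, the same sufficient statistic (the sum of squares) enters the posterior, with the prior and loss function determining the exact estimator.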
North American tidal power prospects
NASA Astrophysics Data System (ADS)
Wayne, W. W., Jr.
1981-07-01
Prospects for North American tidal power electrical generation are reviewed. Studies by the US Army Corps of Engineers of 90 possible generation schemes in Cobscook Bay, ME, indicated that maximum power generation rather than dependable capacity was the most economic method. Construction cost estimates for 15 MW bulb units in a single effect mode from basin to the sea are provided; five projects were considered ranging from 110-160 MW. Additional tidal power installations are examined for: Half-Moon Cove, ME (12 MW, 18 ft tide); Cook Inlet, AK, which is shown to pose severe environmental and engineering problems due to fish migration, earthquake hazards, and 300 ft deep silt deposits; and the Bay of Fundy, Canada. This last has a 17.8 MW plant under construction in a 29 ft maximum tide area. Other tidal projects of the Maritime Provinces are reviewed, and it is noted that previous economic evaluations based on an oil price of $16/barrel are in need of revision.
Protein Inference from the Integration of Tandem MS Data and Interactome Networks.
Zhong, Jiancheng; Wang, Jianxing; Ding, Xiaojun; Zhang, Zhen; Li, Min; Wu, Fang-Xiang; Pan, Yi
2017-01-01
Since proteins are digested into a mixture of peptides in the preprocessing step of tandem mass spectrometry (MS), it is difficult to determine which specific protein a shared peptide belongs to. In recent studies, besides tandem MS data and peptide identification information, some other information is exploited to infer proteins. Different from the methods which first use only tandem MS data to infer proteins and then use network information to refine them, this study proposes a protein inference method named TMSIN, which uses interactome networks directly. As two interacting proteins should co-exist, it is reasonable to assume that if one of the interacting proteins is confidently inferred in a sample, its interacting partners should have a high probability in the same sample, too. Therefore, we can use the neighborhood information of a protein in an interactome network to adjust the probability that a shared peptide belongs to the protein. In TMSIN, a multi-weighted graph is constructed by incorporating the bipartite graph with interactome network information, where the bipartite graph is built with the peptide identification information. Based on multi-weighted graphs, TMSIN adopts an iterative workflow to infer proteins. At each iterative step, the probability that a shared peptide belongs to a specific protein is calculated by using Bayes' law based on the neighbor-protein support scores of the proteins mapped by the shared peptides. We carried out experiments on yeast data and human data to evaluate the performance of TMSIN in terms of ROC, q-value, and accuracy. The experimental results show that AUC scores yielded by TMSIN are 0.742 and 0.874 in the yeast and human datasets, respectively, and TMSIN yields the maximum number of true positives when the q-value is less than or equal to 0.05. The overlap analysis shows that TMSIN is an effective complementary approach for protein inference.
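The neighbor-support reweighting step can be illustrated with a hypothetical two-protein example; the protein names, match scores, and support scores below are all invented, and this is only the Bayes'-rule kernel of the idea, not TMSIN's iterative workflow.

```python
# One shared peptide maps to two candidate proteins with equal
# search-engine evidence; interactome neighbor support breaks the tie.
neighbor_support = {"P1": 0.9, "P2": 0.2}   # hypothetical support scores
peptide_match = {"P1": 0.6, "P2": 0.6}      # equal identification evidence

def assign(shared_to):
    # Bayes' rule: P(protein | peptide) proportional to
    # P(peptide | protein) * neighbor-support prior, then normalize.
    joint = {p: peptide_match[p] * neighbor_support[p] for p in shared_to}
    z = sum(joint.values())
    return {p: joint[p] / z for p in joint}

probs = assign(["P1", "P2"])
print(probs)  # the well-supported protein P1 absorbs most of the peptide
```

Iterating this adjustment over the whole multi-weighted graph, with support scores themselves updated each round, is what the abstract's workflow amounts to.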
Genetic Network Inference: From Co-Expression Clustering to Reverse Engineering
NASA Technical Reports Server (NTRS)
Dhaeseleer, Patrik; Liang, Shoudan; Somogyi, Roland
2000-01-01
Advances in molecular biological, analytical, and computational technologies are enabling us to systematically investigate the complex molecular processes underlying biological systems. In particular, using high-throughput gene expression assays, we are able to measure the output of the gene regulatory network. We aim here to review data-mining and modeling approaches for conceptualizing and unraveling the functional relationships implicit in these datasets. Clustering of co-expression profiles allows us to infer shared regulatory inputs and functional pathways. We discuss various aspects of clustering, ranging from distance measures to clustering algorithms and multiple-cluster memberships. More advanced analysis aims to infer causal connections between genes directly, i.e., who is regulating whom and how. We discuss several approaches to the problem of reverse engineering of genetic networks, from discrete Boolean networks to continuous linear and non-linear models. We conclude that the combination of predictive modeling with systematic experimental verification will be required to gain a deeper insight into living organisms, therapeutic targeting, and bioengineering.
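The co-expression clustering step can be sketched in a few lines. The profiles and the 1 − r distance threshold below are toy choices of mine, not taken from the review:

```python
import math

# Group genes whose expression profiles are highly correlated,
# using 1 - Pearson r as the distance and a simple threshold rule.
profiles = {
    "geneA": [1.0, 2.0, 3.0, 4.0],
    "geneB": [2.1, 3.9, 6.1, 8.0],   # tracks geneA -> same cluster
    "geneC": [4.0, 3.0, 2.0, 1.0],   # anti-correlated -> separate cluster
}

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Single-link threshold clustering: join a gene to a cluster if it is
# within distance 0.1 of any member.
clusters = []
for g in profiles:
    for c in clusters:
        if any(1 - pearson(profiles[g], profiles[m]) < 0.1 for m in c):
            c.append(g)
            break
    else:
        clusters.append([g])
print(clusters)
```

Note that a correlation distance groups genes with shared regulatory inputs but says nothing about direction of regulation, which is exactly why the review moves on to causal, reverse-engineering models.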
Achete, Fernanda; Van der Wegen, Mick; Roelvink, Jan Adriaan; Jaffe, Bruce E.
2017-01-01
Suspended sediment concentration is an important estuarine health indicator. Estuarine ecosystems rely on the maintenance of habitat conditions, which are changing due to direct human impact and climate change. This study aims to evaluate the impact of climate change relative to engineering measures on estuarine fine sediment dynamics and sediment budgets. We use the highly engineered San Francisco Bay-Delta system as a case study. We apply a process-based modeling approach (Delft3D-FM) to assess the changes in hydrodynamics and sediment dynamics resulting from climate change and engineering scenarios. The scenarios consider a direct human impact (shift in water pumping location), climate change (sea level rise and suspended sediment concentration decrease), and abrupt disasters (island flooding, possibly as the results of an earthquake). Levee failure has the largest impact on the hydrodynamics of the system. Reduction in sediment input from the watershed has the greatest impact on turbidity levels, which are key to primary production and define habitat conditions for endemic species. Sea level rise leads to more sediment suspension and a net sediment export if little room for accommodation is left in the system due to continuous engineering works. Mitigation measures like levee reinforcement are effective for addressing direct human impacts, but less effective for a persistent, widespread, and increasing threat like sea level rise. Progressive adaptive mitigation measures to the changes in sediment and flow dynamics resulting from sea level rise may be a more effective strategy. Our approach shows that a validated process-based model is a useful tool to address long-term (decades to centuries) changes in sediment dynamics in highly engineered estuarine systems. In addition, our modeling approach provides a useful basis for long-term, process-based studies addressing ecosystem dynamics and health.
1989-01-01
This 1989 artist's rendering shows how a Shuttle-C would look during launch. As envisioned by Marshall Space Flight Center planners, the Shuttle-C would be an unmanned heavy-lift cargo vehicle derived from Space Shuttle elements. The vehicle would utilize the basic Shuttle propulsion units (Solid Rocket Boosters, Space Shuttle Main Engine, External Tank), but would replace the Orbiter with an unmanned Shuttle-C Cargo Element (SCE). The SCE would have a payload bay length of eighty-two feet, compared to sixty feet for the Orbiter cargo bay, and would be able to deliver 170,000-pound payloads to low Earth orbit, more than three times the Orbiter's capacity.
Database Search Engines: Paradigms, Challenges and Solutions.
Verheggen, Kenneth; Martens, Lennart; Berven, Frode S; Barsnes, Harald; Vaudel, Marc
2016-01-01
The first step in identifying proteins from mass spectrometry based shotgun proteomics data is to infer peptides from tandem mass spectra, a task generally achieved using database search engines. In this chapter, the basic principles of database search engines are introduced with a focus on open source software, and the use of database search engines is demonstrated using the freely available SearchGUI interface. This chapter also discusses how to tackle general issues related to sequence database searching and shows how to minimize their impact.
Enriching Mental Health Mobile Assessment and Intervention with Situation Awareness †
Soares Teles, Ariel; Rocha, Artur; José da Silva e Silva, Francisco; Correia Lopes, João; O’Sullivan, Donal; Van de Ven, Pepijn; Endler, Markus
2017-01-01
Current mobile devices allow the execution of sophisticated applications with the capacity for identifying the user situation, which can be helpful in treatments of mental disorders. In this paper, we present SituMan, a solution that provides situation awareness to MoodBuster, an ecological momentary assessment and intervention mobile application used to request self-assessments from patients in depression treatments. SituMan has a fuzzy inference engine to identify patient situations using context data gathered from the sensors embedded in mobile devices. Situations are specified jointly by the patient and mental health professional, and they can represent the patient’s daily routine (e.g., “studying”, “at work”, “working out”). MoodBuster requests mental status self-assessments from patients at adequate moments using situation awareness. In addition, SituMan saves and displays patient situations in a summary, delivering them for consultation by mental health professionals. A first experimental evaluation was performed to assess the user satisfaction with the approaches to define and identify situations. This experiment showed that SituMan was well evaluated in both criteria. A second experiment was performed to assess the accuracy of the fuzzy engine to infer situations. Results from the second experiment showed that the fuzzy inference engine has a good accuracy to identify situations. PMID:28075417
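A generic Mamdani-style fuzzy-inference sketch in the spirit of the description; the rule, membership functions, and context variables below are invented for illustration and are not SituMan's actual rule base:

```python
# Infer a degree of membership in the situation "at work" from two
# context readings, using triangular membership functions and min as
# the AND operator for rule firing.
def tri(x, a, b, c):
    # Triangular membership function rising from a, peaking at b,
    # falling to zero at c.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer(hour, office_distance_m):
    work_hours = tri(hour, 7, 13, 19)                 # "working hours"
    near_office = tri(office_distance_m, -1, 0, 200)  # "close to office"
    # Rule: IF working-hours AND near-office THEN situation = at_work
    return min(work_hours, near_office)

print(infer(10, 50) > infer(22, 50))  # mid-morning at the office scores higher
```

A real rule base would combine many such rules per situation and pick the situation with the highest aggregate firing strength, which is the role the fuzzy engine plays when MoodBuster decides it is an adequate moment to request a self-assessment.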
ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO ...
ETR COMPLEX. CAMERA FACING SOUTH. FROM BOTTOM OF VIEW TO TOP: MTR, MTR SERVICE BUILDING, ETR CRITICAL FACILITY, ETR CONTROL BUILDING (ATTACHED TO ETR), ETR BUILDING (HIGH-BAY), COMPRESSOR BUILDING (ATTACHED AT LEFT OF ETR), HEAT EXCHANGER BUILDING (JUST BEYOND COMPRESSOR BUILDING), COOLING TOWER PUMP HOUSE, COOLING TOWER. OTHER BUILDINGS ARE CONTRACTORS' CONSTRUCTION BUILDINGS. INL NEGATIVE NO. 56-4105. Unknown Photographer, ca. 1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
2017-05-01
ERDC/CHL TR-16-4 May 2016 Identifying Fossil Shell Resources via Geophysical Surveys: Chesapeake Bay Region, Virginia, by H.M. Wadman and J.E...Welp AD1013242 ERDC/CHL TR-16-11 Jul 2016 Evaluation of Biodiesel Fuels to Reduce Fossil Fuel Use in Corps of Engineers Floating Plant Operations, by...KRIA Ionizing Water Treatment System for Waters Contaminated with Diesel, PCBs, and Nutrients (Nitrogen Forms ), by V.F. Medina, A. Morrow, C.C
Genetic-evolution-based optimization methods for engineering design
NASA Technical Reports Server (NTRS)
Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.
1990-01-01
This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
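The three operators named in the abstract can be sketched compactly. This is a one-variable stand-in objective rather than the truss problem, and the tournament selection scheme and mutation scale are my assumptions:

```python
import random

random.seed(3)

def fitness(x):
    # Stand-in objective with its maximum at x = 0.7 on [0, 1].
    return -(x - 0.7) ** 2

def evolve(pop, gens=60):
    for _ in range(gens):
        # Reproduction: binary tournament selection.
        parents = [max(random.sample(pop, 2), key=fitness) for _ in pop]
        nxt = []
        for i in range(0, len(parents), 2):
            a, b = parents[i], parents[i + 1]
            w = random.random()                  # arithmetic crossover
            nxt += [w * a + (1 - w) * b, (1 - w) * a + w * b]
        # Mutation: small Gaussian perturbation, clipped to [0, 1].
        pop = [min(1.0, max(0.0, x + random.gauss(0, 0.02))) for x in nxt]
    return max(pop, key=fitness)

pop = [random.random() for _ in range(20)]
best = evolve(pop)
print(round(best, 2))  # near 0.7
```

For the discrete-variable problems in the paper (such as actuator placement), the same loop applies with bit-string chromosomes, one-point crossover, and bit-flip mutation in place of the continuous operators shown here.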
IPCS implications for future supersonic transport aircraft
NASA Technical Reports Server (NTRS)
Billig, L. O.; Kniat, J.; Schmidt, R. D.
1976-01-01
The Integrated Propulsion Control System (IPCS) demonstrates control of an entire supersonic propulsion module - inlet, engine afterburner, and nozzle - with an HDC 601 digital computer. The program encompasses the design, build, qualification, and flight testing of control modes, software, and hardware. The flight test vehicle is an F-111E airplane. The L.H. inlet and engine will be operated under control of a digital computer mounted in the weapons bay. A general description and the current status of the IPCS program are given.
An Investigation of the Influence of Waves on Sediment Processes in Skagit Bay
2011-09-30
source term parameterizations common to most surface wave models, including wave generation by wind, energy dissipation from whitecapping, and...I. Total energy and peak frequency. Coastal Engineering (29), 47-78. Zijlema, M. Computation of wind-wave spectra in coastal waters with SWAN on unstructured grids Coastal Engineering, 2010, 57, 267-277 ...supply and wind on tidal flat sediment transport. It will be used to evaluate the capabilities of state-of-the-art open source sediment models and to
Carbonate system biogeochemistry in a subterranean estuary - Waquoit Bay, USA
NASA Astrophysics Data System (ADS)
Liu, Qian; Charette, Matthew A.; Breier, Crystaline F.; Henderson, Paul B.; McCorkle, Daniel C.; Martin, William; Dai, Minhan
2017-04-01
Quantifying carbon fluxes associated with submarine groundwater discharge (SGD) remains challenging due to the complex biogeochemistry of the carbonate system in the subterranean estuary (STE). Here we conducted time series measurements of total alkalinity (TAlk) and dissolved inorganic carbon (DIC) in a well-studied coastal aquifer (Waquoit Bay, Massachusetts, USA). Groundwater samples were collected monthly from May 2009 to June 2010 across the freshwater-saltwater mixing zone of the Waquoit Bay (WB) STE. The concentrations of both TAlk and DIC in zero-salinity groundwater were variable, but were lower than those in the bay water (S ∼ 28). DIC underwent slightly non-conservative mixing between low and intermediate salinities while there was an apparent additional DIC source at high salinity (>20) in all seasons. TAlk concentrations exhibited even stronger variations, with evidence of both production and consumption in high salinity zones, and consistent TAlk consumption at intermediate salinity in summer and fall (June-December, 2009). The increases in DIC and TAlk at high salinity were attributed to aerobic respiration and denitrification in WB sediments during bay water recharge of the STE. We infer that the loss of TAlk at intermediate salinity reflects H+ production as reduced compounds (e.g. Fe2+) are oxidized within the STE. In terms of impacts on surface water inorganic carbon budgets, the SGD-derived DIC flux was mainly controlled by seasonal changes in SGD while a combination of TAlk concentration variability and SGD drove the TAlk flux. SGD-derived DIC, aqueous CO2, and H+ fluxes to the bay were ∼40-50% higher in summer vs. in winter, a result of enhanced marine groundwater flux and significant TAlk removal (proton addition) during periods of high seawater intrusion. Furthermore, the SGD-derived DIC flux was consistently greater than TAlk flux regardless of season, indicating that SGD serves to reduce the CO2 buffering capacity of surface water. 
Our results highlight the importance of seasonality and subsurface biogeochemical processes on the subterranean estuary carbonate system and the resulting impact on SGD-derived TAlk, DIC, aqueous CO2, and H+ fluxes to the coastal ocean.
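The conservative-mixing baseline that underlies statements like "non-conservative mixing" in the abstract is a simple two-endmember calculation. The endmember values below are illustrative round numbers, not the paper's data:

```python
# Conservative-mixing baseline for TAlk at a given salinity, and the
# anomaly that signals production or consumption in the subterranean
# estuary (positive anomaly = TAlk source, negative = TAlk sink).
fresh = {"S": 0.0, "TAlk": 800.0}     # assumed fresh endmember (umol/kg)
marine = {"S": 28.0, "TAlk": 2200.0}  # assumed bay-water endmember

def conservative_talk(salinity):
    # Linear mixing between the two endmembers as a function of salinity.
    f = (salinity - fresh["S"]) / (marine["S"] - fresh["S"])
    return fresh["TAlk"] + f * (marine["TAlk"] - fresh["TAlk"])

observed = {"S": 14.0, "TAlk": 1300.0}
anomaly = observed["TAlk"] - conservative_talk(observed["S"])
print(anomaly)  # -200.0: TAlk consumption relative to conservative mixing
```

This is the standard estuarine-geochemistry diagnostic: samples falling on the mixing line behave conservatively, while departures such as the mid-salinity TAlk loss reported above point to in-situ reactions like the proton production from Fe2+ oxidation the authors infer.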
Dartnell, Peter; Barnard, Patrick L.; Chin, John L.; Hanes, Daniel; Kvitek, Rikk G.; Iampietro, Pat J.; Gardner, James V.
2006-01-01
San Francisco Bay in Northern California is one of the largest and most altered estuaries within the United States. The sea floor within the bay as well as at its entrance is constantly changing due to strong tidal currents, aggregate mining, dredge disposal, and the creation of new land using artificial fill. Understanding this dynamic sea floor is critical for addressing local environmental issues, which include defining pollution transport pathways, deciphering tectonics, and identifying benthic habitats. Mapping commercial interests such as safe ship navigation and dredge disposal is also significantly aided by such understanding. Over the past decade, the U.S. Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), and California State University, Monterey Bay (CSUMB) in cooperation with the U.S. Army Corps of Engineers (USACOE) and the Center for Integrative Coastal Observation, Research and Education (CICORE) have partnered to map central San Francisco Bay and its entrance under the Golden Gate Bridge using multibeam echosounders. These sonar systems can continuously map to produce 100 percent coverage of the sea floor at meter-scale resolution and thus produce an unprecedented view of the floor of the bay. This poster shows views of the sea floor in west-central San Francisco Bay around Alcatraz and Angel Islands, underneath the Golden Gate Bridge, and through its entrance from the Pacific Ocean. The sea floor is portrayed as a shaded relief surface generated from the multibeam data color-coded for depth from light blues for the shallowest values to purples for the deepest. The land regions are portrayed by USGS digital orthophotographs (DOQs) overlaid on USGS digital elevation models (DEMs). The water depths have a 4x vertical exaggeration while the land areas have a 2x vertical exaggeration.
Distribution of Epiphytic Diatoms in a Sub-Tropical Estuary
NASA Astrophysics Data System (ADS)
Frankovich, T. A.; Gaiser, E. E.; Wachnicka, A.; Zieman, J. C.
2005-05-01
Within estuaries, seagrasses may represent an order of magnitude greater surface area relative to sediments for the colonization and growth of diatoms. Fossil diatom distributions have proven useful in inferring paleoenvironmental conditions. The strength of these inferences is dependent upon defining the environmental relationships of contemporary diatom compositions. The present investigation characterized the modern epiphytic diatom flora on the seagrass Thalassia testudinum at seven sites in the sub-tropical Florida Bay estuary and at one Atlantic Ocean site east of the upper Florida Keys. These sites were sampled six times between March 2000 and April 2001. Diatom species composition was related to water quality parameters using multivariate statistics. In total, 338 diatom species were identified. The seven most abundant species from pooled samples were Cocconeis placentula, Brachysira aponina, Nitzschia liebetruthii, Hyalosynedra laevigata, Amphora cf. tenerrima, Mastogloia crucicula, and M. pusilla. These seven species collectively accounted for 51.7 percent of all valves counted and occurred in at least 85 percent of all samples. Analysis of similarity and NMDS ordination of species relative abundances revealed four distinct diatom communities across the study region. The spatial variability of these communities was correlated with salinity and water-column nutrient availability. Summertime communities were significantly different from winter-spring communities, but these communities showed a gradual temporal progression with much overlap. The temporal variability was correlated with temperature. Indicator species analysis identified many species significantly influencing the four spatial groups. The Atlantic marine site was characterized by many different Mastogloia species and some epipsammic (sand-grain associated) diatoms (i.e., Cymatosira lorenziana, Dimerogramma dubium, and Neofragilaria nicobarica).
Mastogloia pusilla, Rhopalodia pacifica, and Cocconeis woodii were strong indicators of the Gulf of Mexico marine site. Reimerothrix floridensis was particularly abundant in the western interior of Florida Bay (i.e., sites 2, 3, and 4) during summer months. The eastern interior of Florida Bay was characterized by high relative abundances of Brachysira aponina and Nitzschia liebetruthii. The optima and tolerance of these indicator species relative to individual water quality parameters were also determined.
Reguero, Borja G; Beck, Michael W; Agostini, Vera N; Kramer, Philip; Hancock, Boze
2018-03-15
Coastal communities in tropical environments are at increasing risk from both environmental degradation and climate change and require urgent local adaptation action. Evidence shows that coral reefs play a critical role in wave attenuation, but relatively little direct connection has been drawn between these effects and impacts on shorelines. Reefs are rarely assessed for their coastal protection service and thus not managed for their infrastructure benefits, while widespread damage and degradation continues. This paper presents a systematic approach to assess the protective role of coral reefs and to examine solutions based on the reef's influence on wave propagation patterns. Portions of the shoreline of Grenville Bay, Grenada, have seen acute shoreline erosion and coastal flooding. This paper (i) analyzes historical changes in the shoreline and the local marine environment, (ii) assesses the role of coral reefs in shoreline positioning through a shoreline equilibrium model first applied to coral reef environments, and (iii) designs and begins implementation of a reef-based solution to reduce erosion and flooding. Coastline changes in the bay over the past 6 decades are analyzed from bathymetry and benthic surveys, historical imagery, historical wave and sea level data and modeling of wave dynamics. The analysis shows that, at present, the healthy and well-developed coral reef system in the southern bay keeps the shoreline in equilibrium and stable, whereas reef degradation in the northern bay is linked with severe coastal erosion. A comparison of wave energy modeling for past bathymetry indicates that degradation of the coral reefs better explains erosion than changes in climate and historical sea level rise. Using this knowledge on how reefs affect the hydrodynamics, a reef restoration solution is designed and studied to ameliorate the coastal erosion and flooding.
The characteristic design is modular, so that units can meet specific engineering, ecological and implementation criteria. Four pilot units were implemented in 2015 and are currently being field-tested. This paper presents one of the few existing examples available to date of a reef restoration project designed and engineered to deliver risk reduction benefits. The case study shows how engineering and ecology can work together in community-based adaptation. Our findings are particularly important for Small Island States on the front lines of climate change, who have the most to gain from protecting and managing coral reefs as coastal infrastructure. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. 
Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
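The structure of a simple rule base of the kind described above can be sketched in a few lines. The sensor names, thresholds, and rules below are hypothetical illustrations, not taken from any engine program; they show how individual and combined measurements map to health flags:

```python
# Minimal sketch of a rule-based health check: each rule maps sensor
# readings to a named health flag. All names and thresholds are invented.

def evaluate_rules(readings, rules):
    """Return the names of all rules whose condition fires."""
    return [name for name, condition in rules if condition(readings)]

RULES = [
    # Simple threshold rules on individual measurements ...
    ("chamber_overpressure", lambda r: r["chamber_pressure_mpa"] > 22.0),
    ("turbine_overtemp",     lambda r: r["turbine_temp_k"] > 1050.0),
    # ... and a combined rule, where causality needs two signals at once.
    ("pump_cavitation",      lambda r: r["pump_inlet_pressure_mpa"] < 0.3
                                       and r["pump_vibration_g"] > 5.0),
]

readings = {"chamber_pressure_mpa": 20.1, "turbine_temp_k": 1093.0,
            "pump_inlet_pressure_mpa": 0.25, "pump_vibration_g": 6.2}
fired = evaluate_rules(readings, RULES)
```

Even this toy version hints at the scaling problem: refined causality assignment multiplies the number of combined rules, which is exactly the maintenance burden the abstract describes.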
NASA Astrophysics Data System (ADS)
Lombardi, Ilaria; Console, Luca
In the paper we show how rule-based inference can be made more flexible by exploiting semantic information associated with the concepts involved in the rules. We introduce flexible forms of common sense reasoning in which whenever no rule applies to a given situation, the inference engine can fire rules that apply to more general or to similar situations. This can be obtained by defining new forms of match between rules and the facts in the working memory and new forms of conflict resolution. We claim that in this way we can overcome some of the brittleness problems that are common in rule-based systems.
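A minimal sketch of the generalization idea: when no rule matches a fact's specific concept, the engine climbs an is-a hierarchy and fires a rule attached to a more general concept. The concepts and rules below are invented for illustration; the paper's actual matching and conflict-resolution machinery is richer:

```python
# Toy fallback matcher: exact concept match first, then generalize
# along an is-a hierarchy. Domain content is purely illustrative.

IS_A = {"espresso": "coffee", "coffee": "hot_drink", "tea": "hot_drink"}

RULES = {  # concept -> action recommended by the rule
    "coffee": "serve with a glass of water",
    "hot_drink": "serve in a pre-warmed cup",
}

def fire(concept, rules, hierarchy):
    """Exact match first; otherwise climb to more general concepts."""
    while concept is not None:
        if concept in rules:
            return rules[concept]
        concept = hierarchy.get(concept)  # move to the parent concept
    return None  # no applicable rule, even after generalization

action = fire("espresso", RULES, IS_A)  # no espresso rule -> coffee rule fires
```

The brittleness the authors target is visible here: a strict matcher would fail on "espresso" outright, whereas the generalized match still produces a sensible inference.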
NASA Astrophysics Data System (ADS)
Sadhuram, Y.; Maneesha, K.
2016-10-01
In this study, an attempt has been made to examine the relationship between summer monsoon rainfall (June-September) and the total number of depressions, cyclones and severe cyclones (TNDC) over the Bay of Bengal during the post-monsoon (October-December) season. The seasonal rainfall of the subdivisions located in south India (referred to as the rainfall index, RI) is positively and significantly correlated (r=0.59; significant at the >99% level) with the TNDC during the period 1984-2013. By using first differences (current season minus previous season), the correlations are enhanced, and a remarkably high correlation of 0.87 is observed between TNDC and RI for the recent period 1993-2013. The average seasonal genesis potential parameter (GPP) showed a very high correlation of 0.84 with the TNDC. A very high correlation of 0.83 is observed between GPP and RI for the period 1993-2013. The relative vorticity and mid-tropospheric relative humidity are found to be the dominant terms in GPP. GPP was 3.5 times higher in above-normal RI seasons (mean TNDC of 4) than in below-normal seasons (mean TNDC of 2). It is inferred that RI plays a key role in TNDC by modulating the environmental conditions (low-level vorticity and relative humidity) over the Bay of Bengal during the post-monsoon season, as seen from the very high correlation of 0.87 (which explains 76% of the variability in TNDC). For the first time, we show that RI is a precursor for the TNDC over the Bay of Bengal during the post-monsoon season. Strong westerlies after the SW monsoon season transport moisture over the subdivisions towards the Bay of Bengal due to cyclonic circulation. This circulation favours upward motion and hence transports moisture vertically to the mid-troposphere, which causes convective instability and in turn favours a greater number of TNDC in above-normal RI years.
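The first-difference step (correlating year-to-year changes rather than raw seasonal values) can be illustrated with synthetic series. A shared year-to-year signal stands in for the common forcing; the numbers are not the study's data:

```python
import numpy as np

# Sketch of the first-difference correlation: difference each series
# ("current season minus previous season"), then take Pearson's r.
# The two series below are synthetic stand-ins for RI and TNDC.

def first_difference_correlation(x, y):
    dx, dy = np.diff(x), np.diff(y)        # current minus previous season
    return np.corrcoef(dx, dy)[0, 1]       # Pearson r of the differences

rng = np.random.default_rng(0)
common = rng.normal(size=30)               # shared year-to-year signal
ri = np.cumsum(common + 0.3 * rng.normal(size=30))    # "rainfall index"
tndc = np.cumsum(common + 0.3 * rng.normal(size=30))  # "storm counts"
r = first_difference_correlation(ri, tndc)
```

Differencing removes the slowly accumulating trend in each series, so the correlation reflects the shared interannual signal, which is the motivation for the enhancement the authors observe.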
NASA Astrophysics Data System (ADS)
Colizza, Ester; Finocchiaro, Furio; Giglio, Federico; Kuhn, Gerhard; Langone, Leonardo; Presti, Massimo
2010-05-01
The study of LGM and Holocene marine sediments is an important goal in Antarctic research and needs high-resolution sequences to reconstruct paleoclimatic events in detail. The literature reports a large number of data coming from inner-shelf bays and fjords, especially around the Antarctic Peninsula, but also from the western Ross Sea. In this note we discuss compositional data from a gravity core (BAY05-45c; 74° 09.7' S, 165° 57.7' E; water depth: 1058 m; core length: 445.5 cm) collected in 2005 during the Italian PNRA cruise in the inner part of Wood Bay, in front of the Aviator Ice Tongue. The Wood Bay sea floor morphology is characterised by a narrow basin, deeper than 1,000 m, oriented WNW-ESE, and transversally connected, by an 800-m-deep sill, to the Drygalski Basin, stretching NE-SW. The core sediment is composed of laminated biosiliceous mud, black in colour and with a strong hydrogen sulphide odour. Within a few days of core sampling, the sediment became oxidized: laminae colours range from dark (dark olive grey to black) to light (olive grey to olive). Some lighter laminae have a cotton-like texture. The data set includes X-ray images, magnetic susceptibility, AMS 14C dating, organic carbon, biogenic silica, and XRF scans of major and minor elements. Discussion of the data will point out inferences about sedimentary processes, paleoproductivity and oceanographic conditions during the Holocene. The most apparent feature is the occurrence, down-core, of at least two intervals of increased productivity, characterised by higher organic carbon and biogenic silica. Within such intervals, a few cm-thick levels show peaks of biogenic silica, as well as of barium, which correspond to relative lows in organic carbon content. Organic carbon content is higher in darker laminae, whereas lighter and fluffy laminae display an increased percentage of biogenic silica. Such levels probably mark a rapid but not persistent change in phytoplankton assemblage composition.
NASA Astrophysics Data System (ADS)
Verzichelli, Gianluca
2016-08-01
An Availability Stochastic Model for the E-ELT has been developed in GeNIe, a Graphical User Interface (GUI) for the Structural Modeling, Inference, and Learning Engine (SMILE), originally distributed by the Decision Systems Laboratory of the University of Pittsburgh and now a product of BayesFusion, LLC. The E-ELT will be the largest optical/near-infrared telescope in the world. Its design comprises an Alt-Azimuth mount reflecting telescope with a 39-metre-diameter segmented primary mirror, a 4-metre-diameter secondary mirror, a 3.75-metre-diameter tertiary mirror, adaptive optics and multiple instruments. This paper highlights how a model has been developed for an early assessment of the Telescope Availability. It also describes the modular structure and the underlying assumptions that have been adopted for developing the model and demonstrates the integration of FMEA, Influence Diagram and Bayesian Network elements. These have been considered for a better characterization of the model inputs and outputs and for taking into account Degraded-based Reliability (DBR). Lastly, it provides an overview of how the information and knowledge captured in the model may be used for an early definition of the Failure Detection, Isolation and Recovery (FDIR) Control Strategy and the Telescope Minimum Master Equipment List (T-MMEL).
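The core availability calculation behind such a model can be sketched by exhaustive enumeration over discrete subsystem states, which is essentially what inference in a small Bayesian network does. The subsystem names and probabilities below are invented, and the real E-ELT model is far richer (FMEA inputs, influence diagrams, DBR, FDIR logic):

```python
import itertools

# Toy availability model: each subsystem is OK, degraded, or failed;
# the telescope counts as "available" when nothing has failed (degraded
# states reduce performance but not availability). All numbers invented.

SUBSYSTEMS = {                      # (P(ok), P(degraded), P(failed))
    "m1_segments": (0.97, 0.02, 0.01),
    "adaptive_optics": (0.95, 0.03, 0.02),
    "instrument": (0.96, 0.02, 0.02),
}

def enumerate_availability(subsystems):
    """Exact P(available) and P(fully OK) by enumerating all states."""
    p_available = p_full = 0.0
    names = list(subsystems)
    for combo in itertools.product(range(3), repeat=len(names)):
        p = 1.0
        for name, state in zip(names, combo):
            p *= subsystems[name][state]
        if all(state != 2 for state in combo):   # no subsystem failed
            p_available += p
        if all(state == 0 for state in combo):   # everything fully OK
            p_full += p
    return p_available, p_full

p_available, p_full = enumerate_availability(SUBSYSTEMS)
```

With independent subsystems the enumeration reduces to products of marginals; the value of the Bayesian-network formulation in the paper is that it also accommodates dependencies and degraded-performance states that simple series-system formulas cannot.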
A Predictive Approach to Network Reverse-Engineering
NASA Astrophysics Data System (ADS)
Wiggins, Chris
2005-03-01
A central challenge of systems biology is the ``reverse engineering" of transcriptional networks: inferring which genes exert regulatory control over which other genes. Attempting such inference at the genomic scale has only recently become feasible, via data-intensive biological innovations such as DNA microarrays (``DNA chips") and the sequencing of whole genomes. In this talk we present a predictive approach to network reverse-engineering, in which we integrate DNA chip data and sequence data to build a model of the transcriptional network of the yeast S. cerevisiae capable of predicting the response of genes in unseen experiments. The technique can also be used to extract ``motifs,'' sequence elements which act as binding sites for regulatory proteins. We validate by a number of approaches and present a comparison of theoretical prediction vs. experimental data, along with biological interpretations of the resulting model. En route, we will illustrate some basic notions in statistical learning theory (fitting vs. over-fitting; cross-validation; assessing statistical significance), highlighting ways in which physicists can make a unique contribution in data-driven approaches to reverse engineering.
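The fitting-versus-over-fitting point can be made concrete with a held-out comparison on synthetic data. The degree-15 model, noise level, and split below are arbitrary illustrative choices, not the talk's actual validation procedure:

```python
import numpy as np

# Held-out validation on synthetic data: a flexible model drives its
# in-sample error far below its out-of-sample error (over-fitting).

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=x.size)   # truly linear + noise
train, test = np.arange(0, 40, 2), np.arange(1, 40, 2)  # interleaved split

def mse(degree, idx_fit, idx_eval):
    """Fit a polynomial on one index set, score it on another."""
    coeffs = np.polyfit(x[idx_fit], y[idx_fit], degree)
    resid = y[idx_eval] - np.polyval(coeffs, x[idx_eval])
    return float(np.mean(resid ** 2))

train_err = mse(15, train, train)   # in-sample error of a flexible model
test_err = mse(15, train, test)     # held-out error of the same model
linear_err = mse(1, train, test)    # held-out error of the true model class
```

The flexible fit chases noise, so its training error understates its predictive error; cross-validation generalizes this single split by averaging over many of them.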
Enhanced optical alignment of a digital micro mirror device through Bayesian adaptive exploration
NASA Astrophysics Data System (ADS)
Wynne, Kevin B.; Knuth, Kevin H.; Petruccelli, Jonathan
2017-12-01
As the use of Digital Micro Mirror Devices (DMDs) becomes more prevalent in optics research, the ability to precisely locate the Fourier "footprint" of an image beam at the Fourier plane becomes a pressing need. In this approach, Bayesian adaptive exploration techniques were employed to characterize the size and position of the beam on a DMD located at the Fourier plane. It couples a Bayesian inference engine with an inquiry engine to implement the search. The inquiry engine explores the DMD by engaging mirrors and recording light intensity values based on the maximization of the expected information gain. Using the data collected from this exploration, the Bayesian inference engine updates the posterior probability describing the beam's characteristics. The process is iterated until the beam is located to within the desired precision. This methodology not only locates the center and radius of the beam with remarkable precision but accomplishes the task in far less time than a brute force search. The employed approach has applications to system alignment for both Fourier processing and coded aperture design.
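A one-dimensional toy version conveys the inquiry loop: maintain a posterior over the beam centre, and probe the mirror position whose measurement maximizes the expected information gain. The binary detector model, known radius, grid prior, and all numbers below are simplifications of the 2-D DMD setting described above, not the authors' implementation:

```python
import numpy as np

# 1-D toy of Bayesian adaptive exploration: locate a beam centre with
# binary probes chosen by expected information gain. Assumes a known
# beam radius and a small detector error rate (both invented).

GRID = np.linspace(0.0, 1.0, 201)   # candidate beam-centre positions
RADIUS, EPS = 0.08, 0.02            # beam radius; detector error rate

def likelihood(m, lit):
    """P(observation | centre) for a probe at mirror position m."""
    inside = np.abs(GRID - m) < RADIUS
    p_lit = np.where(inside, 1.0 - EPS, EPS)
    return p_lit if lit else 1.0 - p_lit

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_information_gain(m, prior):
    """Expected entropy reduction from probing position m."""
    p_lit = np.sum(prior * likelihood(m, True))   # predictive P(lit)
    gain = entropy(prior)
    for lit, p_obs in ((True, p_lit), (False, 1.0 - p_lit)):
        post = prior * likelihood(m, lit)
        post /= post.sum()
        gain -= p_obs * entropy(post)
    return gain

true_centre, rng = 0.63, np.random.default_rng(2)
posterior = np.full(GRID.size, 1.0 / GRID.size)   # flat prior
for _ in range(25):                               # inquiry loop
    m = GRID[np.argmax([expected_information_gain(g, posterior)
                        for g in GRID])]          # most informative probe
    lit = (abs(m - true_centre) < RADIUS) ^ (rng.random() < EPS)
    posterior *= likelihood(m, lit)               # Bayesian update
    posterior /= posterior.sum()
estimate = GRID[np.argmax(posterior)]
```

Early probes roughly bisect the search space; once a "lit" region is found, the maximum-gain probes cluster near the beam edges, which is why this strategy beats a brute-force raster scan.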
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. NASA Administrator Michael Griffin (left) tours Orbiter Processing Facility bay 1 where Space Shuttle Atlantis is currently being processed for the second Return to Flight mission, STS-121. He is accompanied by NASA ground systems engineer Doug Moore. This is Griffin's first official visit to Kennedy Space Center. Griffin is the 11th administrator of NASA, a role he assumed on April 14, 2005. Griffin was nominated to the position in March while serving as the Space Department head at Johns Hopkins University's Applied Physics Laboratory in Baltimore. A registered professional engineer in Maryland and California, Griffin served as chief engineer at NASA earlier in his career. He holds numerous scientific and technical degrees including a Ph.D. in Aerospace Engineering from the University of Maryland.
ENGINEERING TEST REACTOR, TRA642. CONTEXTUAL VIEW ORIENTATING ETR TO MTR. ...
ENGINEERING TEST REACTOR, TRA-642. CONTEXTUAL VIEW ORIENTATING ETR TO MTR. CAMERA IS ON ROOF OF MTR BUILDING AND FACES DUE SOUTH. MTR SERVICE BUILDING, TRA-635, IN LOWER RIGHT CORNER. STEEL FRAMES SHOW BUILDINGS TO BE ATTACHED TO ETR BUILDING. HIGH-BAY SECTION IN CENTER IS REACTOR BUILDING. TWO-STORY CONTROL ROOM AND OFFICE BUILDING, TRA-647, IS BETWEEN IT AND MTR SERVICE BUILDING. STRUCTURE TO THE LEFT (WITH NO FRAMING YET) IS COMPRESSOR BUILDING, TRA-643, AND BEYOND IT WILL BE HEAT EXCHANGER BUILDING, TRA-644, GREAT SOUTHERN BUTTE ON HORIZON. INL NEGATIVE NO. 56-2382. Jack L. Anderson, Photographer, 6/10/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
2001-08-20
STS105-714-028 (20 August 2001) --- Backdropped by Lake Michigan, this distant view shows the recently deployed small science satellite called Simplesat, which is an engineering satellite, designed to evaluate the use of inexpensive commercial hardware for spacecraft. It was spring-ejected from a canister at the rear of the Shuttle's cargo bay.
PBF (PER620) south facade. Camera facing north. Note pedestrian bridge ...
PBF (PER-620) south facade. Camera facing north. Note pedestrian bridge crossing over conduit. Central high bay contains reactor room and canal. Date: March 2004. INEEL negative no. HD-41-2-1 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID
B-29 Superfortress with Ramjet Missile
1948-08-21
The NACA’s Lewis Flight Propulsion Laboratory used a Boeing B‒29 Superfortress as a testbed for ramjet investigations in the late 1940s. Lewis researchers conducted a wide variety of studies on ramjets to determine the basic operational data necessary to design missiles. Extensive wind tunnel and test stand studies were augmented by actual flight tests. Lewis engineers modified this B‒29 so that the ramjet could be stored in the bomb bay. Once the aircraft reached the desired altitude and speed, a mechanical arm suspended the ramjet 52 inches below the bomb bay. The ramjet’s angle-of-attack could be independently adjusted, and a periscope permitted a view of the test article from inside the aircraft. Researchers took measurements in free-stream conditions at speeds up to Mach 0.51 and at altitudes ranging from 5,000 to 30,000 feet. They then shut the ramjet down and retracted it into the aircraft. The researchers first determined that 14,000 feet was the maximum altitude at which the engine could be ignited by spark. They used flares to start the engine at altitudes up to 30,000 feet. They were able to determine maximum combustion efficiencies, response time to changes in fuel flow, and minimum fuel-air ratios. Overall, the ramjet operated well at all speeds and altitudes.
Shuttle payload bay thermal environments: Summary and conclusion report for STS Flights 1-5
NASA Technical Reports Server (NTRS)
Fu, J. H.; Graves, G. R.
1987-01-01
The thermal data for the payload bay of the first five shuttle flights is summarized and the engineering evaluation of that data is presented. After a general discussion on mission profiles and vehicle configurations, the thermal design and flight instrumentation systems of the payload bay are described. The thermal flight data sources and a categorization of the data are then presented. A thermal flight data summarization section provides temperature data for the five phases of a typical mission profile. These are: prelaunch, ascent, on-orbit, entry and postlanding. The thermal flight data characterization section encompasses this flight data for flight to flight variations, payload effects, temperature ranges, and other variations. Discussion of the thermal environment prediction models in use by industry and various NASA Centers, and the results predicted by these models, is followed by an evaluation of the correlation between the actual flight data and the results predicted by the models. Finally, the available thermal data are evaluated from the viewpoint of the user concerned with establishing the thermal environment in the payload bay. The data deficiencies are discussed and recommendations for their elimination are presented.
Millennial-scale sustainability of the Chesapeake Bay Native American oyster fishery
Rick, Torben C.; Reeder-Myers, Leslie A.; Hofman, Courtney A.; Breitburg, Denise; Lockwood, Rowan; Henkes, Gregory; Kellogg, Lisa; Lowery, Darrin; Luckenbach, Mark W.; Mann, Roger; Ogburn, Matthew B.; Southworth, Melissa; Wah, John; Wesson, James; Hines, Anson H.
2016-01-01
Estuaries around the world are in a state of decline following decades or more of overfishing, pollution, and climate change. Oysters (Ostreidae), ecosystem engineers in many estuaries, influence water quality, construct habitat, and provide food for humans and wildlife. In North America’s Chesapeake Bay, once-thriving eastern oyster (Crassostrea virginica) populations have declined dramatically, making their restoration and conservation extremely challenging. Here we present data on oyster size and human harvest from Chesapeake Bay archaeological sites spanning ∼3,500 y of Native American, colonial, and historical occupation. We compare oysters from archaeological sites with Pleistocene oyster reefs that existed before human harvest, modern oyster reefs, and other records of human oyster harvest from around the world. Native American fisheries were focused on nearshore oysters and were likely harvested at a rate that was sustainable over centuries to millennia, despite changing Holocene climatic conditions and sea-level rise. These data document resilience in oyster populations under long-term Native American harvest, sea-level rise, and climate change; provide context for managing modern oyster fisheries in the Chesapeake Bay and elsewhere around the world; and demonstrate an interdisciplinary approach that can be applied broadly to other fisheries. PMID:27217572
A Hybrid Stochastic-Neuro-Fuzzy Model-Based System for In-Flight Gas Turbine Engine Diagnostics
2001-04-05
Margin (ADM) and (ii) Fault Detection Margin (FDM). Key Words: ANFIS (Adaptive Network-based Fuzzy Inference System), Engine Health Monitoring, Gas Path Analysis, and Stochastic Analysis. ... The paper illustrates the application of a hybrid Stochastic-Fuzzy-Inference Model-Based System (StoFIS) to fault diagnostics and prognostics for both ... operational history monitored on-line by the engine health management (EHM) system. To capture the complex functional relationships between different ...
Genie: An Inference Engine with Applications to Vulnerability Analysis.
1986-06-01
Stanford Artificial Intelligence Laboratory, 1976. D. A. Waterman and F. Hayes-Roth, eds., Pattern-Directed Inference Systems. Academic Press, Inc. ... Keywords: Expert Systems, Artificial Intelligence, Vulnerability Analysis, Knowledge ... deduction is used wherever possible in data-driven mode (forward chaining). Production rules ...
Seasonal and interannual patterns in primary production ...
Measurements of primary production and respiration provide fundamental information about the trophic status of aquatic ecosystems, yet such measurements are logistically difficult and expensive to sustain as part of long-term monitoring programs. However, ecosystem metabolism parameters can be inferred from high-frequency water quality data collected by autonomous logging instruments. For this study, we analyzed such time series datasets from three Gulf of Mexico estuaries: Grand Bay, MS; Weeks Bay, AL; and Apalachicola Bay, FL. Data were acquired from NOAA's National Estuarine Research Reserve System-Wide Monitoring Program and used to calculate gross primary production (GPP), ecosystem respiration (ER) and net ecosystem metabolism (NEM) using Odum's open-water method. The three systems present a diversity of estuaries typical of the Gulf of Mexico region, varying by as much as 2 orders of magnitude in key physical characteristics, such as estuarine area, watershed area, freshwater flow, and nutrient loading. In all three systems, GPP and ER displayed strong seasonality, peaking in summer and being lowest during winter. Peak rates of GPP and ER exceeded 200 mmol O2 m-2 d-1 in all three estuaries. To our knowledge, this is the only study examining long-term trends in rates of GPP, ER and NEM in estuaries. Variability in metabolism tended to be small among sites within each estuary. Nitrogen loading was high
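Odum's open-water method can be sketched on one synthetic day of hourly dissolved-oxygen (DO) data. Air-sea gas exchange, which a real analysis must correct for, is omitted here for brevity, and all rates are invented:

```python
import numpy as np

# Sketch of Odum's open-water method on synthetic hourly DO data.
# Units: mmol O2 m-3 h-1. No air-sea exchange correction (a real
# analysis needs one); respiration is assumed constant over 24 h.

hours = np.arange(25)                     # 25 samples -> 24 hourly steps
daylight = (hours >= 6) & (hours < 18)    # 12-hour photoperiod
true_p, true_r = 5.0, 2.0                 # hypothetical production/respiration
rates = np.where(daylight[:-1], true_p, 0.0) - true_r
do = 250.0 + np.concatenate(([0.0], np.cumsum(rates)))  # the "sensor" series

d_do = np.diff(do)                        # observed hourly DO change
night = ~daylight[:-1]
er_hourly = -d_do[night].mean()           # respiration from nighttime decline
er = 24.0 * er_hourly                     # ER, extrapolated to the full day
gpp = (d_do[~night] + er_hourly).sum()    # daytime change + daytime respiration
nem = gpp - er                            # net ecosystem metabolism
```

On this noise-free day the method recovers the generating rates exactly (GPP of 60, ER of 48, NEM of 12 mmol O2 m-3 d-1); with real sensor data, the same arithmetic is applied after gas-exchange and depth corrections.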
[Bayesian statistics in medicine -- part II: main applications and inference].
Montomoli, C; Nichelatti, M
2008-01-01
Bayesian statistics is not only used when one is dealing with 2-way tables, but it can be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing its foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of analysis are compared to those of frequentist (classical) statistical analysis. The Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
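The diagnostic analogy can be made concrete with the classic screening-test calculation; the prevalence, sensitivity, and specificity below are illustrative numbers, not figures from the paper:

```python
# Bayes' theorem for a diagnostic test: the prior (disease prevalence)
# is updated by the test result into a posterior probability of disease.
# All three input probabilities are illustrative.

def posterior_disease(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem."""
    p_pos_given_disease = sensitivity * prevalence
    p_pos_given_healthy = (1.0 - specificity) * (1.0 - prevalence)
    return p_pos_given_disease / (p_pos_given_disease + p_pos_given_healthy)

p = posterior_disease(prevalence=0.01, sensitivity=0.95, specificity=0.95)
```

With a 1% prevalence, even a test with 95% sensitivity and specificity yields a posterior of only about 0.16, which is exactly the kind of prior-driven reasoning a clinician performs informally when interpreting a positive result.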
Ferragina, A.; de los Campos, G.; Vazquez, A. I.; Cecchinato, A.; Bittante, G.
2017-01-01
The aim of this study was to assess the performance of Bayesian models commonly used for genomic selection to predict “difficult-to-predict” dairy traits, such as milk fatty acid (FA) expressed as percentage of total fatty acids, and technological properties, such as fresh cheese yield and protein recovery, using Fourier-transform infrared (FTIR) spectral data. Our main hypothesis was that Bayesian models that can estimate shrinkage and perform variable selection may improve our ability to predict FA traits and technological traits above and beyond what can be achieved using the current calibration models (e.g., partial least squares, PLS). To this end, we assessed a series of Bayesian methods and compared their prediction performance with that of PLS. The comparison between models was done using the same sets of data (i.e., same samples, same variability, same spectral treatment) for each trait. Data consisted of 1,264 individual milk samples collected from Brown Swiss cows for which gas chromatographic FA composition, milk coagulation properties, and cheese-yield traits were available. For each sample, 2 spectra in the infrared region from 5,011 to 925 cm−1 were available and averaged before data analysis. Three Bayesian models: Bayesian ridge regression (Bayes RR), Bayes A, and Bayes B, and 2 reference models: PLS and modified PLS (MPLS) procedures, were used to calibrate equations for each of the traits. The Bayesian models used were implemented in the R package BGLR (http://cran.r-project.org/web/packages/BGLR/index.html), whereas the PLS and MPLS were those implemented in the WinISI II software (Infrasoft International LLC, State College, PA). Prediction accuracy was estimated for each trait and model using 25 replicates of a training-testing validation procedure. Compared with PLS, which is currently the most widely used calibration method, MPLS and the 3 Bayesian methods showed significantly greater prediction accuracy. 
Accuracy increased in moving from calibration to external validation methods, and in moving from PLS and MPLS to Bayesian methods, particularly Bayes A and Bayes B. The maximum R2 value of validation was obtained with Bayes B and Bayes A. For the FA, C10:0 (% of each FA on total FA basis) had the highest R2 (0.75, achieved with Bayes A and Bayes B), and among the technological traits, fresh cheese yield had an R2 of 0.82 (achieved with Bayes B). These 2 methods have proven to be useful instruments in shrinking and selecting very informative wavelengths and inferring the structure and functions of the analyzed traits. We conclude that Bayesian models are powerful tools for deriving calibration equations, and, importantly, these equations can be easily developed using existing open-source software. As part of our study, we provide scripts based on the open source R software BGLR, which can be used to train customized prediction equations for other traits or populations. PMID:26387015
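The study fit its models with the R package BGLR; as a language-agnostic illustration of the simplest of them, Bayes RR, the conjugate ridge posterior mean can be written directly. The data below are synthetic, with far fewer predictors than an FTIR spectrum, and fixed variance components rather than BGLR's full MCMC:

```python
import numpy as np

# Bayesian ridge sketch: with prior w ~ N(0, tau2*I) and noise variance
# sigma2, the posterior mean is (X'X + (sigma2/tau2)*I)^-1 X'y.
# This is an illustration of the Bayes RR idea, not the BGLR sampler.

def bayes_ridge_mean(X, y, sigma2=1.0, tau2=10.0):
    lam = sigma2 / tau2                   # shrinkage strength
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(3)
n, p = 200, 10                            # "wavelengths" as predictors
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:3] = [1.5, -2.0, 0.5]             # only a few informative bands
y = X @ w_true + 0.1 * rng.normal(size=n)
w_hat = bayes_ridge_mean(X, y, sigma2=0.01, tau2=1.0)
```

Bayes RR shrinks all coefficients equally; the Bayes A and Bayes B priors that performed best in the study additionally allow predictor-specific shrinkage and, for Bayes B, exact zeros, which is what makes them effective at selecting informative wavelengths.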
NASA Astrophysics Data System (ADS)
Shaw, T.; Clear, J.; Horton, B.; Khan, N.; Nikitina, D.; Enache, M. D.; Potapova, M.; Frizzera, D.; Procopio, N.; Vane, C. H.; Walker, J. S.
2016-12-01
Due to the rapid and pervasive loss of coastal wetland ecosystems and the innumerable services they provide, recent attention has been given to understanding their resilience and response to natural and anthropogenic impacts. Knowledge gaps exist particularly regarding response times of wetland ecosystems to natural factors (storms and sea-level rise) and the appropriate indices or metrics of ecosystem health to be incorporated in management practices to achieve restoration goals. Here we present results from monitoring studies and stratigraphic investigations from marshes across the New Jersey, USA shoreline from Delaware Bay to Raritan Bay (˜210 km of coastline that vary in degree of urbanization and anthropogenic disturbances) that address these limitations. In Delaware Bay, we identify a series of abrupt contacts (mud-peat couplets) from a sequence spanning the past two thousand years that we infer result from erosive storm events. By dating the base of these contacts and the return to high salt marsh peat, we are able to estimate the recovery time of marshes under varying rates of sea-level rise. In marshes from Great Sound to Raritan Bay, we use microfossils (e.g., foraminifera, diatoms) as indices of ecosystem health. We monitor the response of microfossils to natural (e.g., changes in salinity or inundation frequency from sea-level rise) and anthropogenic (e.g., nutrient loading) influences and apply quantitative paleoenvironmental reconstruction techniques to sediment archives to understand the relative influence of these factors on New Jersey wetlands over the past two thousand years. These results can be used to inform future coastal wetland restoration targets and as a model to develop site-specific goals in other regions.
NASA Astrophysics Data System (ADS)
Meldgaard, Asger; Nielsen, Lars; Iaffaldano, Giampiero
2017-04-01
Relative sea level data, primarily obtained through isolation basin analysis in western Greenland and on Disko Island, indicates asynchronous rates of uplift during the Early Holocene with larger rates of uplift in southern Disko Bay compared to the northern part of the bay. Similar short-wavelength variations can be inferred from the Holocene marine limit as observations on the north and south side of Disko Island differ by as much as 60 m. While global isostatic adjustment models are needed to account for far field contributions to the relative sea level and for the calculation of accurate ocean functions, they are generally not suited for a detailed analysis of the short-wavelength uplift patterns observed close to present ice margins. This is in part due to the excessive computational cost required for sufficient resolution, and because these models generally ignore regional lateral heterogeneities in mantle and lithosphere rheology. To mitigate this problem, we perform sensitivity tests to investigate the effects of near field loading on a regional plane-Earth finite element model of the lithosphere and mantle of the Disko Bay area, where the global isostatic uplift chronology is well documented. By loading the model area through detailed regional ocean function and ice models, and by including a high resolution topography model of the area, we seek to assess the isostatic rebound generated by surface processes with wavelengths similar to those of the observed rebound signal. We also investigate possible effects of varying lithosphere and mantle rheology, which may play an important role in explaining the rebound signal. We use the abundance of relative sea level curves obtained in the region primarily through isolation basin analysis on Disko Island to constrain the parameters of the Earth model.
2012-01-01
Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440
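The core empirical Bayes move described above, selecting the weight given to pathway information by maximizing the marginal likelihood, can be sketched directly. The following Python example is an illustrative toy (the variable counts, pathway indicator, and inclusion-probability parameterization are all assumptions, not the paper's model): it enumerates all subsets of a small variable set exactly and picks the pathway-prior weight with the highest evidence.

```python
import numpy as np
from itertools import product

# Toy empirical-Bayes weighting of a pathway-based prior over variable subsets.
rng = np.random.default_rng(1)
n, p = 40, 6
X = rng.normal(size=(n, p))
y = X[:, 0] * 2.0 + X[:, 1] * 1.5 + rng.normal(scale=1.0, size=n)
in_pathway = np.array([1, 1, 0, 0, 0, 0])      # ancillary knowledge flags vars 0, 1

def log_evidence(subset, sigma2=1.0, tau2=4.0):
    """log p(y | subset) with the regression coefficients marginalized out."""
    Xs = X[:, list(subset)]
    C = sigma2 * np.eye(n) + tau2 * Xs @ Xs.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y) + n * np.log(2 * np.pi))

subsets = [tuple(j for j in range(p) if bits[j]) for bits in product([0, 1], repeat=p)]
evidences = [log_evidence(s) for s in subsets]

def log_marginal(w):
    """Evidence of y when pathway membership raises the prior inclusion prob by w."""
    pi = 0.2 + 0.6 * w * in_pathway
    logpri = [sum(np.log(pi[j]) if j in s else np.log(1 - pi[j]) for j in range(p))
              for s in subsets]
    vals = np.array(logpri) + np.array(evidences)
    m = vals.max()
    return m + np.log(np.exp(vals - m).sum())

weights = np.linspace(0.0, 1.0, 11)
w_hat = weights[np.argmax([log_marginal(w) for w in weights])]
```

Because the simulated signal really does sit on the pathway variables, the evidence favors giving the pathway prior substantial weight; with uninformative ancillary data the same procedure would drive the weight toward zero, which is the guard against prior mis-specification mentioned in the conclusions.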
NASA Astrophysics Data System (ADS)
García, Marga; Dowdeswell, J. A.; Noormets, R.; Hogan, K. A.; Evans, J.; Ó Cofaigh, C.; Larter, R. D.
2016-12-01
Detailed bathymetric and sub-bottom acoustic observations in Bourgeois Fjord (Marguerite Bay, Antarctic Peninsula) provide evidence on sedimentary processes and glacier dynamics during the last glacial cycle. Submarine landforms observed in the 50 km-long fjord, from the margins of modern tidewater glaciers to the now ice-distal Marguerite Bay, are described and interpreted. The landforms are grouped into four morpho-sedimentary systems: (i) glacial advance and full-glacial; (ii) subglacial and ice-marginal meltwater; (iii) glacial retreat and neoglaciation; and (iv) Holocene mass-wasting. These morpho-sedimentary systems have been integrated with morphological studies of the Marguerite Bay continental shelf and analysed in terms of the specific sedimentary processes and/or stages of the glacial cycle. They demonstrate the action of an ice-sheet outlet glacier that produced drumlins and crag-and-tail features in the main and outer fjord. Meltwater processes eroded bedrock channels and ponds infilled by fine-grained sediments. Following the last deglaciation of the fjord at about 9000 yr BP, subsequent Holocene neoglacial activity involved minor readvances of a tidewater glacier terminus in Blind Bay. Recent stillstands and/or minor readvances are inferred from the presence of a major transverse moraine that indicates grounded ice stabilization, probably during the Little Ice Age, and a series of smaller landforms that reveal intermittent minor readvances. Mass-wasting processes also affected the walls of the fjord and produced scars and fan-shaped deposits during the Holocene. Glacier-terminus changes during the last six decades, derived from satellite images and aerial photographs, reveal variable behaviour of adjacent tidewater glaciers. The smaller glaciers show the most marked recent retreat, influenced by regional physiography and catchment-area size.
Multiproxy evidence of Holocene climate variability from estuarine sediments, eastern North America
Cronin, T. M.; Thunell, R.; Dwyer, G.S.; Saenger, C.; Mann, M.E.; Vann, C.; Seal, R.R.
2005-01-01
We reconstructed paleoclimate patterns from oxygen and carbon isotope records from the fossil estuarine benthic foraminifera Elphidium and Mg/Ca ratios from the ostracode Loxoconcha from sediment cores from Chesapeake Bay to examine the Holocene evolution of North Atlantic Oscillation (NAO)-type climate variability. Precipitation-driven river discharge and regional temperature variability are the primary influences on Chesapeake Bay salinity and water temperature, respectively. We first calibrated modern δ18Owater to salinity and applied this relationship to calculate trends in paleosalinity from the δ18Oforam, correcting for changes in water temperature estimated from ostracode Mg/Ca ratios. The results indicate a much drier early Holocene in which mean paleosalinity was ~28 ppt in the northern bay, falling ~25% to ~20 ppt during the late Holocene. Early Holocene Mg/Ca-derived temperatures varied in a relatively narrow range of 13° to 16°C with a mean temperature of 14.2°C and excursions above 16°C; the late Holocene was on average cooler (mean temperature of 12.8°C). In addition to the large contrast between early and late Holocene regional climate conditions, multidecadal (20-40 years) salinity and temperature variability is an inherent part of the region's climate during both the early and late Holocene, including the Medieval Warm Period and Little Ice Age. These patterns are similar to those observed during the twentieth century caused by NAO-related processes. Comparison of the midlatitude Chesapeake Bay salinity record with tropical climate records of Intertropical Convergence Zone fluctuations inferred from the Cariaco Basin titanium record suggests an anticorrelation between precipitation in the two regions at both millennial and centennial timescales. Copyright 2005 by the American Geophysical Union.
Richmond, Jonathan Q.; Wood, Dustin A.; Swaim, Karen; Fisher, Robert N.; Vandergast, Amy
2016-01-01
We used microsatellites and mtDNA sequences to examine the mixed effects of geophysical, habitat, and contemporary urban barriers on the genetics of threatened Alameda Striped Racers (Coluber lateralis euryxanthus), a species with close ties to declining coastal scrub and chaparral habitat in the eastern San Francisco Bay area of California. We used cluster assignments to characterize population genetic structuring with respect to land management units and approximate Bayesian analysis to rank the ability of five alternative evolutionary hypotheses to explain the inferred structure. Then, we estimated rates of contemporary and historical migration among the major clusters and measured the fit of different historical migration models to better understand the formation of the current population structure. Our results reveal a ring-like pattern of historical connectivity around the Tri-Valley area of the East Bay (i.e., San Ramon, Amador, and Livermore valleys), with clusters largely corresponding to different management units. We found no evidence of continuous gene flow throughout the ring, however, and that the main gap in continuity is centered across the Livermore Valley. Historical migration models support higher rates of gene flow away from the terminal ends of the ring on the north and south sides of the Valley, compared with rates into those areas from western sites that border the interior San Francisco Bay. We attribute the break in ring-like connectivity to the presence of unsuitable habitat within the Livermore Valley that has been reinforced by 20th century urbanization, and the asymmetry in gene flow rates to spatial constraints on movement and east–west environmental gradients influenced by the proximity of the San Francisco Bay.
NASA Astrophysics Data System (ADS)
Chan, J. H.; Richardson, I. S.; Strayer, L. M.; Catchings, R.; McEvilly, A.; Goldman, M.; Criley, C.; Sickler, R. R.
2017-12-01
The Hayward Fault Zone (HFZ) includes the Hayward fault (HF), as well as several named and unnamed subparallel, subsidiary faults to the east, among them the Quaternary-active Chabot Fault (CF), the Miller Creek Fault (MCF), and a heretofore unnamed fault, the Redwood Thrust Fault (RTF). With an ≥M6.0 recurrence interval of 130 y for the HF and the last major earthquake in 1868, the HFZ is a major seismic hazard in the San Francisco Bay Area, exacerbated by the many unknown and potentially active secondary faults of the HFZ. In 2016, researchers from California State University, East Bay, working in concert with the United States Geological Survey, conducted the East Bay Seismic Investigation (EBSI). We deployed 296 RefTek RT125 (Texan) seismographs along a 15-km-long linear seismic profile across the HF, extending from the bay in San Leandro to the hills in Castro Valley. Two-channel seismographs were deployed at 100 m intervals to record P- and S-waves, and additional single-channel seismographs were deployed at 20 m intervals where the seismic line crossed mapped faults. The active-source survey consisted of 16 buried explosive shots located at approximately 1-km intervals along the seismic line. We used the Multichannel Analysis of Surface Waves (MASW) method to develop 2-D shear-wave velocity models across the CF, MCF, and RTF. Preliminary MASW analysis shows areas of anomalously low S-wave velocities, indicating zones of reduced shear modulus, coincident with these three mapped faults; additional velocity anomalies coincide with unmapped faults within the HFZ. Such compliant zones likely correspond to heavily fractured rock surrounding the faults, where the shear modulus is expected to be low compared with the undeformed host rock.
Learning topic models by belief propagation.
Zeng, Jia; Cheung, William K; Liu, Jiming
2013-05-01
Latent Dirichlet allocation (LDA) is an important hierarchical Bayesian model for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision and computational biology. This paper represents the collapsed LDA as a factor graph, which enables the classic loopy belief propagation (BP) algorithm to be used for approximate inference and parameter estimation. Although the two commonly used approximate inference methods, variational Bayes (VB) and collapsed Gibbs sampling (GS), have gained great success in learning LDA, the proposed BP is competitive in both speed and accuracy, as validated by encouraging experimental results on four large-scale document datasets. Furthermore, the BP algorithm has the potential to become a generic scheme for learning variants of LDA-based topic models in the collapsed space. To this end, we show how to learn two typical variants of LDA-based topic models, the author-topic model (ATM) and the relational topic model (RTM), using BP based on their factor graph representations.
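The collapsed message update at the heart of BP for LDA combines document-topic and word-topic counts, with the token's own current message subtracted out. A minimal Python sketch of one such update follows; the counts, hyperparameters, and vocabulary size are toy values, not from the paper.

```python
import numpy as np

# One collapsed belief-propagation message update for a single LDA token:
# the token's message over K topics is proportional to
# (doc-topic count + alpha) * (word-topic count + beta) / (topic total + V*beta),
# with the token's own current message excluded from every count (the "-i" terms).
def bp_message(ndk, nwk, nk, mu_old, alpha=0.1, beta=0.01, V=1000):
    """ndk: doc-topic counts; nwk: word-topic counts; nk: topic totals;
    mu_old: the token's current (soft) topic assignment, subtracted out."""
    m = (ndk - mu_old + alpha) * (nwk - mu_old + beta) / (nk - mu_old + V * beta)
    return m / m.sum()                        # normalize to a distribution

ndk = np.array([3.0, 1.0, 0.5])               # toy counts, K = 3 topics
nwk = np.array([10.0, 2.0, 1.0])
nk = np.array([500.0, 400.0, 300.0])
mu = np.full(3, 1.0 / 3.0)                    # start from a uniform message
mu_new = bp_message(ndk, nwk, nk, mu)
```

Unlike collapsed Gibbs sampling, which draws a hard topic assignment from this same conditional, BP keeps and iterates the full soft message, which is one source of its speed/accuracy trade-off.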
Learning what to expect (in visual perception)
Seriès, Peggy; Seitz, Aaron R.
2013-01-01
Expectations are known to greatly affect our experience of the world. A growing theory in computational neuroscience is that perception can be successfully described using Bayesian inference models and that the brain is “Bayes-optimal” under some constraints. In this context, expectations are particularly interesting, because they can be viewed as prior beliefs in the statistical inference process. A number of questions remain unresolved, however, for example: How fast do priors change over time? Are there limits to the complexity of the priors that can be learned? How do an individual’s priors compare to the true scene statistics? Can we unlearn priors that are thought to correspond to natural scene statistics? Where and what are the neural substrates of priors? Focusing on the perception of visual motion, we here review recent studies from our laboratories and others addressing these issues. We discuss how these data on motion perception fit within the broader literature on perceptual Bayesian priors, perceptual expectations, and statistical and perceptual learning, and review the possible neural basis of priors. PMID:24187536
NASA Astrophysics Data System (ADS)
Caticha, Ariel
2007-11-01
What is information? Is it physical? We argue that in a Bayesian theory the notion of information must be defined in terms of its effects on the beliefs of rational agents. Information is whatever constrains rational beliefs and therefore it is the force that induces us to change our minds. This problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME), which is designed for updating from arbitrary priors given information in the form of arbitrary constraints, includes as special cases both MaxEnt (which allows arbitrary constraints) and Bayes' rule (which allows arbitrary priors). Thus, ME unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme that allows us to handle problems that lie beyond the reach of either of the two methods separately. I conclude with a couple of simple illustrative examples.
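The ME update has a simple closed form in the discrete case: minimizing relative entropy to the prior subject to a moment constraint yields an exponential tilt of the prior. The classic Jaynes die example, sketched below in Python (it is an illustration in the spirit of the abstract, not taken from it), makes this concrete: a uniform prior is updated to satisfy a mean constraint by solving for the Lagrange multiplier.

```python
import numpy as np

# ME update of a uniform prior over die faces 1..6 under the constraint
# E[x] = 4.5. The minimizer of relative entropy is q ∝ p * exp(lam * x),
# with lam chosen so the constraint holds; with a uniform prior this
# reduces to the MaxEnt solution, as the abstract notes.
x = np.arange(1, 7).astype(float)
p = np.full(6, 1.0 / 6.0)
F = 4.5

def tilted(lam):
    q = p * np.exp(lam * x)
    return q / q.sum()

def mean_gap(lam):
    return tilted(lam) @ x - F                # monotone increasing in lam

lo, hi = -10.0, 10.0                           # bisection for the multiplier
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_gap(mid) < 0:
        lo = mid
    else:
        hi = mid
q = tilted(0.5 * (lo + hi))
```

A non-uniform prior p would be handled by exactly the same code, which is the sense in which ME contains both MaxEnt (arbitrary constraints) and prior information in one scheme.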
Generalized species sampling priors with latent Beta reinforcements
Airoldi, Edoardo M.; Costa, Thiago; Bassetti, Federico; Leisen, Fabrizio; Guindani, Michele
2014-01-01
Many popular Bayesian nonparametric priors can be characterized in terms of exchangeable species sampling sequences. However, in some applications, exchangeability may not be appropriate. We introduce a novel and probabilistically coherent family of non-exchangeable species sampling sequences characterized by a tractable predictive probability function with weights driven by a sequence of independent Beta random variables. We compare their theoretical clustering properties with those of the Dirichlet process and the two-parameter Poisson-Dirichlet process. Unlike existing work, the proposed construction provides a complete characterization of the joint process. We then propose the use of such a process as the prior distribution in a hierarchical Bayes modeling framework, and we describe a Markov chain Monte Carlo sampler for posterior inference. We evaluate the performance of the prior and the robustness of the resulting inference in a simulation study, providing a comparison with popular Dirichlet process mixtures and hidden Markov models. Finally, we develop an application to the detection of chromosomal aberrations in breast cancer by leveraging array CGH data. PMID:25870462
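A generic species sampling sequence with Beta-driven weights can be simulated in a few lines. The Python sketch below is only schematic (the paper's predictive probability function and reinforcement scheme differ in detail): at each step an independent Beta draw sets the probability of starting a new species, and otherwise an existing species is chosen proportional to its count.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_sequence(n, a=1.0, b=2.0):
    """Generate n labels from a schematic species sampling sequence:
    a Beta(a, b) draw gives the new-species probability at each step,
    else an existing species is picked size-biased by its count."""
    labels = [0]
    counts = [1]
    for _ in range(1, n):
        w = rng.beta(a, b)
        if rng.random() < w:                   # open a new species
            counts.append(1)
            labels.append(len(counts) - 1)
        else:                                  # reinforce an existing one
            k = rng.choice(len(counts), p=np.array(counts) / sum(counts))
            counts[k] += 1
            labels.append(k)
    return labels, counts

labels, counts = sample_sequence(200)
```

Because the Beta draws are indexed by step rather than shared, the sequence is not exchangeable: reordering the observations changes its distribution, which is exactly the flexibility the abstract is after.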
Data Analysis with Graphical Models: Software Tools
NASA Technical Reports Server (NTRS)
Buntine, Wray L.
1994-01-01
Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
Cortical Coupling Reflects Bayesian Belief Updating in the Deployment of Spatial Attention.
Vossel, Simone; Mathys, Christoph; Stephan, Klaas E; Friston, Karl J
2015-08-19
The deployment of visuospatial attention and the programming of saccades are governed by the inferred likelihood of events. In the present study, we combined computational modeling of psychophysical data with fMRI to characterize the computational and neural mechanisms underlying this flexible attentional control. Sixteen healthy human subjects performed a modified version of Posner's location-cueing paradigm in which the percentage of cue validity varied in time and the targets required saccadic responses. Trialwise estimates of the certainty (precision) of the prediction that the target would appear at the cued location were derived from a hierarchical Bayesian model fitted to individual trialwise saccadic response speeds. Trial-specific model parameters then entered analyses of fMRI data as parametric regressors. Moreover, dynamic causal modeling (DCM) was performed to identify the most likely functional architecture of the attentional reorienting network and its modulation by (Bayes-optimal) precision-dependent attention. While the frontal eye fields (FEFs), intraparietal sulcus, and temporoparietal junction (TPJ) of both hemispheres showed higher activity on invalid relative to valid trials, reorienting responses in right FEF, TPJ, and the putamen were significantly modulated by precision-dependent attention. Our DCM results suggested that the precision of predictability underlies the attentional modulation of the coupling of TPJ with FEF and the putamen. Our results shed new light on the computational architecture and neuronal network dynamics underlying the context-sensitive deployment of visuospatial attention. Spatial attention and its neural correlates in the human brain have been studied extensively with the help of fMRI and cueing paradigms in which the location of targets is pre-cued on a trial-by-trial basis. 
One aspect that has so far been neglected concerns the question of how the brain forms attentional expectancies when no a priori probability information is available and it must instead be inferred from observations. This study elucidates the computational and neural mechanisms by which probabilistic inference governs attentional deployment. Our results show that Bayesian belief updating explains changes in cortical connectivity, in that directional influences from the temporoparietal junction on the frontal eye fields and the putamen were modulated by (Bayes-optimal) updates. Copyright © 2015 Vossel et al.
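The basic computation, trialwise updating of the estimated cue validity and its precision, can be illustrated with a much simpler stationary model than the hierarchical one used in the study. The Python sketch below uses a Beta-Bernoulli update; the study's model additionally tracks volatility (allowing the validity to drift over time), which this sketch omits, and the trial outcomes are invented.

```python
import numpy as np

def update_validity(outcomes, a0=1.0, b0=1.0):
    """Sequential Beta-Bernoulli update of cue validity. Returns the trialwise
    posterior mean and precision (inverse posterior variance) of the estimate,
    the quantity used as a parametric regressor in the fMRI analysis."""
    a, b = a0, b0
    means, precisions = [], []
    for valid in outcomes:
        a += valid
        b += 1 - valid
        var = a * b / ((a + b) ** 2 * (a + b + 1))   # Beta posterior variance
        means.append(a / (a + b))
        precisions.append(1.0 / var)
    return np.array(means), np.array(precisions)

outcomes = [1, 1, 0, 1, 1, 1, 0, 1]                  # 1 = validly cued trial
means, precisions = update_validity(outcomes)
```

In this stationary model precision only grows with data; the hierarchical model's volatility term is what lets precision fall again when the cue validity changes, driving the reorienting effects described above.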
Mobile Context Provider for Social Networking
NASA Astrophysics Data System (ADS)
Santos, André C.; Cardoso, João M. P.; Ferreira, Diogo R.; Diniz, Pedro C.
The ability to infer user context based on a mobile device together with a set of external sensors opens the way to new context-aware services and applications. In this paper, we describe a mobile context provider that makes use of sensors available in a smartphone as well as sensors externally connected via Bluetooth. We describe the system architecture from sensor data acquisition to feature extraction, context inference and the publication of context information to well-known social networking services such as Twitter and Hi5. In the current prototype, context inference is based on decision trees, but the middleware allows the integration of other inference engines. Experimental results suggest that the proposed solution is a promising approach to providing user context to both local and network-level services.
Raffaello Multi-Purpose Logistics Module (MPLM) in Discovery Cargo Bay
NASA Technical Reports Server (NTRS)
2005-01-01
Launched on July 26, 2005 from the Kennedy Space Center in Florida, STS-114 was classified as Logistics Flight 1. Among the Station-related activities of the mission were the delivery of new supplies and the replacement of one of the orbital outpost's Control Moment Gyroscopes (CMGs). STS-114 also carried the Raffaello Multi-Purpose Logistics Module (MPLM) and the External Stowage Platform-2. Backdropped by popcorn-like clouds, the MPLM can be seen in the cargo bay as Discovery undergoes rendezvous and docking operations. Cosmonaut Sergei K. Krikalev, Expedition 11 Commander, and John L. Phillips, NASA Space Station science officer and flight engineer, photographed the spacecraft from the International Space Station (ISS).
Effective Online Bayesian Phylogenetics via Sequential Monte Carlo with Guided Proposals
Fourment, Mathieu; Claywell, Brian C; Dinh, Vu; McCoy, Connor; Matsen IV, Frederick A; Darling, Aaron E
2018-01-01
Abstract Modern infectious disease outbreak surveillance produces continuous streams of sequence data which require phylogenetic analysis as data arrives. Current software packages for Bayesian phylogenetic inference are unable to quickly incorporate new sequences as they become available, making them less useful for dynamically unfolding evolutionary stories. This limitation can be addressed by applying a class of Bayesian statistical inference algorithms called sequential Monte Carlo (SMC) to conduct online inference, wherein new data can be continuously incorporated to update the estimate of the posterior probability distribution. In this article, we describe and evaluate several different online phylogenetic sequential Monte Carlo (OPSMC) algorithms. We show that proposing new phylogenies with a density similar to the Bayesian prior suffers from poor performance, and we develop “guided” proposals that better match the proposal density to the posterior. Furthermore, we show that the simplest guided proposals can exhibit pathological behavior in some situations, leading to poor results, and that the situation can be resolved by heating the proposal density. The results demonstrate that relative to the widely used MCMC-based algorithm implemented in MrBayes, the total time required to compute a series of phylogenetic posteriors as sequences arrive can be significantly reduced by the use of OPSMC, without incurring a significant loss in accuracy. PMID:29186587
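The generic SMC machinery underlying such online inference, reweighting particles as data arrive and resampling when the effective sample size collapses, can be sketched on a toy problem. The Python example below estimates a Gaussian mean from a data stream; it is not the OPSMC algorithm itself (phylogenies require tree-valued particles and the guided proposals discussed above), just the common skeleton, with all numbers invented.

```python
import numpy as np

rng = np.random.default_rng(7)

def smc_update(particles, weights, y, sigma=1.0):
    """Incorporate one new observation y: reweight by its likelihood,
    then resample (with a small jitter move) if the effective sample
    size falls below half the particle count."""
    logw = np.log(weights) - 0.5 * ((y - particles) / sigma) ** 2
    logw -= logw.max()
    w = np.exp(logw)
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)                 # effective sample size
    if ess < 0.5 * len(w):                     # resample-move step
        idx = rng.choice(len(w), size=len(w), p=w)
        particles = particles[idx] + rng.normal(scale=0.1, size=len(w))
        w = np.full(len(w), 1.0 / len(w))
    return particles, w

particles = rng.normal(scale=5.0, size=1000)   # draws from a broad prior
weights = np.full(1000, 1.0 / 1000)
for y in [2.1, 1.9, 2.0, 2.2, 1.8]:            # observations arriving online
    particles, weights = smc_update(particles, weights, y)
estimate = particles @ weights
```

The appeal for online phylogenetics is visible even in this toy: each new observation costs one reweighting pass rather than a full MCMC rerun from scratch.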
NASA Astrophysics Data System (ADS)
Hincks, Ian; Granade, Christopher; Cory, David G.
2018-01-01
The analysis of photon count data from the standard nitrogen vacancy (NV) measurement process is treated as a statistical inference problem. This has applications toward gaining better and more rigorous error bars for tasks such as parameter estimation (e.g. magnetometry), tomography, and randomized benchmarking. We start by providing a summary of the standard phenomenological model of the NV optical process in terms of Lindblad jump operators. This model is used to derive random variables describing emitted photons during measurement, to which finite visibility, dark counts, and imperfect state preparation are added. NV spin-state measurement is then stated as an abstract statistical inference problem consisting of an underlying biased coin obstructed by three Poisson rates. Relevant frequentist and Bayesian estimators are provided, discussed, and quantitatively compared. We show numerically that the risk of the maximum likelihood estimator is well approximated by the Cramér-Rao bound, for which we provide a simple formula. Of the estimators, we in particular promote the Bayes estimator, owing to its slightly better risk performance, and straightforward error propagation into more complex experiments. This is illustrated on experimental data, where quantum Hamiltonian learning is performed and cross-validated in a fully Bayesian setting, and compared to a more traditional weighted least squares fit.
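The "biased coin obstructed by Poisson rates" inference problem can be illustrated with a grid-based Bayes estimator. In the Python sketch below all rates and the true bias are invented numbers; each shot's likelihood is a two-component Poisson mixture, and the posterior mean of the bias is the Bayes estimator under squared-error loss.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(3)

# Toy version of NV readout: the spin state is Bernoulli(p); each shot's
# photon count is Poisson with a state-dependent rate plus a background
# (dark-count) rate. All rate values here are assumptions for illustration.
lam_bright, lam_dim, lam_bg = 8.0, 3.0, 0.5
p_true = 0.7
n_shots = 2000
states = rng.random(n_shots) < p_true
counts = rng.poisson(np.where(states, lam_bright, lam_dim) + lam_bg)

def pois_pmf(n, lam):
    return np.exp(n * np.log(lam) - lam - lgamma(n + 1))

# Grid posterior over p with a uniform prior; each shot contributes a
# two-component Poisson mixture likelihood.
p_grid = np.linspace(0.001, 0.999, 999)
loglik = np.zeros_like(p_grid)
for n in counts:
    mix = p_grid * pois_pmf(n, lam_bright + lam_bg) \
        + (1 - p_grid) * pois_pmf(n, lam_dim + lam_bg)
    loglik += np.log(mix)
post = np.exp(loglik - loglik.max())
post /= post.sum()
p_bayes = p_grid @ post                        # posterior-mean (Bayes) estimator
```

The same grid posterior also gives credible intervals directly, which is the "straightforward error propagation" advantage of the Bayes estimator noted above.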
Bayesian inference for OPC modeling
NASA Astrophysics Data System (ADS)
Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.
2016-03-01
The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades, making better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modeling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) and reveals champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not, and we outline continued experiments to vet the method.
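The AIES "stretch move" is compact enough to sketch directly. The Python example below implements the Goodman-Weare update for a toy standard-normal target; the lithographic model and its Student's t likelihood are replaced by this stand-in, and the walker counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)

def log_prob(theta):
    return -0.5 * np.sum(theta ** 2)           # toy target: standard normal

def stretch_step(walkers, a=2.0):
    """One sweep of the affine-invariant 'stretch move': each walker proposes
    along the line through a randomly chosen partner, scaled by z ~ g(z) on
    [1/a, a], and accepts with probability min(1, z^(d-1) * p(new)/p(old))."""
    n, d = walkers.shape
    for k in range(n):
        j = rng.integers(n - 1)
        j = j if j < k else j + 1              # any partner except walker k
        z = (1 + (a - 1) * rng.random()) ** 2 / a
        prop = walkers[j] + z * (walkers[k] - walkers[j])
        log_acc = (d - 1) * np.log(z) + log_prob(prop) - log_prob(walkers[k])
        if np.log(rng.random()) < log_acc:
            walkers[k] = prop
    return walkers

walkers = rng.normal(scale=2.0, size=(50, 2))  # over-dispersed initialization
for _ in range(500):
    walkers = stretch_step(walkers)
```

The affine invariance is what makes the sampler attractive for correlated lithographic parameters: the move behaves identically under any linear reparameterization of the space, so no proposal covariance needs tuning.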
Receptive Field Inference with Localized Priors
Park, Mijung; Pillow, Jonathan W.
2011-01-01
The linear receptive field describes a mapping from sensory stimuli to a one-dimensional variable governing a neuron's spike response. However, traditional receptive field estimators such as the spike-triggered average converge slowly and often require large amounts of data. Bayesian methods seek to overcome this problem by biasing estimates towards solutions that are more likely a priori, typically those with small, smooth, or sparse coefficients. Here we introduce a novel Bayesian receptive field estimator designed to incorporate locality, a powerful form of prior information about receptive field structure. The key to our approach is a hierarchical receptive field model that flexibly adapts to localized structure in both spacetime and spatiotemporal frequency, using an inference method known as empirical Bayes. We refer to our method as automatic locality determination (ALD), and show that it can accurately recover various types of smooth, sparse, and localized receptive fields. We apply ALD to neural data from retinal ganglion cells and V1 simple cells, and find it achieves error rates several times lower than standard estimators. Thus, estimates of comparable accuracy can be achieved with substantially less data. Finally, we introduce a computationally efficient Markov Chain Monte Carlo (MCMC) algorithm for fully Bayesian inference under the ALD prior, yielding accurate Bayesian confidence intervals for small or noisy datasets. PMID:22046110
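The empirical Bayes step used by ALD, choosing prior hyperparameters by maximizing the marginal likelihood (evidence), can be illustrated with a plain ridge prior in place of the locality prior. In the Python sketch below the "receptive field", noise level, and hyperparameter grid are invented for illustration; ALD would optimize locality parameters here instead of a single prior variance.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic localized "receptive field" and Gaussian-noise responses.
n, d = 150, 20
X = rng.normal(size=(n, d))
w_true = np.exp(-0.5 * ((np.arange(d) - 10) / 2.0) ** 2)  # bump centered at 10
y = X @ w_true + rng.normal(scale=0.5, size=n)

def log_evidence(log_tau2, sigma2=0.25):
    """Marginal likelihood of y with w ~ N(0, tau2 I) integrated out."""
    tau2 = np.exp(log_tau2)
    C = sigma2 * np.eye(n) + tau2 * X @ X.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

grid = np.linspace(-6, 2, 41)                  # grid search over log prior variance
tau2_hat = np.exp(grid[np.argmax([log_evidence(g) for g in grid])])

# MAP receptive field under the evidence-optimized prior.
A = X.T @ X / 0.25 + np.eye(d) / tau2_hat
w_map = np.linalg.solve(A, X.T @ y / 0.25)
```

Replacing the isotropic prior covariance with one parameterized by a center and width, and optimizing those by evidence, is the essence of the locality determination described above.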
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other implementing a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Statistical Inference of a RANS closure for a Jet-in-Crossflow simulation
NASA Astrophysics Data System (ADS)
Heyse, Jan; Edeling, Wouter; Iaccarino, Gianluca
2016-11-01
The jet-in-crossflow is found in several engineering applications, such as discrete film cooling for turbine blades, where a coolant injected through holes in the blade's surface protects the component from the hot gases leaving the combustion chamber. Experimental measurements using MRI techniques have been completed for a single-hole injection into a turbulent crossflow, providing a full 3D averaged velocity field. For such flows of engineering interest, Reynolds-Averaged Navier-Stokes (RANS) turbulence closure models are often the only viable computational option. However, RANS models are known to provide poor predictions in the region close to the injection point. Since these models are calibrated on simple canonical flow problems, the obtained closure coefficient estimates are unlikely to extrapolate well to more complex flows. We will therefore calibrate the parameters of a RANS model using statistical inference techniques informed by the experimental jet-in-crossflow data. The obtained probabilistic parameter estimates can in turn be used to compute flow fields with quantified uncertainty. This work was supported by a Stanford Graduate Fellowship in Science and Engineering.
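The calibration step described above can be sketched with a one-parameter toy. In the Python example below, a simple decay law stands in for the RANS prediction and a grid posterior over its coefficient is computed against synthetic "measurements"; all numbers are invented, and a real calibration would target actual closure coefficients against the MRI velocity data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy Bayesian calibration of a single model coefficient against data.
x = np.linspace(0, 1, 30)                      # measurement locations

def model(c):
    return np.exp(-c * x)                      # stand-in for a RANS prediction

c_true = 1.8
data = model(c_true) + rng.normal(scale=0.02, size=x.size)  # synthetic measurements

# Grid posterior with a flat prior and Gaussian measurement noise.
c_grid = np.linspace(0.5, 3.5, 301)
loglik = np.array([-0.5 * np.sum((data - model(c)) ** 2) / 0.02 ** 2
                   for c in c_grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()
c_mean = c_grid @ post                         # calibrated coefficient
c_sd = np.sqrt(c_grid ** 2 @ post - c_mean ** 2)   # its quantified uncertainty
```

Propagating draws of the coefficient from this posterior through the forward model is what yields the "flow fields with quantified uncertainty" mentioned above.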
Methodologies for Combined Loads Tests Using a Multi-Actuator Test Machine
NASA Technical Reports Server (NTRS)
Rouse, Marshall
2013-01-01
The NASA Langley COmbined Loads Test System (COLTS) facility was designed to accommodate a range of fuselage structures and wing sections and subject them to both quasi-static and cyclic loading conditions. Structural tests have been conducted in COLTS that address structural integrity issues of metallic and fiber-reinforced composite aerospace structures in support of NASA programs (i.e., the Aircraft Structural Integrity Program (ASIP), the High-Speed Research Program, the Supersonic Project, the NASA Engineering and Safety Center (NESC) Composite Crew Module Project, and the Environmentally Responsible Aviation Program). This paper presents experimental results for curved panels subjected to mechanical and internal pressure loads using a D-box test fixture. Also, results are presented that describe the use of a checkout beam for development of testing procedures for a combined mechanical and pressure loading test of a multi-bay box. The multi-bay box test will be used to experimentally verify the structural performance of the multi-bay box in support of the Environmentally Responsible Aviation Project at NASA Langley.
2007-08-03
KENNEDY SPACE CENTER, FLA. - In Discovery's payload bay in Orbiter Processing Facility bay 3, STS-120 crew members are getting hands-on experience with a winch that is used to manually close the payload bay doors in the event that becomes necessary. At right is Expedition 16 Flight Engineer Daniel M. Tani. The STS-120 crew is at Kennedy for a crew equipment interface test, or CEIT, which includes harness training, inspection of the thermal protection system and camera operation for planned extravehicular activities, or EVAs. The STS-120 mission will deliver the Harmony module, christened after a school contest, which will provide attachment points for European and Japanese laboratory modules on the International Space Station. Known in technical circles as Node 2, it is similar to the six-sided Unity module that links the U.S. and Russian sections of the station. Built in Italy for the United States, Harmony will be the first new U.S. pressurized component to be added. The STS-120 mission is targeted to launch on Oct. 20. Photo credit: NASA/George Shelton
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
Reverse engineering biological networks :applications in immune responses to bio-toxins.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martino, Anthony A.; Sinclair, Michael B.; Davidson, George S.
Our aim is to determine the network of events, or the regulatory network, that defines an immune response to a bio-toxin. As a model system, we are studying the T cell regulatory network triggered through tyrosine kinase receptor activation, using a combination of pathway stimulation and time-series microarray experiments. Our approach is composed of five steps: (1) microarray experiments and data error analysis, (2) data clustering, (3) data smoothing and discretization, (4) network reverse engineering, and (5) network dynamics analysis and fingerprint identification. The technological outcome of this study is a suite of experimental protocols and computational tools that reverse engineer regulatory networks from gene expression data. The practical biological outcome of this work is an immune response fingerprint in terms of gene expression levels. Inferring regulatory networks from microarray data is a new field of investigation that is no more than five years old. To the best of our knowledge, this work is the first attempt that integrates experiments, error analyses, data clustering, inference, and network analysis to solve a practical problem. Our systematic approach of counting, enumerating, and sampling networks matching experimental data is new to the field of network reverse engineering. The resulting mathematical analyses and computational tools lead to new results on their own and should be useful to others who analyze and infer networks.
Hinds Community College MSEIP program
2005-06-24
Student Assistant Antoinette Davis (left) of Utica; Carmella Forsythe, 13, of Clinton; Terri Henderson, 14, of Clinton; Tyra Greer, 12, of Port Gibson; and Kala Battle, 14, of Edwards, answer curriculum questions about NASA's Return to Flight mission exhibit at StenniSphere, the visitor center at NASA's Stennis Space Center (SSC) near Bay St. Louis, Miss. The girls were on a field trip to StenniSphere with fellow participants in Hinds Community College's MSEIP (Minority Science Engineering Improvement Program) summer program. MSEIP encourages students to pursue and prepare for careers in science, technology, engineering and math.
Engineer pedals STS-37 CETA electrical cart along track in JSC MAIL Bldg 9A
NASA Technical Reports Server (NTRS)
1990-01-01
McDonnell Douglas engineer Gary Peters operates crew and equipment translation aid (CETA) electrical hand pedal cart in JSC's Mockup and Integration Laboratory (MAIL) Bldg 9A. Peters, wearing extravehicular mobility unit (EMU) boots and positioned in portable foot restraint (PFR), is suspended above CETA cart and track via harness to simulate weightlessness. The electrical cart is moved by electricity generated from turning hand pedals. CETA will be tested in orbit in the payload bay of Atlantis, Orbiter Vehicle (OV) 104, during STS-37.
MATERIALS TESTING REACTOR (MTR) BUILDING, TRA603. CONTEXTUAL VIEW OF MTR ...
MATERIALS TESTING REACTOR (MTR) BUILDING, TRA-603. CONTEXTUAL VIEW OF MTR BUILDING SHOWING NORTH SIDES OF THE HIGH-BAY REACTOR BUILDING, ITS SECOND/THIRD FLOOR BALCONY LEVEL, AND THE ATTACHED ONE-STORY OFFICE/LABORATORY BUILDING, TRA-604. CAMERA FACING SOUTHEAST. VERTICAL CONCRETE-SHROUDED BEAMS SUPPORT PRECAST CONCRETE PANELS. CONCRETE PROJECTION FORMED AS A BUNKER AT LEFT OF VIEW IS TRA-657, PLUG STORAGE BUILDING. INL NEGATIVE NO. HD46-42-1. Mike Crane, Photographer, 4/2005 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID
Hinds Community College MSEIP program
NASA Technical Reports Server (NTRS)
2005-01-01
Student Assistant Antoinette Davis (left) of Utica; Carmella Forsythe, 13, of Clinton; Terri Henderson, 14, of Clinton; Tyra Greer, 12, of Port Gibson; and Kala Battle, 14, of Edwards, answer curriculum questions about NASA's Return to Flight mission exhibit at StenniSphere, the visitor center at NASA's Stennis Space Center (SSC) near Bay St. Louis, Miss. The girls were on a field trip to StenniSphere with fellow participants in Hinds Community College's MSEIP (Minority Science Engineering Improvement Program) summer program. MSEIP encourages students to pursue and prepare for careers in science, technology, engineering and math.
114. Photocopy of original construction drawing, 14 August 1935. (Original ...
114. Photocopy of original construction drawing, 14 August 1935. (Original print in the possession of U.S. Army Corps of Engineers, Portland District, Portland, OR.) (M-5-8, Sheet No. 14) SPILLWAY DAM FISHWAY ENTRANCE BAY DIFFUSION CHAMBER BEAN DETAILS. - Bonneville Project, Bonneville Dam, Columbia River, Bonneville, Multnomah County, OR
PBF (PER620) interior. Detail view of door in north wall ...
PBF (PER-620) interior. Detail view of door in north wall of reactor bay. Camera facing north. Note tonnage weighting of hatch covers in floor. Date: May 2004. INEEL negative no. HD-41-8-2 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID
PBF Reactor Building (PER620). Camera faces north into highbay/reactor pit ...
PBF Reactor Building (PER-620). Camera faces north into high-bay/reactor pit area. Inside form for reactor enclosure is in place. Photographer: John Capek. Date: March 15, 1967. INEEL negative no. 67-1769 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID
PBF Reactor Building (PER620). Camera facing south end of high ...
PBF Reactor Building (PER-620). Camera facing south end of high bay. Vertical-lift door is being installed. Later, pneumatic seals will be installed around door. Photographer: Kirsh. Date: September 31, 1968. INEEL negative no. 68-3176 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID
6. INTERIOR VIEW TO THE EAST OF SOUTHEAST CORNER OF ...
6. INTERIOR VIEW TO THE EAST OF SOUTHEAST CORNER OF THE HOT BAY. A LARGE MANIPULATOR ARM AND HORIZONTAL TRACKING SYSTEM IS SHOWN ABOVE SMALLER MANIPULATOR ARM WORK STATIONS. ASSOCIATED WITH THE WORK STATIONS ARE OBSERVATION WINDOWS. - Nevada Test Site, Engine Maintenance Assembly & Disassembly Facility, Area 25, Jackass Flats, Mercury, Nye County, NV
Interpretation of a SAR (Synthetic Aperture Radar) Image of the Bay of Biscay.
1983-09-01
Investigation of Experimental Lightweight Firewall Materials for A/C Engine Bay Applications.
1985-04-01
...humidity conditions. The metal-clad Johns-Manville samples were basically equal. They do not provide a vapor barrier in the configurations tested, but can be made to by the use of coatings or backings. The flexibility of the 3M and Johns-Manville samples makes them excellent choices for subsystem fire
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayhew, H.L.; Karle, L.M.; Gruendell, B.D.
The US Army Corps of Engineers was authorized to dredge Richmond Harbor to accommodate large, deep-draft vessels. An ecological evaluation of the Harbor sediments was performed, describing the physical characteristics, toxic substances, effects on aquatic organisms, and potential for bioaccumulation of chemical contaminants. The objective of this report is to compare the sediment chemistry, acute toxicity, and bioaccumulation results of the Richmond Harbor sediments to each of the reference areas, i.e., the Deep Off-Shelf Reference Area, the Bay Farm Borrow Area, and the Alcatraz Environs Reference Area. This report will enable the US Army Corps of Engineers to determine whether disposal at a reference area is appropriate for all or part of the dredged material from Richmond Harbor. Chemical analyses were performed on 30 sediment samples; 28 of those samples were then combined to form 7 composites. The seven composites plus sediment from two additional stations received both chemical and biological evaluations.
Statistical detection of EEG synchrony using empirical bayesian inference.
Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven
2015-01-01
There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries.
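The local FDR idea in this abstract reduces to Efron's two-groups model: the posterior null probability at an observed statistic z is fdr(z) = pi0 * f0(z) / f(z), with f0 the null density and f the empirical marginal. The following sketch assumes z-scores, a theoretical N(0,1) null, a kernel-density estimate of the marginal, and an assumed null proportion pi0; it is a minimal illustration, not the authors' pipeline.

```python
import numpy as np
from scipy import stats

def local_fdr(z, pi0=0.9):
    """Two-groups local FDR: fdr(z) = pi0 * f0(z) / f(z)."""
    f0 = stats.norm.pdf(z)            # theoretical null density N(0, 1)
    f = stats.gaussian_kde(z)(z)      # empirical marginal density f(z)
    return np.clip(pi0 * f0 / f, 0.0, 1.0)

rng = np.random.default_rng(0)
null = rng.normal(0.0, 1.0, 900)      # true null statistics
alt = rng.normal(4.0, 1.0, 100)       # true signals (e.g., synchronous pairs)
z = np.concatenate([null, alt])
fdr = local_fdr(z)
```

Signals far in the tail receive local FDR near zero, while statistics near the bulk receive values near one, which is what lets locFDR flag discoveries without a hard correction for the number of tests.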
NASA Astrophysics Data System (ADS)
Ponton, C.; Giosan, L.; Eglinton, T. I.; Scientific Team Of Indian National Gas Hydrate Program Expedition 01
2010-12-01
The Asian monsoon, composed of the East Asian and Indian systems affects the most densely populated region of the planet. The Indian monsoon is one of the most energetic and dynamic climate processes that occur today on Earth, but we still do not have a detailed understanding of large-scale hydrological variability over the Indian peninsula during the Holocene. Previous studies of the salinity variations in the Bay of Bengal indicate that during the last glacial maximum the Indian monsoon system was weaker and precipitation over the area was lower than today. Here we provide the first high resolution Holocene climate record for central India measured on a sediment core recovered offshore the mouth of the Godavari River, on the eastern Indian shelf. The δ13C composition of leaf waxes preserved in the core shows a large range of variation suggesting a major change in the relative proportions of C3 and C4 plant-derived wax inputs during the Holocene. Using reported values for modern plants, we estimate that C3 plants suffered a reduction in the Godavari basin from ~45% to ~15% over the Holocene. Negative excursions of δ13C leaf wax suggest that short-lived events of C3 plant resurgence (and inferred higher precipitation) punctuated the process of aridification of peninsular India. The vegetation structure and inferred aridity in central India is consistent with reconstructions of Indian monsoon precipitation and wind intensity in the Arabian Sea, salinity in the Bay of Bengal, and precipitation proxy records for the East Asian monsoon, suggesting a coherent behavior of the Asian monsoon system over the Holocene.
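The C3/C4 estimate quoted above comes from two end-member isotope mixing on the leaf-wax δ13C signal. A minimal sketch, with assumed (illustrative) end-member values rather than the study's reported ones:

```python
def c3_fraction(d13c_wax, c3_end=-34.0, c4_end=-21.0):
    """Fraction of C3-derived wax by linear two end-member mixing.

    End-member delta-13C values here are placeholders for illustration;
    the study used reported values for modern plants.
    """
    f = (d13c_wax - c4_end) / (c3_end - c4_end)
    return min(max(f, 0.0), 1.0)   # clamp to the physical range [0, 1]
```

More negative (more C3-like) measured values map to a higher inferred C3 fraction, so a positive δ13C excursion in the core records a drop in C3 vegetation, i.e., aridification.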
NASA Astrophysics Data System (ADS)
Coote, Alisha; Shane, Phil; Stirling, Claudine; Reid, Malcolm
2018-02-01
Late Quaternary, porphyritic basalts erupted in the Kaikohe-Bay of Islands area, New Zealand, provide an opportunity to explore the crystallization and ascent history of small volume magmas in an intra-continental monogenetic volcano field. The plagioclase phenocrysts represent a diverse crystal cargo. Most of the crystals have a rim growth that is compositionally similar to groundmass plagioclase ( An65) and is in equilibrium with the host basalt rock. The rims surround a resorbed core that is either less calcic ( An20-45) or more calcic (> An70), having crystallized in more differentiated or more primitive melts, respectively. The relic cores, particularly those that are less calcic (< An45), have 87Sr/86Sr ratios that are either mantle-like ( 0.7030) or crustal-like ( 0.7040 to 0.7060), indicating some are antecrysts formed in melts fractionated from plutonic basaltic forerunners, while others are true xenocrysts from greywacke basement and/or Miocene arc volcanics. It is envisaged that intrusive basaltic forerunners produced a zone where various degrees of crustal assimilation and fractional crystallization occurred. The erupted basalts represent mafic recharge of this system, as indicated by the final crystal rim growths around the entrained antecrystic and xenocrystic cargo. The recharge also entrained cognate gabbros that occur as inclusions, and produced mingled groundmasses. Multi-stage magmatic ascent and interaction is indicated, and is consistent with the presence of a partial melt body in the lower crust detected by geophysical methods. This crystallization history contrasts with traditional concepts of low-flux basaltic systems where rapid ascent from the mantle is inferred. From a hazards perspective, the magmatic system inferred here increases the likelihood of detecting eruption precursor phenomena such as seismicity, degassing and surface deformation.
Collett, T.S.; Kvenvolden, K.A.; Magoon, L.B.
1990-01-01
In the Kuparuk River Unit 2D-15 well, on the North Slope of Alaska, a 60 m-thick stratigraphic interval that lies within the theoretical pressure-temperature field of gas-hydrate stability is inferred to contain methane hydrates. This inference is based on interpretations from well logs: (1) release of methane during drilling, as indicated by the mud log, (2) an increase in acoustic velocity on the sonic log, and (3) an increase of electrical resistivity on the electric logs. Our objective was to determine the composition and source of the gas within the shallow gas-hydrate-bearing interval based on analyses of cutting gas. Headspace gas from canned drill cuttings collected from within the gas-hydrate-bearing interval of this well has an average methane to ethane plus propane [C1/(C2 + C3)] ratio of about 7000 and an average methane δ13C value of -46‰ (relative to the PDB standard). These compositions are compared with those obtained at one well located to the north of 2D-15 along depositional strike and one down-dip well to the northeast. In the well located on depositional strike (Kuparuk River Unit 3K-9), gas compositions are similar to those found at 2D-15. At the down-dip well (Prudhoe Bay Unit R-1), the C1/(C2 + C3) ratios are lower (700) and the methane δ13C is heavier (-33‰). We conclude that the methane within the stratigraphic interval of gas hydrate stability comes from two sources: in situ microbial gas and migrated thermogenic gas. The thermal component is greatest at Prudhoe Bay. Up-dip to the west, the thermogenic component decreases, and microbial gas assumes more importance. © 1990.
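The molecular ratio and isotope value quoted above are the standard (Bernard-style) discriminants between microbial and thermogenic gas: microbial gas is methane-rich (high C1/(C2+C3)) with isotopically light methane, thermogenic gas is the opposite. A sketch with illustrative, assumed thresholds (not values from this paper):

```python
def bernard_ratio(c1, c2, c3):
    """Molecular ratio C1/(C2+C3) used to characterize natural gas."""
    return c1 / (c2 + c3)

def gas_origin(ratio, d13c_methane):
    """Coarse Bernard-style classification; thresholds are illustrative."""
    if ratio > 1000 and d13c_methane < -60:
        return "microbial"
    if ratio < 100 and d13c_methane > -50:
        return "thermogenic"
    return "mixed"
```

On the numbers in the abstract, the 2D-15 interval (ratio ~7000 but δ13C of -46‰) falls between the pure end members, consistent with the paper's two-source conclusion.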
NASA Astrophysics Data System (ADS)
Aiello, Gemma; Marsella, Ennio; Fiore, Vincenzo Di
2012-06-01
A detailed reconstruction of the stratigraphic and tectonic setting of the Gulf of Pozzuoli (Naples Bay) is provided on the basis of newly acquired single-channel seismic profiles coupled with previously recorded marine magnetic data that constrain the volcanic nature of some seismic units. Inferences for the tectonic and magmatic setting of the Phlegrean Fields volcanic complex, a volcanic district surrounding the western part of the Gulf of Naples, where volcanism has been active since at least 50 ka, are also discussed. The Gulf of Pozzuoli represents the submerged border of the Phlegrean caldera, resulting from the volcano-tectonic collapse induced by the pyroclastic flow deposits of the Campanian Ignimbrite (35 ka). Several morpho-depositional units have been identified, i.e., the inner continental shelf, the central basin, the submerged volcanic banks and the outer continental shelf. The stratigraphic relationships between the Quaternary volcanic units related to the offshore caldera border and the overlying deposits of the Late Quaternary depositional sequence in the Gulf of Pozzuoli have been highlighted. Fourteen main seismic units, both volcanic and sedimentary, tectonically controlled by contemporaneous folding and normal faulting, have been revealed by geological interpretation. Volcanic dykes, characterized by acoustically transparent sub-vertical bodies, locally bounded by normal faults, testify to the magma uprising in correspondence with extensional structures. A large field of tuff cones interlayered with marine deposits off the island of Nisida, on the western rim of the gulf, is related to the emplacement of the Neapolitan Yellow Tuff deposits. A thick volcanic unit, exposed over a large area off the Capo Miseno volcanic edifice, is connected with the Bacoli-Isola Pennata-Capo Miseno yellow tuffs, cropping out in the northern Phlegrean Fields.
Nelson, A.R.; Jennings, A.E.; Kashima, K.
1996-01-01
Much of the uncertainty in determining the number and magnitude of past great earthquakes in the Cascadia subduction zone of western North America stems from difficulties in using estuarine stratigraphy to infer the size and rate of late Holocene relative sea-level changes. A sequence of interbedded peaty and muddy intertidal sediment beneath a small, protected tidal marsh in a narrow inlet of Coos Bay, Oregon, records ten rapid to instantaneous rises in relative sea level. Each rise is marked by a contact that records an upward transition from peaty to muddy sediment. But only two contacts, dating from about 1700 and 2300 yr ago, show the site-wide extent and abrupt changes in lithology and foraminiferal and diatom assemblages that can be used to infer at least half a meter of sudden coseismic subsidence. Although the characteristics of a third, gradual contact do not differ from those of some contacts produced by nonseismic processes, regional correlation with other similar sequences and high-precision 14C dating suggest that the third contact records a great plate-boundary earthquake about 300 yr ago. A fourth contact formed too slowly to have been caused by coseismic subsidence. Because lithologic and microfossil data are not sufficient to distinguish a coseismic from a nonseismic origin for the other six peat-mud contacts, we cannot determine earthquake recurrence intervals at this site. Similar uncertainties in great earthquake recurrence and magnitude prevail at similar sites elsewhere in the Cascadia subduction zone, except those with sequences showing changes in fossils indicative of > 1 m of sudden subsidence, sand sheets deposited by tsunamis, or liquefaction features.
Empirical Bayes conditional independence graphs for regulatory network recovery.
Mahdi, Rami; Madduri, Abishek S; Wang, Guoqing; Strulovici-Barel, Yael; Salit, Jacqueline; Hackett, Neil R; Crystal, Ronald G; Mezey, Jason G
2012-08-01
Computational inference methods that make use of graphical models to extract regulatory networks from gene expression data can have difficulty reconstructing dense regions of a network, a consequence of both computational complexity and unreliable parameter estimation when sample size is small. As a result, identification of hub genes is of special difficulty for these methods. We present a new algorithm, Empirical Light Mutual Min (ELMM), for large network reconstruction that has properties well suited for recovery of graphs with high-degree nodes. ELMM reconstructs the undirected graph of a regulatory network using empirical Bayes conditional independence testing with a heuristic relaxation of independence constraints in dense areas of the graph. This relaxation allows only one gene of a pair with a putative relation to be aware of the network connection, an approach that is aimed at easing multiple testing problems associated with recovering densely connected structures. Using in silico data, we show that ELMM has better performance than commonly used network inference algorithms including GeneNet, ARACNE, FOCI, GENIE3 and GLASSO. We also apply ELMM to reconstruct a network among 5492 genes expressed in human lung airway epithelium of healthy non-smokers, healthy smokers and individuals with chronic obstructive pulmonary disease assayed using microarrays. The analysis identifies dense sub-networks that are consistent with known regulatory relationships in the lung airway and also suggests novel hub regulatory relationships among a number of genes that play roles in oxidative stress and secretion. Software for running ELMM is made available at http://mezeylab.cb.bscb.cornell.edu/Software.aspx. ramimahdi@yahoo.com or jgm45@cornell.edu Supplementary data are available at Bioinformatics online.
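The core operation in graph-based network recovery like the above is conditional-independence testing between gene pairs. The sketch below uses thresholded partial correlations from the inverse covariance matrix as a generic stand-in for that step; it is not ELMM (no empirical Bayes shrinkage, no asymmetric relaxation in dense regions), and the threshold is an assumed illustrative value.

```python
import numpy as np

def partial_corr_graph(X, thresh=0.2):
    """Undirected graph from thresholded partial correlations -- a generic
    stand-in for conditional-independence testing (not the ELMM algorithm)."""
    prec = np.linalg.inv(np.cov(X, rowvar=False))   # precision matrix
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)                  # partial correlations
    adj = np.abs(pcorr) > thresh
    np.fill_diagonal(adj, False)                    # no self-edges
    return adj

# Chain x0 -> x1 -> x2: x0 and x2 are independent given x1,
# so only the edges (0,1) and (1,2) should be recovered.
rng = np.random.default_rng(0)
x0 = rng.normal(size=2000)
x1 = x0 + 0.5 * rng.normal(size=2000)
x2 = x1 + 0.5 * rng.normal(size=2000)
A = partial_corr_graph(np.column_stack([x0, x1, x2]))
```

The reliability problem ELMM targets shows up here when the number of genes approaches the sample size: the covariance matrix becomes ill-conditioned and the naive inverse fails, which is why shrinkage or empirical Bayes estimation is needed at scale.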
The SSMEPF opens with a ribbon-cutting ceremony
NASA Technical Reports Server (NTRS)
1998-01-01
Participants in the ribbon cutting for KSC's new 34,600-square-foot Space Shuttle Main Engine Processing Facility (SSMEPF) gather to talk inside the facility following the ceremony. From left, they are Robert B. Sieck, director of Shuttle Processing; KSC Center Director Roy D. Bridges Jr.; U.S. Congressman Dave Weldon; John Plowden, vice president of Rocketdyne; and Donald R. McMonagle, manager of Launch Integration. A major addition to the existing Orbiter Processing Facility Bay 3, the SSMEPF replaces the Shuttle Main Engine Shop located in the Vehicle Assembly Building (VAB). The decision to move the shop out of the VAB was prompted by safety considerations and recent engine processing improvements. The first three main engines to be processed in the new facility will fly on Shuttle Endeavour's STS-88 mission in December 1998.
Oxygenation variability off Northern Chile during the last two centuries
NASA Astrophysics Data System (ADS)
Díaz-Ochoa, J. A.; Pantoja, S.; de Lange, G. J.; Lange, C. B.; Sánchez, G. E.; Acuña, V. R.; Muñoz, P.; Vargas, G.
2010-07-01
The Peru Chile Current ecosystem is characterized by high biological productivity and important fisheries. Although this system is likely to be severely affected by climate change, its response to current global warming is still uncertain. In this paper we analyze 10- to 166-year-old sediments in two cores collected in Mejillones Bay, an anoxic sedimentary setting favorable for preservation of proxies. Based on a 166-year chronology, we used proxies of bottom-water oxygenation (Mo, V, S, and the (lycopane+n-C35)/n-C31 ratio) and surface-water productivity (biogenic opal, counts of diatom valves, biogenic Ba, organic carbon, and chlorins) to reconstruct environmental variations in Mejillones Bay. We find that at decadal scales, and during the last two centuries, a shift in the coastal marine ecosystem off Northern Chile took place, characterized by intense ENSO-like activity, large fluctuations in biological export productivity and bottom-water oxygenation, and increased eolic activity (inferred from Ti/Al and Zr/Al). On top of this short-term variability, a gradual increase of sulfidic conditions has occurred, intensifying since the early 1960s.
Contrasting patterns in lichen diversity in the continental and maritime Antarctic
NASA Astrophysics Data System (ADS)
Singh, Shiv Mohan; Olech, Maria; Cannone, Nicoletta; Convey, Peter
2015-09-01
Systematic surveys of the lichen floras of Schirmacher Oasis (Queen Maud Land, continental Antarctic), Victoria Land (Ross Sector, continental Antarctic) and Admiralty Bay (South Shetland Islands, maritime Antarctic) were compared to help infer the major factors influencing patterns of diversity and biogeography in the three areas. Biogeographic patterns were determined using a variety of multivariate statistical tools. A total of 54 lichen species were documented from Schirmacher Oasis (SO), 48 from Victoria Land (VL) and 244 from Admiralty Bay (AB). Of these, 21 species were common to all areas. Most lichens from the SO and VL areas were microlichens, the dominant genus being Buellia. In AB, in contrast, many macrolichens were also present and the dominant genus was Caloplaca. In SO and VL large areas lacked any visible lichen cover, even where the ground was snow-free in summer. Small-scale diversity patterns were present in AB, where the number of species and genera was greater close to the coast. Most species recorded were rare in the study areas in which they were present and endemic to Antarctica.
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. - In Orbiter Processing Facility bay 1, NASA Administrator Michael Griffin observes as technicians prepare Space Shuttle Atlantis for the second Return to Flight mission, STS-121. This is Griffin's first official visit to Kennedy Space Center. Griffin is the 11th administrator of NASA, a role he assumed on April 14, 2005. Griffin was nominated to the position in March while serving as the Space Department head at Johns Hopkins University's Applied Physics Laboratory in Baltimore. A registered professional engineer in Maryland and California, Griffin served as chief engineer at NASA earlier in his career. He holds numerous scientific and technical degrees including a Ph.D. in Aerospace Engineering from the University of Maryland.
A fuzzy logic intelligent diagnostic system for spacecraft integrated vehicle health management
NASA Technical Reports Server (NTRS)
Wu, G. Gordon
1995-01-01
Due to the complexity of future space missions and the large amount of data involved, greater autonomy in data processing is demanded for mission operations, training, and vehicle health management. In this paper, we develop a fuzzy logic intelligent diagnostic system to perform data reduction, data analysis, and fault diagnosis for spacecraft vehicle health management applications. The diagnostic system contains a data filter and an inference engine. The data filter is designed to intelligently select only the necessary data for analysis, while the inference engine is designed for failure detection, warning, and decision on corrective actions using fuzzy logic synthesis. Due to its adaptive nature and on-line learning ability, the diagnostic system is capable of dealing with environmental noise, uncertainties, conflict information, and sensor faults.
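The inference-engine idea described above can be illustrated with a minimal fuzzy rule evaluation: graded membership functions replace hard thresholds, so noisy telemetry near a limit produces a graded alarm rather than a brittle on/off decision. The sensor names, membership ranges, and the single OR-rule below are illustrative assumptions, not the paper's design.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def alarm_level(temp_c, vib_g):
    """Fuzzy rule activation: IF temperature is high OR vibration is high
    THEN alarm. OR is taken as max; membership ranges are illustrative."""
    temp_high = tri(temp_c, 60.0, 90.0, 120.0)
    vib_high = tri(vib_g, 2.0, 5.0, 8.0)
    return max(temp_high, vib_high)
```

A downstream decision step can then act on the continuous alarm level (e.g., warn above 0.5, take corrective action above 0.8), which is how graded fuzzy outputs map onto the detection/warning/correction stages the abstract describes.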
Opportunity Rolls Free Again (Left Front Wheel)
NASA Technical Reports Server (NTRS)
2006-01-01
This animated piece illustrates the recent escape of NASA's Mars Exploration Rover Opportunity from dangerous, loose material on the vast plains leading to the rover's next long-term target, 'Victoria Crater.' A series of images of the rover's left front wheel, taken by the front hazard-avoidance camera, makes up this brief movie. It chronicles the challenge Opportunity faced to free itself from a ripple dubbed 'Jammerbugt.' The rover's wheels became partially embedded in the ripple at the end of a drive on Opportunity's 833rd Martian day, or sol (May 28, 2006). The images in this clip were taken on sols 836 through 841 (May 31 through June 5, 2006). Scientists and engineers who had been elated at the meters of progress the rover had been making in earlier drives were happy for even centimeters of advance per sol as they maneuvered their explorer through the slippery material of Jammerbugt. The wheels reached solid footing on a rock outcrop on the final sol of this sequence. The science and engineering teams appropriately chose the ripple's informal name from the name of a bay on the north coast of Denmark. Jammerbugt, or Jammerbugten, loosely translated, means Bay of Lamentation or Bay of Wailing. The shipping route from the North Sea to the Baltic passes Jammerbugt on its way around the northern tip of Jutland. This has always been an important trade route and many ships still pass by the bay. The prevailing wind directions are typically northwest to southwest with the strongest winds and storms tending to blow from the northwest. A northwesterly wind will blow straight into the Jammerbugt, towards shore. Therefore, in the age of sail, many ships sank there during storms. The shore is sandy, but can have strong waves, so running aground was very dangerous even though there are no rocks. Fortunately, Opportunity weathered its 'Jammerbugt' and is again on its way toward Victoria Crater.
Opportunity Rolls Free Again (Right Front Wheel)
NASA Technical Reports Server (NTRS)
2006-01-01
This animated piece illustrates the recent escape of NASA's Mars Exploration Rover Opportunity from dangerous, loose material on the vast plains leading to the rover's next long-term target, 'Victoria Crater.' A series of images of the rover's right front wheel, taken by the front hazard-avoidance camera, makes up this brief movie. It chronicles the challenge Opportunity faced to free itself from a ripple dubbed 'Jammerbugt.' The rover's wheels became partially embedded in the ripple at the end of a drive on Opportunity's 833rd Martian day, or sol (May 28, 2006). The images in this clip were taken on sols 836 through 841 (May 31 through June 5, 2006). Scientists and engineers who had been elated at the meters of progress the rover had been making in earlier drives were happy for even centimeters of advance per sol as they maneuvered their explorer through the slippery material of Jammerbugt. The wheels reached solid footing on a rock outcrop on the final sol of this sequence. The science and engineering teams appropriately chose the ripple's informal name from the name of a bay on the north coast of Denmark. Jammerbugt, or Jammerbugten, loosely translated, means Bay of Lamentation or Bay of Wailing. The shipping route from the North Sea to the Baltic passes Jammerbugt on its way around the northern tip of Jutland. This has always been an important trade route and many ships still pass by the bay. The prevailing wind directions are typically northwest to southwest with the strongest winds and storms tending to blow from the northwest. A northwesterly wind will blow straight into the Jammerbugt, towards shore. Therefore, in the age of sail, many ships sank there during storms. The shore is sandy, but can have strong waves, so running aground was very dangerous even though there are no rocks.
Fortunately, Opportunity weathered its 'Jammerbugt' and is again on its way toward Victoria Crater.
Selected Core Thinking Skills and Cognitive Strategy of an Expert and Novice Engineer
ERIC Educational Resources Information Center
Dixon, Raymond A.
2011-01-01
This exploratory study highlights certain differences in the way an expert and a novice engineer used their analyzing and generating skills while solving a fairly ill-structured design problem. The expert tends to use more inferences and elaboration when solving the design problem, and the novice tends to use analysis that is focused on the…
Participatory Classification in a System for Assessing Multimodal Transportation Patterns
2015-02-17
Culler, Electrical Engineering and Computer Sciences, University of California at Berkeley, Technical Report No. UCB/EECS-2015-8. This section sketches the characteristics of the data that was collected and computes the accuracy of the automated inference algorithm.
Fuzzy/Neural Software Estimates Costs of Rocket-Engine Tests
NASA Technical Reports Server (NTRS)
Douglas, Freddie; Bourgeois, Edit Kaminsky
2005-01-01
The Highly Accurate Cost Estimating Model (HACEM) is a software system for estimating the costs of testing rocket engines and components at Stennis Space Center. HACEM is built on a foundation of adaptive-network-based fuzzy inference systems (ANFIS), a hybrid software concept that combines the adaptive capabilities of neural networks with the ease of development and additional benefits of fuzzy-logic-based systems. In ANFIS, fuzzy inference systems are trained by use of neural networks. HACEM includes selectable subsystems that utilize various numbers and types of inputs, various numbers of fuzzy membership functions, and various input-preprocessing techniques. The inputs to HACEM are parameters of specific tests or series of tests. These parameters include test type (component or engine test), number and duration of tests, and thrust level(s) (in the case of engine tests). The ANFIS in HACEM are trained by use of sets of these parameters, along with costs of past tests. Thereafter, the user feeds HACEM a simple input text file that contains the parameters of a planned test or series of tests, the user selects the desired HACEM subsystem, and the subsystem processes the parameters into an estimate of cost(s).
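As context for the ANFIS foundation described above, the sketch below implements only the forward (inference) pass of a first-order Sugeno fuzzy system, the model class that ANFIS trains. The rules, membership parameters, and input names are invented for illustration and are not HACEM's actual rule base; in ANFIS these parameters would be fitted from past test costs rather than written by hand.

```python
import math

def gauss_mf(x, c, sigma):
    """Gaussian membership function with center c and width sigma."""
    return math.exp(-((x - c) ** 2) / (2.0 * sigma ** 2))

def sugeno_predict(duration, thrust, rules):
    """First-order Sugeno inference: the prediction is the firing-strength-
    weighted average of each rule's linear consequent."""
    num = den = 0.0
    for (c_d, s_d), (c_t, s_t), (p, q, r) in rules:
        w = gauss_mf(duration, c_d, s_d) * gauss_mf(thrust, c_t, s_t)  # firing strength
        num += w * (p * duration + q * thrust + r)                     # rule consequent
        den += w
    return num / den

# Two hypothetical rules mapping test duration (s) and normalized thrust
# level to a cost figure.
rules = [
    ((100.0, 50.0), (0.3, 0.2), (0.5, 10.0, 2.0)),   # short, low-thrust tests
    ((400.0, 100.0), (0.8, 0.2), (0.8, 40.0, 5.0)),  # long, high-thrust tests
]
cost = sugeno_predict(250.0, 0.5, rules)
```

The smooth interpolation between rule consequents is what makes the system differentiable and hence trainable by neural-network-style optimization.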
Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.
2009-01-01
Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed (SPARROW) attributes model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
NASA Technical Reports Server (NTRS)
Vonglahn, U. H.
1978-01-01
Combustion chamber acoustic power levels inferred from internal fluctuating pressure measurements are correlated with operating conditions and chamber geometries over a wide range. The variables include considerations of chamber design (can, annular, and reverse-flow annular) and size, number of fuel nozzles, burner staging and fuel split, airflow and heat release rates, and chamber inlet pressure and temperature levels. The correlated data include those obtained with combustion component development rigs as well as engines.
Reveal, A General Reverse Engineering Algorithm for Inference of Genetic Network Architectures
NASA Technical Reports Server (NTRS)
Liang, Shoudan; Fuhrman, Stefanie; Somogyi, Roland
1998-01-01
Given the imminent gene expression mapping covering whole genomes during development, health and disease, we seek computational methods to maximize functional inference from such large data sets. Is it possible, in principle, to completely infer a complex regulatory network architecture from input/output patterns of its variables? We investigated this possibility using binary models of genetic networks. Trajectories, or state transition tables of Boolean nets, resemble time series of gene expression. By systematically analyzing the mutual information between input states and output states, one is able to infer the sets of input elements controlling each element or gene in the network. This process is unequivocal and exact for complete state transition tables. We implemented this REVerse Engineering ALgorithm (REVEAL) in a C program, and found the problem to be tractable within the conditions tested so far. For n = 50 (elements) and k = 3 (inputs per element), the analysis of incomplete state transition tables (100 state transition pairs out of a possible 10(exp 15)) reliably produced the original rule and wiring sets. While this study is limited to synchronous Boolean networks, the algorithm is generalizable to include multi-state models, essentially allowing direct application to realistic biological data sets. The ability to adequately solve the inverse problem may enable in-depth analysis of complex dynamic systems in biology and other fields.
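The mutual-information criterion at the heart of REVEAL can be sketched in a few lines. This toy version assumes complete synchronous state-transition data and invents a 3-element example network; it is a sketch of the idea, not the authors' C implementation.

```python
import math
from collections import Counter
from itertools import combinations, product

def entropy(column):
    """Shannon entropy (bits) of a sequence of hashable symbols."""
    n = len(column)
    return -sum(c / n * math.log2(c / n) for c in Counter(column).values())

def mutual_info(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def infer_inputs(transitions, target, k_max=3):
    """Smallest input set whose states fully determine `target`'s next state.

    Following REVEAL, a set S of elements controls element i exactly when
    the mutual information M(S; i) equals the output entropy H(i).
    """
    n = len(transitions[0][0])
    outputs = [nxt[target] for _, nxt in transitions]
    h_out = entropy(outputs)
    for k in range(1, k_max + 1):
        for subset in combinations(range(n), k):
            inputs = [tuple(state[j] for j in subset) for state, _ in transitions]
            if abs(mutual_info(inputs, outputs) - h_out) < 1e-9:
                return subset
    return None

# Toy 3-element Boolean net: element 0's next state is XOR of elements 1 and 2,
# element 1 copies element 0, and element 2 copies element 1.
transitions = [(s, (s[1] ^ s[2], s[0], s[1])) for s in product((0, 1), repeat=3)]
wiring = infer_inputs(transitions, target=0)  # recovers (1, 2)
```

Because the full state transition table is supplied, the recovered wiring is exact, matching the "unequivocal and exact" claim above; with incomplete tables the same test becomes a statistical one.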
Petri Nets with Fuzzy Logic (PNFL): Reverse Engineering and Parametrization
Küffner, Robert; Petri, Tobias; Windhager, Lukas; Zimmer, Ralf
2010-01-01
Background The recent DREAM4 blind assessment provided a particularly realistic and challenging setting for network reverse engineering methods. The in silico part of DREAM4 solicited the inference of cycle-rich gene regulatory networks from heterogeneous, noisy expression data including time courses as well as knockout, knockdown and multifactorial perturbations. Methodology and Principal Findings We inferred and parametrized simulation models based on Petri Nets with Fuzzy Logic (PNFL). This completely automated approach correctly reconstructed networks with cycles as well as oscillating network motifs. PNFL was evaluated as the best performer on DREAM4 in silico networks of size 10 with an area under the precision-recall curve (AUPR) of 81%. Besides topology, we inferred a range of additional mechanistic details with good reliability, e.g. distinguishing activation from inhibition as well as dependent from independent regulation. Our models also performed well on new experimental conditions such as double knockout mutations that were not included in the provided datasets. Conclusions The inference of biological networks substantially benefits from methods that are expressive enough to deal with diverse datasets in a unified way. At the same time, overly complex approaches could generate multiple different models that explain the data equally well. PNFL appears to strike the balance between expressive power and complexity. This also applies to the intuitive representation of PNFL models combining a straightforward graphical notation with colloquial fuzzy parameters. PMID:20862218
NASA Astrophysics Data System (ADS)
Seifert, Annedore; Stegmann, Sylvia; Mörz, Tobias; Lange, Matthias; Wever, Thomas; Kopf, Achim
2008-08-01
We present in situ strength and pore-pressure measurements from 57 dynamic cone penetration tests in sediments of Mecklenburg ( n = 51), Eckernförde ( n = 2) and Gelting ( n = 4) bays, western Baltic Sea, characterised by thick mud layers and partially free microbial gas resulting from the degradation of organic material. In Mecklenburg and Eckernförde bays, sediment sampling by nine gravity cores served sedimentological characterisation, analyses of geotechnical properties, and laboratory shear tests. At selected localities, high-resolution echo-sounder profiles were acquired. Our aim was to deploy a dynamic cone penetrometer (CPT) to infer sediment shear strength and cohesion of the sea bottom as a function of fluid saturation. The results show very variable changes in pore pressure and sediment strength during the CPT deployments. The majority of the CPT measurements ( n = 54) show initially negative pore-pressure values during penetration, and a delayed response towards positive pressures thereafter. This so-called type B pore-pressure signal was recorded in all three bays, and is typically found in soft muds with high water contents and undrained shear strengths of 1.6-6.4 kPa. The type B signal is further affected by displacement of sediment and fluid upon penetration of the lance, skin effects during dynamic profiling, enhanced consolidation and strength of individual horizons, the presence of free gas, and a dilatory response of the sediment. In Mecklenburg Bay, the remaining small number of CPT measurements ( n = 3) show a well-defined peak in both pore pressure and cone resistance during penetration, i.e. an initial marked increase which is followed by exponential pore-pressure decay during dissipation. This so-called type A pore-pressure signal is associated with normally consolidated mud, with indurated clay layers showing significantly higher undrained shear strength (up to 19 kPa). 
In Eckernförde and Gelting bays pore-pressure response type B is exclusively found, while in Mecklenburg Bay types A and B were detected. Despite the striking similarities in incremental density increase and shear strength behaviour with depth, gas occurrence and subtle variations in the coarse-grained fraction cause distinct pore-pressure curves. Gaseous muds interbedded with silty and sandy layers are most common in the three bays, and the potential effect of free gas (i.e. undersaturated pore space) on in situ strength has to be explored further.
Stratification of Seismic Anisotropy Beneath Hudson Bay
NASA Astrophysics Data System (ADS)
Darbyshire, F. A.; Eaton, D. W.; Bastow, I. D.
2012-12-01
The Hudson Bay region has a complex tectonic history spanning ~4 Ga of Earth's evolution. During the ~1.8 Ga Trans-Hudson orogeny, the Archean Superior and Western Churchill cratons collided following the subduction of a Pacific-scale ocean. It is thought that a significant amount of juvenile material is preserved in the Trans-Hudson Orogen, in part due to the complex double-indentor geometry of the Superior-Churchill collision. In the region of interest, the orogen lies beneath a large but shallow Paleozoic intra-cratonic basin. Studies of the crust and upper mantle beneath this region have been enabled through the HuBLE (Hudson Bay Lithospheric Experiment) project, through the deployment of broadband seismographs around the Bay and across the islands to the north. A surface-wave tomography study has taken advantage of the data coverage, providing new information on phase velocity heterogeneity and anisotropy for wave periods of 25-200 seconds (equivalent to depths from the lower crust to ~300 km). On a large scale, our results show that the entire region is underlain by a seismically fast lithospheric lid corresponding to the continental keel. The lithospheric thickness ranges from ~180km in the northeast, beneath a zone of Paleozoic rifting, to ~280km beneath central Hudson Bay. Within the lithosphere, seismic velocities vary laterally, including high-velocity material wrapping around the Bay in the uppermost mantle. In the mid-lithosphere, two high-velocity cores are imaged, with a zone of lower velocity between them beneath the Bay. We interpret these high-velocity structures to represent the strongest central cores of the Superior and Churchill cratons, with more-juvenile material preserved between them. The near-vertical geometry of the lower-velocity zone suggests that it is only the effects of terminal collision of the cratonic cores, rather than any preceding subduction, that is preserved today. 
The lowermost lithosphere has a more uniform velocity, and may represent a pervasive zone of metasomatism or underplating. Anisotropy patterns across the region also vary with depth, suggesting ~3 layers of stratification of lithospheric fabric. At the shallowest depths, anisotropic fast directions wrap around the Bay in a similar fashion to the patterns of isotropic wavespeed. The upper lithospheric mantle below is characterized by relatively weak and incoherent anisotropy; however the mid-to-lower lithosphere shows stronger anisotropy, with a pattern of fast directions broadly consistent with the tectonics of the Superior-Churchill collision as inferred from potential-field data. This may suggest some degree of coherency of deformation between the crust, uppermost mantle and lower lithosphere. These models of seismic wavespeed variation beneath the Hudson Bay region reveal the preservation of a major collision zone during the assembly of the Laurentian continental mass, and also suggest that the Archean cratons can be subdivided into different lithospheric domains that reflect their tectonic history but do not necessarily correspond to surface geological boundaries.
Presser, Theresa S.; Luoma, Samuel N.
2006-01-01
Selenium discharges to the San Francisco Bay-Delta Estuary (Bay-Delta) could change significantly if federal and state agencies (1) approve an extension of the San Luis Drain to convey agricultural drainage from the western San Joaquin Valley to the North Bay (Suisun Bay, Carquinez Strait, and San Pablo Bay); (2) allow changes in flow patterns of the lower San Joaquin River and Bay-Delta while using an existing portion of the San Luis Drain to convey agricultural drainage to a tributary of the San Joaquin River; or (3) revise selenium criteria for the protection of aquatic life or issue criteria for the protection of wildlife. Understanding the biotransfer of selenium is essential to evaluating effects of selenium on Bay-Delta ecosystems. Confusion about selenium threats to fish and wildlife stems from (1) monitoring programs that do not address specific protocols necessary for an element that bioaccumulates; and (2) failure to consider the full complexity of the processes that result in selenium toxicity. Past studies show that predators are more at risk from selenium contamination than their prey, making it difficult to use traditional methods to predict risk from environmental concentrations alone. This report presents an approach to conceptualize and model the fate and effects of selenium under various load scenarios from the San Joaquin Valley. For each potential load, progressive forecasts show resulting (1) water-column concentration; (2) speciation; (3) transformation to particulate form; (4) particulate concentration; (5) bioaccumulation by invertebrates; (6) trophic transfer to predators; and (7) effects on those predators. Enough is known to establish a first-order understanding of relevant conditions, biological response, and ecological risks should selenium be discharged directly into the North Bay through a conveyance such as a proposed extension of the San Luis Drain.
The approach presented here, the Bay-Delta selenium model, determines the mass, fate, and effects of selenium released to the Bay-Delta through use of (1) historical land-use, drainage, alluvial-fill, and runoff databases; (2) existing knowledge concerning biogeochemical reactions and physiological parameters of selenium (e.g., speciation, partitioning between dissolved and particulate forms, and bivalve assimilation efficiency); and (3) site-specific data mainly from 1986 to 1996 for clams and bottom-feeding fish and birds. Selenium load scenarios consider effluents from North Bay oil refineries and discharges of agricultural drainage from the San Joaquin Valley to enable calculation of (a) a composite freshwater endmember selenium concentration at the head of the estuary; and (b) a selenium concentration at a selected seawater location (Carquinez Strait) as a foundation for modeling. Analysis of selenium effects also takes into account the mode of conveyance for agricultural drainage (i.e., the San Luis Drain or San Joaquin River); and flows of the Sacramento River and San Joaquin River on a seasonal or monthly basis. Load scenarios for San Joaquin Valley mirror predictions made since 1955 of a worsening salt (and by inference, selenium) build-up exacerbated by an arid climate and massive irrigation. The reservoir of selenium in the San Joaquin Valley is sufficient to provide loading at an annual rate of approximately 42,500 pounds of selenium to a Bay-Delta disposal point for 63 to 304 years at the lower range of projections presented here, even if influx of selenium from the California Coast Ranges could be curtailed. Disposal of wastewaters on an annual basis outside of the San Joaquin Valley may slow the degradation of valley resources, but drainage alone cannot alleviate the salt and selenium build-up in the San Joaquin Valley, at least within a century. Load scenarios also show the different proportions of selenium loading to the Bay-Delta. 
Oil refinery loads from 1986 to 1992 ranged from 8.5 to 20 pounds of selenium per day;
2011-01-01
Background Molecular marker information is a common source to draw inferences about the relationship between genetic and phenotypic variation. Genetic effects are often modelled as additively acting marker allele effects. The true mode of biological action can, of course, be different from this plain assumption. One possibility to better understand the genetic architecture of complex traits is to include intra-locus (dominance) and inter-locus (epistasis) interaction of alleles as well as the additive genetic effects when fitting a model to a trait. Several Bayesian MCMC approaches exist for the genome-wide estimation of genetic effects with high accuracy of genetic value prediction. Including pairwise interaction for thousands of loci would probably go beyond the scope of such a sampling algorithm because then millions of effects are to be estimated simultaneously leading to months of computation time. Alternative solving strategies are required when epistasis is studied. Methods We extended a fast Bayesian method (fBayesB), which was previously proposed for a purely additive model, to include non-additive effects. The fBayesB approach was used to estimate genetic effects on the basis of simulated datasets. Different scenarios were simulated to study the loss of accuracy of prediction, if epistatic effects were not simulated but modelled and vice versa. Results If 23 QTL were simulated to cause additive and dominance effects, both fBayesB and a conventional MCMC sampler BayesB yielded similar results in terms of accuracy of genetic value prediction and bias of variance component estimation based on a model including additive and dominance effects. Applying fBayesB to data with epistasis, accuracy could be improved by 5% when all pairwise interactions were modelled as well. The accuracy decreased more than 20% if genetic variation was spread over 230 QTL. 
In this scenario, accuracy based on modelling only additive and dominance effects was generally superior to that of the complex model including epistatic effects. Conclusions This simulation study showed that the fBayesB approach is convenient for genetic value prediction. Jointly estimating additive and non-additive effects (especially dominance) has reasonable impact on the accuracy of prediction and the proportion of genetic variation assigned to the additive genetic source. PMID:21867519
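A minimal sketch of the additive-plus-dominance marker coding described above, with plain ridge regression standing in for the fBayesB sampler (which is not reimplemented here). Population size, QTL count, and effect values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ind, n_loci = 200, 50

# Genotypes coded as 0/1/2 copies of the reference allele
G = rng.integers(0, 3, size=(n_ind, n_loci))
X_add = G - 1.0                   # additive coding: -1, 0, 1
X_dom = (G == 1).astype(float)    # dominance coding: heterozygote indicator

# Simulated trait: 5 QTL with both additive and dominance effects plus noise
beta_a = np.zeros(n_loci)
beta_a[:5] = 1.0
beta_d = np.zeros(n_loci)
beta_d[:5] = 0.5
y = X_add @ beta_a + X_dom @ beta_d + rng.normal(0.0, 0.5, n_ind)

# Ridge regression on the stacked additive + dominance design, a simple
# shrinkage stand-in for the Bayesian samplers compared in the study
X = np.hstack([X_add, X_dom])
lam = 1.0
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(2 * n_loci), X.T @ y)
accuracy = np.corrcoef(y, X @ beta_hat)[0, 1]
```

Adding pairwise epistasis to this design squares the number of columns, which is the computational blow-up the fast fBayesB approach is meant to address.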
Efficient Reverse-Engineering of a Developmental Gene Regulatory Network
Cicin-Sain, Damjan; Ashyraliyev, Maksat; Jaeger, Johannes
2012-01-01
Understanding the complex regulatory networks underlying development and evolution of multi-cellular organisms is a major problem in biology. Computational models can be used as tools to extract the regulatory structure and dynamics of such networks from gene expression data. This approach is called reverse engineering. It has been successfully applied to many gene networks in various biological systems. However, to reconstitute the structure and non-linear dynamics of a developmental gene network in its spatial context remains a considerable challenge. Here, we address this challenge using a case study: the gap gene network involved in segment determination during early development of Drosophila melanogaster. A major problem for reverse-engineering pattern-forming networks is the significant amount of time and effort required to acquire and quantify spatial gene expression data. We have developed a simplified data processing pipeline that considerably increases the throughput of the method, but results in data of reduced accuracy compared to those previously used for gap gene network inference. We demonstrate that we can infer the correct network structure using our reduced data set, and investigate minimal data requirements for successful reverse engineering. Our results show that timing and position of expression domain boundaries are the crucial features for determining regulatory network structure from data, while it is less important to precisely measure expression levels. Based on this, we define minimal data requirements for gap gene network inference. Our results demonstrate the feasibility of reverse-engineering with much reduced experimental effort. This enables more widespread use of the method in different developmental contexts and organisms. Such systematic application of data-driven models to real-world networks has enormous potential. 
Only the quantitative investigation of a large number of developmental gene regulatory networks will allow us to discover whether there are rules or regularities governing development and evolution of complex multi-cellular organisms. PMID:22807664
Ferragina, A; de los Campos, G; Vazquez, A I; Cecchinato, A; Bittante, G
2015-11-01
The aim of this study was to assess the performance of Bayesian models commonly used for genomic selection to predict "difficult-to-predict" dairy traits, such as milk fatty acid (FA) expressed as percentage of total fatty acids, and technological properties, such as fresh cheese yield and protein recovery, using Fourier-transform infrared (FTIR) spectral data. Our main hypothesis was that Bayesian models that can estimate shrinkage and perform variable selection may improve our ability to predict FA traits and technological traits above and beyond what can be achieved using the current calibration models (e.g., partial least squares, PLS). To this end, we assessed a series of Bayesian methods and compared their prediction performance with that of PLS. The comparison between models was done using the same sets of data (i.e., same samples, same variability, same spectral treatment) for each trait. Data consisted of 1,264 individual milk samples collected from Brown Swiss cows for which gas chromatographic FA composition, milk coagulation properties, and cheese-yield traits were available. For each sample, 2 spectra in the infrared region from 5,011 to 925 cm(-1) were available and averaged before data analysis. Three Bayesian models: Bayesian ridge regression (Bayes RR), Bayes A, and Bayes B, and 2 reference models: PLS and modified PLS (MPLS) procedures, were used to calibrate equations for each of the traits. The Bayesian models used were implemented in the R package BGLR (http://cran.r-project.org/web/packages/BGLR/index.html), whereas the PLS and MPLS were those implemented in the WinISI II software (Infrasoft International LLC, State College, PA). Prediction accuracy was estimated for each trait and model using 25 replicates of a training-testing validation procedure. Compared with PLS, which is currently the most widely used calibration method, MPLS and the 3 Bayesian methods showed significantly greater prediction accuracy. 
Accuracy increased in moving from calibration to external validation methods, and in moving from PLS and MPLS to Bayesian methods, particularly Bayes A and Bayes B. The maximum R(2) value of validation was obtained with Bayes B and Bayes A. For the FA, C10:0 (% of each FA on total FA basis) had the highest R(2) (0.75, achieved with Bayes A and Bayes B), and among the technological traits, fresh cheese yield R(2) of 0.82 (achieved with Bayes B). These 2 methods have proven to be useful instruments in shrinking and selecting very informative wavelengths and inferring the structure and functions of the analyzed traits. We conclude that Bayesian models are powerful tools for deriving calibration equations, and, importantly, these equations can be easily developed using existing open-source software. As part of our study, we provide scripts based on the open source R software BGLR, which can be used to train customized prediction equations for other traits or populations. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
McAdam, A. C.; Franz, H. B.; Archer, P. D., Jr.; Sutter, B.; Eigenbrode, J. L.; Freissinet, C.; Atreya, S. K.; Bish, D. L.; Blake, D. F.; Brunner, A.;
2014-01-01
Sulfate minerals have been directly detected or strongly inferred from several Mars datasets and indicate that aqueous alteration of martian surface materials has occurred. Indications of reduced sulfur phases (e.g., sulfides) from orbital and in situ investigations of martian materials have been fewer in number, but these phases are observed in martian meteorites and are likely present because they are common minor phases in basaltic rocks. Here we discuss potential sources for the S-bearing compounds detected by the Mars Science Laboratory (MSL) Sample Analysis at Mars (SAM) instrument’s evolved gas analysis (EGA) experiments.
An integrated data model to estimate spatiotemporal occupancy, abundance, and colonization dynamics
Williams, Perry J.; Hooten, Mevin B.; Womble, Jamie N.; Esslinger, George G.; Bower, Michael R.; Hefley, Trevor J.
2017-01-01
Ecological invasions and colonizations occur dynamically through space and time. Estimating the distribution and abundance of colonizing species is critical for efficient management or conservation. We describe a statistical framework for simultaneously estimating spatiotemporal occupancy and abundance dynamics of a colonizing species. Our method accounts for several issues that are common when modeling spatiotemporal ecological data including multiple levels of detection probability, multiple data sources, and computational limitations that occur when making fine-scale inference over a large spatiotemporal domain. We apply the model to estimate the colonization dynamics of sea otters (Enhydra lutris) in Glacier Bay, in southeastern Alaska.
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
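As a minimal illustration of the exact Bayes factors mentioned above (not the paper's graphical-model machinery), the conjugate beta-binomial case admits a closed-form marginal likelihood, so the Bayes factor can be computed without any sampling:

```python
from math import comb

def marginal_uniform(k, n):
    """Marginal likelihood of k successes in n trials under a uniform
    Beta(1, 1) prior: C(n,k) * B(k+1, n-k+1) simplifies to 1/(n+1)."""
    return 1.0 / (n + 1)

def bayes_factor(k, n, theta0=0.5):
    """Exact Bayes factor for M1 (uniform prior on the success probability)
    against M0 (point null theta = theta0)."""
    null_likelihood = comb(n, k) * theta0 ** k * (1.0 - theta0) ** (n - k)
    return marginal_uniform(k, n) / null_likelihood

bf = bayes_factor(8, 10)  # 8 successes in 10 trials: modest evidence for M1
```

Problem decomposition in a graphical model generalizes this idea: when priors are conjugate within each factor of the decomposed joint distribution, the overall Bayes factor is a product of closed-form terms like the one above.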
1999-08-01
KENNEDY SPACE CENTER, FLA. -- An orbiter has more than 300 miles of wires such as these shown here in the cable tray inside Columbia's payload bay. During launch of Columbia on mission STS-93, a damaged wire caused a short circuit in two separate main engine controllers. As a result of the findings, Shuttle program managers have decided to conduct inspections of the wiring in Endeavour's payload bay before its next mission, STS-99. The inspection and possible repair work will lead to a delayed launch date no earlier than Oct. 7. The primary payload of the mission is the Shuttle Radar Topography Mission, a specially modified radar system that will gather data for the most accurate and complete topographic map of the Earth's surface that has ever been assembled
Occupational Skin Diseases in the San Francisco Bay Area
Gellin, Gerald A.; Wolf, C. Richard; Milby, Thomas H.
1970-01-01
From answers by one-third of the practicing dermatologists in the San Francisco Bay Area to a questionnaire on occupational skin diseases, contact dermatitis due to irritants and sensitizers was found to rank first. Poison oak, which is the leading reported cause on “Doctor's First Report of Work Injury” received by the California Department of Industrial Relations, was sixth on the list of the survey, trailing solvents, cleansing agents, petroleum products and epoxy resins. A history of atopic dermatitis was often noted in current cases of occupational diseases of the skin. Avoidance of exposure or limiting the contact with pathogenic substances—through engineering changes, observation of working conditions by physicians, education of workers—appeared to be the best preventive measures. PMID:4255687
Integrating Entropy-Based Naïve Bayes and GIS for Spatial Evaluation of Flood Hazard.
Liu, Rui; Chen, Yun; Wu, Jianping; Gao, Lei; Barrett, Damian; Xu, Tingbao; Li, Xiaojuan; Li, Linyi; Huang, Chang; Yu, Jia
2017-04-01
Regional flood risk caused by intensive rainfall under extreme climate conditions has increasingly attracted global attention. Mapping and evaluation of flood hazard are vital parts of flood risk assessment. This study develops an integrated framework for estimating the spatial likelihood of flood hazard by coupling weighted naïve Bayes (WNB), a geographic information system, and remote sensing. The northern part of the Fitzroy River Basin in Queensland, Australia, was selected as the case study site. The environmental indices, including extreme rainfall, evapotranspiration, net-water index, soil water retention, elevation, slope, and drainage proximity and density, were generated from spatial data representing climate, soil, vegetation, hydrology, and topography. These indices were weighted using the statistics-based entropy method. The weighted indices were input into the WNB-based model to delineate a regional flood risk map that indicates the likelihood of flood occurrence. The resultant map was validated against the maximum inundation extent extracted from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. The results, including the mapped distribution of flood hazard, are helpful in guiding flood inundation disaster responses for the region. The novel approach presented consists of weighted grid data, image-based sampling and validation, cell-by-cell probability inference, and spatial mapping. It is superior to an existing spatial naïve Bayes (NB) method for regional flood hazard assessment. It can also be extended to other likelihood-related environmental hazard studies. © 2016 Society for Risk Analysis.
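The two modeling steps above can be sketched in miniature: compute entropy-based weights for the indices, then raise each class-conditional likelihood to its weight in a naïve Bayes posterior. The grid values, class priors, and likelihoods below are toy numbers invented for the sketch, not the study's actual indices.

```python
from math import log

def entropy_weights(samples):
    # Entropy weight method: an index whose values vary more across
    # cells carries more information and receives a larger weight.
    n, m = len(samples), len(samples[0])
    k = 1.0 / log(n)
    raw = []
    for j in range(m):
        col = [row[j] for row in samples]
        total = sum(col)
        p = [v / total for v in col]
        e = -k * sum(pi * log(pi) for pi in p if pi > 0)  # entropy in [0, 1]
        raw.append(1.0 - e)
    s = sum(raw)
    return [w / s for w in raw]

def weighted_nb_posterior(likelihoods, priors, weights):
    # Weighted naïve Bayes: P(c | x) ∝ P(c) * Π_j P(x_j | c) ** w_j,
    # where the exponents are the entropy weights of the indices.
    scores = {}
    for c, prior in priors.items():
        score = prior
        for lj, wj in zip(likelihoods[c], weights):
            score *= lj ** wj
        scores[c] = score
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

# Toy grid: two environmental indices observed at three cells
grid = [[0.2, 0.9], [0.5, 0.8], [0.9, 0.7]]
w = entropy_weights(grid)
post = weighted_nb_posterior(
    {"flood": [0.8, 0.6], "dry": [0.3, 0.5]},  # hypothetical P(x_j | class)
    {"flood": 0.4, "dry": 0.6}, w)
```

A constant index gets entropy 1 and hence weight 0, so it drops out of the posterior entirely; that is the sense in which the entropy weighting is "statistics-based".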
Guarini, Jean-Marc; Cloern, James E.; Edmunds, Jody L.; Gros, Philippe
2002-01-01
In this paper we describe a three-step procedure to infer the spatial heterogeneity in microphytobenthos primary productivity at the scale of tidal estuaries and embayments. The first step involves local measurement of the carbon assimilation rate of benthic microalgae to determine the parameters of the photosynthesis-irradiance (P-E) curves (using non-linear optimization methods). In the next step, a resampling technique is used to rebuild pseudo-sampling distributions of the local productivity estimates; these provide error estimates for determining the significance level of differences between sites. The third step combines the previous results with deterministic models of tidal elevation and solar irradiance to compute mean and variance of the daily areal primary productivity over an entire intertidal mudflat area within each embayment. This scheme was applied to three different intertidal mudflat regions of the San Francisco Bay estuary during autumn 1998. Microphytobenthos productivity exhibits strong (ca. 3-fold) significant differences among the major sub-basins of San Francisco Bay. This spatial heterogeneity is attributed to two main causes: significant differences in the photosynthetic competence (P-E parameters) of the microphytobenthos in the different sub-basins, and spatial differences in the phase shifts between the tidal and solar cycles controlling the exposure of intertidal areas to sunlight. The procedure is general and can be used in other estuaries to assess the magnitude and patterns of spatial variability of microphytobenthos productivity at the ecosystem level.
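The first two steps can be sketched as follows: a P-E curve to fit (here the Jassby-Platt hyperbolic-tangent form, one common choice; the abstract does not say which formulation the authors used) and a bootstrap that rebuilds a pseudo-sampling distribution of a local productivity estimate. The measurements are hypothetical values for illustration only.

```python
import random
from math import tanh

def pe_curve(E, p_max, alpha):
    # Jassby-Platt P-E model: initial slope alpha, saturating at p_max.
    # (An assumed functional form, not necessarily the one fitted in the paper.)
    return p_max * tanh(alpha * E / p_max)

def bootstrap_ci(values, stat, n_boot=2000, level=0.05, seed=42):
    # Step 2: resample measurements with replacement to rebuild a
    # pseudo-sampling distribution of the statistic, then take a
    # percentile confidence interval from it.
    rng = random.Random(seed)
    reps = sorted(stat(rng.choices(values, k=len(values)))
                  for _ in range(n_boot))
    lo = reps[int(level / 2 * n_boot)]
    hi = reps[int((1 - level / 2) * n_boot) - 1]
    return lo, hi

# Hypothetical local productivity estimates at one site
site_a = [1.1, 1.4, 1.2, 1.5, 1.3]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(site_a, mean)
```

Non-overlap of the bootstrap intervals from two sites is then the criterion for calling a between-site difference significant, as the second step describes.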
NASA Astrophysics Data System (ADS)
Mandel, Kaisey; Kirshner, R. P.; Narayan, G.; Wood-Vasey, W. M.; Friedman, A. S.; Hicken, M.
2010-01-01
I have constructed a comprehensive statistical model for Type Ia supernova light curves spanning optical through near infrared data simultaneously. The near infrared light curves are found to be excellent standard candles (σ(M_H) = 0.11 ± 0.03 mag) that are less vulnerable to systematic error from dust extinction, a major confounding factor for cosmological studies. A hierarchical statistical framework coherently incorporates multiple sources of randomness and uncertainty, including photometric error, intrinsic supernova light curve variations and correlations, dust extinction and reddening, peculiar velocity dispersion and distances, for probabilistic inference with Type Ia SN light curves. Inferences are drawn from the full probability density over individual supernovae and the SN Ia and dust populations, conditioned on a dataset of SN Ia light curves and redshifts. To compute probabilistic inferences with hierarchical models, I have developed BayeSN, a Markov chain Monte Carlo algorithm based on Gibbs sampling. This code explores and samples the global probability density of parameters describing individual supernovae and the population. I have applied this hierarchical model to optical and near infrared data of over 100 nearby Type Ia SN from PAIRITEL, the CfA3 sample, and the literature. Using this statistical model, I find that SN with optical and NIR data have a smaller residual scatter in the Hubble diagram than SN with only optical data. The continued study of Type Ia SN in the near infrared will be important for improving their utility as precise and accurate cosmological distance indicators.
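The Gibbs-sampling structure described here, alternating draws of per-object parameters and population parameters, can be shown on a toy normal-normal hierarchy. This is not BayeSN itself, just a minimal analogue with known variances and invented data: each observation y_i has its own latent mean theta_i, and the theta_i share a population mean mu.

```python
import random

def gibbs_hierarchical(y, sigma2, tau2, n_iter=3000, burn=500, seed=1):
    # Gibbs sampler for:  y_i ~ N(theta_i, sigma2),  theta_i ~ N(mu, tau2),
    # flat prior on mu.  Alternates the two exact conditional draws, the
    # same pattern BayeSN uses over supernova and population parameters.
    rng = random.Random(seed)
    n = len(y)
    mu = sum(y) / n
    mu_draws = []
    for it in range(n_iter):
        # theta_i | mu, y_i : precision-weighted compromise of datum and mu
        prec = 1 / sigma2 + 1 / tau2
        theta = [rng.gauss((yi / sigma2 + mu / tau2) / prec,
                           (1 / prec) ** 0.5) for yi in y]
        # mu | theta : normal around the mean of the latent theta_i
        mu = rng.gauss(sum(theta) / n, (tau2 / n) ** 0.5)
        if it >= burn:
            mu_draws.append(mu)
    return mu_draws

y = [9.8, 10.2, 10.1, 9.9, 10.0]   # hypothetical observations
draws = gibbs_hierarchical(y, sigma2=0.04, tau2=0.04)
post_mean = sum(draws) / len(draws)
```

Because both full conditionals are Gaussian, each sweep is an exact draw; the retained `mu_draws` approximate the marginal posterior of the population mean, which is the sense in which the sampler "explores the global probability density".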
43. Portion of Construction Drawing 2042F17, entitled 100 ft. Roundhouse, ...
43. Portion of Construction Drawing 2042-F-17, entitled 100 ft. Roundhouse, Plan and Section of Typical Bay. Shown is a section showing soil bearing foundation and plan. (Original drawing, in the possession of Wyre Dick and Company, Livingston, New Jersey.) - Central Railroad of New Jersey, Engine Terminal, Jersey City, Hudson County, NJ
Talking Trash on the Internet: Working Real Data into Your Classroom.
ERIC Educational Resources Information Center
Lynch, Maurice P.; Walton, Susan A.
1998-01-01
Describes how a middle school teacher used the Chesapeake Bay National Estuarine Research Reserve in Virginia (CBNERRVA) Web site to provide scientific data for a unit on recycling. Includes sample data sheets and tables, charts results of a Web search for marine debris using different search engines, and lists selected marine data Web sites. (PEN)
LPT. Shield test facility assembly and test building (TAN646), south ...
LPT. Shield test facility assembly and test building (TAN-646), south facade. Camera facing north. High-bay section is pool room. Single-story section at right is control building (TAN-645). Small metal building is post-1970 addition. INEEL negative no. HD-40-7-3 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID
We investigated the effect of the thalassinid mud shrimp Upogebia pugettensis on organic matter and nutrient cycling on Idaho Flat, an intertidal flat in the Yaquina River estuary, Oregon. Field studies were conducted to measure carbon and nitrogen remineralization rates and bent...
STS-26 Discovery, OV-103, artwork showing TDRS-C deployment
1987-11-16
STS-26 Discovery, Orbiter Vehicle (OV) 103, artwork depicts tracking and data relay satellite C (TDRS-C) deployment. OV-103 orbits above Earth in bottom-to-sun attitude, moments after TDRS-C's release into space. TDRS-C is seen just below open payload bay (PLB). Artwork was done by Pat Rawlings of Eagle Engineering.
145. ARAIII Control building (ARA607) Sections. Shows highbay section of ...
145. ARA-III Control building (ARA-607) Sections. Shows high-bay section of building over crane rail and beam. Indicates materials of exterior siding. Aerojet-general 880-area/GCRE-607-A-11. Date: February 1958. Ineel index code no. 063-0607-00-013-102556. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID
Technology Transfer Summary Report (FY92), Naval Surface Warfare Center Dahlgren Division
1994-04-20
communications; no formal records are kept of these. Community Technical Outreach: NSWCDD participates in the "Science and Engineering Apprentice" and the "Bay ...
150. ARAIII Reactor building (ARA608) Sections. Show highbay section, heater ...
150. ARA-III Reactor building (ARA-608) Sections. Show high-bay section, heater stack, and depth of reactor, piping, and heater pits. Aerojet-general 880-area/GCRE-608-A-3. Date: February 1958. Ineel index code no. 063-0608-00-013-102613. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID
41. 'Firing Pier, Second Floor Plan, Section No. 2,' submitted ...
41. 'Firing Pier, Second Floor Plan, Section No. 2,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3867-46, Y&D Drawing 190841. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
42. 'Firing Pier, Second Floor Plan, Section No. 3,' submitted ...
42. 'Firing Pier, Second Floor Plan, Section No. 3,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3868-46, Y&D Drawing 190842. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
36. 'Firing Pier, First Floor Plan, Section No. 1,' submitted ...
36. 'Firing Pier, First Floor Plan, Section No. 1,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3862-46, Y&D Drawing 190836. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
43. 'Firing Pier, Third and Fourth Floors and Roof Plan,' ...
43. 'Firing Pier, Third and Fourth Floors and Roof Plan,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3869-46, Y&D Drawing 190843. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
38. 'Firing Pier, First Floor Plan, Section No. 3,' submitted ...
38. 'Firing Pier, First Floor Plan, Section No. 3,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3864-46, Y&D Drawing 190838. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
40. 'Firing Pier, Second Floor Plan, Section No. 1,' submitted ...
40. 'Firing Pier, Second Floor Plan, Section No. 1,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3866-46, Y&D Drawing 190840. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
30. 'Gould Island Facilities, General Plan,' submitted 29 December 1941 ...
30. 'Gould Island Facilities, General Plan,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3859-46, Y&D Drawing 190833. Scales 1' = 50' and 1' = 10'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
39. 'Firing Pier, First Floor Plan, Section No. 4,' submitted ...
39. 'Firing Pier, First Floor Plan, Section No. 4,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3865-46, Y&D Drawing 190839. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
37. 'Firing Pier, First Floor Plan, Section No. 2,' submitted ...
37. 'Firing Pier, First Floor Plan, Section No. 2,' submitted 29 December 1941 by John Brackett, Consulting Engineer, to Public Works Department, Bureau of Yards & Docks. PW Drawing 3863-46, Y&D Drawing 190837. Scale 1/4' = 1'. - Naval Torpedo Station, Firing Pier, North end of Gould Island in Narragansett Bay, Newport, Newport County, RI
STUDY AND DEVELOPMENT OF SHOP-CENTERED TEAM TEACHING FOR POTENTIAL HIGH SCHOOL DROP-OUTS.
ERIC Educational Resources Information Center
ODELL, WILLIAM R.
A rationale and procedure for the effective vocational education of low-achieving high school students was developed from an analysis of 13 high school programs in 10 San Francisco Bay Area school systems where the Richmond Pre-Engineering Technology Program was in operation. Experimental efforts were made to establish shop-centered team…
Storlazzi, Curt D.; Presto, M. Katherine; Logan, Joshua B.; Field, Michael E.
2010-01-01
High-resolution measurements of waves, currents, water levels, temperature, salinity, and turbidity were made in Maunalua Bay, southern Oahu, Hawaii, during the 2008-2009 winter to better understand coastal circulation, water-column properties, and sediment dynamics during a range of conditions (trade winds, kona storms, relaxation of trade winds, and south swells). A series of bottom-mounted instrument packages were deployed in water depths of 20 m or less to collect long-term, high-resolution measurements of waves, currents, water levels, temperature, salinity, and turbidity. These data were supplemented with a series of profiles through the water column to characterize the vertical and spatial variability in water-column properties within the bay. These measurements support the ongoing process studies being done as part of the U.S. Geological Survey (USGS) Coastal and Marine Geology Program's Pacific Coral Reef Project; the ultimate goal of these studies is to better understand the transport mechanisms of sediment, larvae, pollutants, and other particles in coral reef settings. Project Objectives The objective of this study was to understand the temporal variations in currents, waves, tides, temperature, salinity, and turbidity within a coral-lined embayment in Maunalua Bay that receives periodic discharges of freshwater and sediment from multiple terrestrial sources. Instrument packages were deployed for a three-month period during the 2008-2009 winter, and a series of vertical profiles were collected in November 2008 and again in February 2009 to characterize water-column properties within the bay. Measurements of flow and water-column properties in Maunalua Bay provided insight into the potential fate of terrestrial sediment, nutrients, or contaminants delivered to the marine environment and into coral larval transport within the embayment.
Such data are useful for providing baseline information for future watershed decisions and for establishing guidelines for the U.S. Coral Reef Task Force's (USCRTF) Hawaiian Local Action Strategy to address Land-Based Pollution (LAS-LBP) threats to coral reefs adjacent to the urbanized watersheds of Maunalua Bay. Study Area Maunalua Bay is on the south side of Oahu, Hawaii, and is approximately 10 km long and 3 km wide. The bay is flanked by two large, dormant craters: Koko Head to the east and Diamond Head to the west. Rainfall in the watersheds that drain into Maunalua Bay ranges from more than 200 cm/year at the top of the Ko'olau Range that borders the northwestern part of the bay to less than 70 cm/year to the east at Koko Head. Seven major channels flow into the bay, and all but one have been altered by engineering structures.
Environmental aspects of engineering geological mapping in the United States
Radbruch-Hall, Dorothy H.
1979-01-01
Many engineering geological maps at different scales have been prepared for various engineering and environmental purposes in regions of diverse geological conditions in the United States. They include maps of individual geological hazards and maps showing the effect of land development on the environment. An approach to assessing the environmental impact of land development that is used increasingly in the United States is the study of a single area by scientists from several disciplines, including geology. A study of this type has been made for the National Petroleum Reserve in northern Alaska. In the San Francisco Bay area, a technique has been worked out for evaluating the cost of different types of construction and land development in terms of the cost of a number of kinds of earth science factors. © 1979 International Association of Engineering Geology.
2007-09-20
Core components of the J-2X engine being designed for NASA's Constellation Program recently were installed on the A-1 Test Stand at NASA's Stennis Space Center near Bay St. Louis, Miss. Tests of the components, known as Powerpack 1A, will be conducted from November 2007 through February 2008. The Powerpack 1A test article consists of a gas generator and engine turbopumps originally developed for the Apollo Program that put Americans on the moon in the late 1960s and early 1970s. Engineers are testing these heritage components to obtain data that will help them modify the turbomachinery to meet the higher performance requirements of the Ares I and Ares V launch vehicles. The upcoming tests will simulate inlet and outlet conditions that would be present on the turbomachinery during a full-up engine hot-fire test.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillis, C C; Ostrach, D J; Gras, M
2006-06-14
Otolith Sr/Ca has become a popular tool for hindcasting habitat utilization and migration histories of euryhaline fish. It can readily identify habitat shifts of diadromous fish in most systems. Inferring movements of fish within estuarine habitat, however, requires a model that accounts for the local water chemistry and the response of individual species to that water chemistry, which is poorly understood. Modeling is further complicated by the fact that high marine Sr and Ca concentrations result in a rapid, nonlinear increase in water Sr/Ca and ⁸⁷Sr/⁸⁶Sr between fresh and marine waters. Here we demonstrate a novel method for developing a salinity-otolith Sr/Ca model for the purpose of reconstructing striped bass (Morone saxatilis) habitat use in the San Francisco Bay estuary. We used correlated Sr/Ca and ⁸⁷Sr/⁸⁶Sr ratio measurements from adult otoliths of striped bass that experienced a range of salinities to infer the striped bass otolith Sr/Ca response to changes in salinity and water Sr/Ca ratio. Otolith ⁸⁷Sr/⁸⁶Sr can be assumed to accurately record water ⁸⁷Sr/⁸⁶Sr because there is no biological fractionation of Sr isotopes. Water ⁸⁷Sr/⁸⁶Sr can in turn be used to estimate water salinity based on the mixing of fresh and marine water with known ⁸⁷Sr/⁸⁶Sr ratios. The relationship between adjacent Sr/Ca and ⁸⁷Sr/⁸⁶Sr analyses on otoliths by LA-ICP-MS and MC-ICP-MS (r² = 0.65, n = 66) is used to predict water salinity from a measured Sr/Ca ratio. The nature of this nonlinear model lends itself well to identifying residence in the Delta and, to a lesser extent, Suisun Bay, but does not do well locating residence within the more saline bays west of Carquinez Strait. An increase in the number of analyses would improve model confidence, but ultimately the precision of the model is limited by the variability in the response of individual fish to water Sr/Ca.
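The salinity estimation step described above rests on conservative two-endmember mixing: because seawater carries far more Sr than river water, the mixture's ⁸⁷Sr/⁸⁶Sr rises rapidly and nonlinearly with salinity, and the monotone curve can be inverted. The endmember Sr concentrations and the river ⁸⁷Sr/⁸⁶Sr value below are illustrative stand-ins, not values measured for the San Francisco Bay estuary.

```python
def water_sr_ratio(salinity, s_sea=35.0, c_fresh=0.0002, c_sea=0.0079,
                   r_fresh=0.7065, r_sea=0.70918):
    # Conservative mixing of Sr between a river endmember and seawater.
    # Concentrations (g/kg) and the river ratio are assumed toy values;
    # the seawater ratio 0.70918 is the well-known modern marine value.
    f = salinity / s_sea                    # fraction of seawater in the mix
    c_mix = f * c_sea + (1 - f) * c_fresh   # Sr mixes conservatively
    return (f * c_sea * r_sea + (1 - f) * c_fresh * r_fresh) / c_mix

def salinity_from_ratio(r_obs, tol=1e-10):
    # Invert the monotone mixing curve by bisection to recover salinity
    # from an observed (otolith-recorded) water 87Sr/86Sr ratio.
    lo, hi = 0.0, 35.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if water_sr_ratio(mid) < r_obs:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

With these endmembers most of the ratio change happens at low salinity, which mirrors the abstract's observation that the model resolves Delta and Suisun Bay residence well but saturates in the more saline bays.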