Transfer entropy as a log-likelihood ratio.
Barnett, Lionel; Bossomaier, Terry
2012-09-28
Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense. PMID:23030125
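The paper's central identity — that the log-likelihood ratio statistic for the null of zero transfer entropy equals 2N times the plug-in transfer-entropy estimate — can be checked numerically for a finite Markov chain. A minimal sketch (the toy chain and variable names are our own, not the paper's):

```python
import numpy as np
from collections import Counter

def te_and_llr(x, y):
    """Plug-in transfer entropy (nats) from y to x for first-order chains,
    together with the log-likelihood ratio statistic comparing the full
    model p(x_{t+1} | x_t, y_t) against the reduced model p(x_{t+1} | x_t)."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    singles = Counter(x[:-1])
    te, ll_full, ll_red = 0.0, 0.0, 0.0
    for (x1, x0, y0), c in triples.items():
        p_full = c / pairs_xy[(x0, y0)]             # MLE of p(x1 | x0, y0)
        p_red = pairs_xx[(x1, x0)] / singles[x0]    # MLE of p(x1 | x0)
        te += (c / n) * np.log(p_full / p_red)
        ll_full += c * np.log(p_full)
        ll_red += c * np.log(p_red)
    return te, 2.0 * (ll_full - ll_red)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5001)
x = np.empty(5001, dtype=int)
x[0] = 0
for t in range(1, 5001):
    # x copies y's previous state 70% of the time, so TE(y -> x) > 0
    x[t] = y[t - 1] if rng.random() < 0.7 else rng.integers(0, 2)

te, llr = te_and_llr(x, y)
```

Because the plug-in estimate is exactly the per-sample average of the log-likelihood ratio, the identity llr = 2N * te holds to floating-point precision, not just asymptotically.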
Fast inference in generalized linear models via expected log-likelihoods
Ramirez, Alexandro D.; Paninski, Liam
2015-01-01
Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting “expected log-likelihood” can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina. PMID:23832289
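The core trick — replacing the covariate sum in the nonlinear term with an expectation that has a closed form under Gaussian covariates — is easy to demonstrate for a canonical Poisson GLM. A toy sketch (assuming covariates X ~ N(0, C), for which E[exp(X·w)] = exp(wᵀCw/2); the data and dimensions are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 50_000, 5
C = np.eye(d)                    # covariate covariance, known to the experimenter
X = rng.multivariate_normal(np.zeros(d), C, size=n)
w_true = rng.normal(0.0, 0.2, d)
y = rng.poisson(np.exp(X @ w_true))

Xty = X.T @ y                    # sufficient statistic, computed once

def exact_ll(w):
    """Poisson GLM log-likelihood (up to the w-independent log y! term);
    the exp term is an O(n) sum that dominates the cost per evaluation."""
    return Xty @ w - np.exp(X @ w).sum()

def expected_ll(w):
    """Expected-log-likelihood approximation: the O(n) sum over exp(x_i . w)
    is replaced by n * E[exp(X . w)] = n * exp(w'Cw / 2), an O(d^2) term."""
    return Xty @ w - n * np.exp(w @ C @ w / 2.0)

a = exact_ll(w_true)
b = expected_ll(w_true)
```

The linear term is unchanged (it already reduces to the precomputed sufficient statistic Xᵀy); only the nonlinear normalizer is approximated, which is where the per-evaluation savings come from.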
Posterior propriety for hierarchical models with log-likelihoods that have norm bounds
Michalak, Sarah E.; Morris, Carl N.
2015-07-17
Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).
MICCLLR: Multiple-Instance Learning Using Class Conditional Log Likelihood Ratio
NASA Astrophysics Data System (ADS)
El-Manzalawy, Yasser; Honavar, Vasant
Multiple-instance learning (MIL) is a generalization of the supervised learning problem where each training observation is a labeled bag of unlabeled instances. Several supervised learning algorithms have been successfully adapted for the multiple-instance learning setting. We explore the adaptation of the Naive Bayes (NB) classifier and the utilization of its sufficient statistics for developing novel multiple-instance learning methods. Specifically, we introduce MICCLLR (multiple-instance class conditional log likelihood ratio), a method for mapping each bag of instances to a single meta-instance using class conditional log likelihood ratio statistics such that any supervised base classifier can be applied to the meta-data. The results of our experiments with MICCLLR using different base classifiers suggest that no single base classifier consistently outperforms other base classifiers on all data sets. We show that a substantial improvement in performance is obtained using an ensemble of MICCLLR classifiers trained using different base learners. We also show that an extra gain in classification accuracy is obtained by applying AdaBoost.M1 to weak MICCLLR classifiers. Overall, our results suggest that the predictive performance of the three proposed variants of MICCLLR is competitive with some of the state-of-the-art MIL methods.
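The bag-to-meta-instance mapping can be sketched with per-feature Gaussian class-conditional log-likelihood ratios averaged over a bag's instances. This is a simplified illustration of the idea; the paper's exact sufficient statistics and aggregation may differ:

```python
import numpy as np

def bag_to_meta_instance(bag, mu_pos, mu_neg, var=1.0):
    """Map a bag (k instances x d features) to one d-dimensional meta-instance
    by averaging per-feature class-conditional Gaussian log-likelihood ratios
    over the bag's instances. Any supervised base classifier can then be
    trained on the resulting meta-instances."""
    def log_gauss(x, mu):
        return -0.5 * np.log(2 * np.pi * var) - (x - mu) ** 2 / (2 * var)
    llr = log_gauss(bag, mu_pos) - log_gauss(bag, mu_neg)  # per instance, per feature
    return llr.mean(axis=0)                                # one meta-instance per bag

rng = np.random.default_rng(2)
mu_pos, mu_neg = np.ones(3), -np.ones(3)
pos_bag = rng.normal(mu_pos, 1.0, size=(40, 3))  # bag drawn from the positive class
neg_bag = rng.normal(mu_neg, 1.0, size=(40, 3))

meta_pos = bag_to_meta_instance(pos_bag, mu_pos, mu_neg)
meta_neg = bag_to_meta_instance(neg_bag, mu_pos, mu_neg)
```

In practice the class-conditional parameters would be estimated from the instances of labeled bags, NB-style, rather than assumed known as here.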
2. On-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Read, Randy J; McCoy, Airlie J
2016-03-01
The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517-525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124
An improved scheme based on log-likelihood-ratio for lattice reduction-aided MIMO detection
NASA Astrophysics Data System (ADS)
Song, Yunchao; Liu, Chen; Lu, Feng
2016-01-01
Lattice reduction (LR)-aided detectors have shown great potential in wireless communications for their low complexity and low bit-error-rate (BER) performance. The LR algorithms use a unimodular transformation to improve the orthogonality of the channel matrix. However, the LR algorithms only utilize the channel state information (CSI) and do not take account of the received signal, which is also important information for enhancing the performance of the detectors. In this paper, we readjust the received signal in the LR domain and propose a new scheme based on the log-likelihood-ratio (LLR) criterion to improve the LR-aided detectors. The motivation for using the LLR criterion is that it utilizes both the received signal and the CSI, so it can provide exact pairwise error probabilities (PEPs) of the symbols. In the proposed scheme, we design the LLR-based transformation algorithm (TA), which uses the unimodular transformation to minimize the PEPs of the symbols under the LLR criterion. Note that the PEPs of the symbols affect the error propagation in the vertical Bell Laboratories Layered Space-Time (VBLAST) detector, and decreasing the PEPs reduces this error propagation; thus, our LLR-based TA-aided VBLAST detectors exhibit better BER performance than previous LR-aided VBLAST detectors. Both the BER performance and the computational complexity are demonstrated through simulation results.
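The LLR criterion's use of both the received signal and the CSI can be illustrated with exact per-stream bit LLRs for a small BPSK MIMO system, computed by brute-force enumeration of candidate symbol vectors. This is a reference computation only, not the paper's lattice-reduction-aided transformation algorithm; the channel and noise values are invented:

```python
import itertools
import numpy as np

def bpsk_bit_llrs(y, H, sigma2):
    """Exact per-stream bit LLRs for BPSK over y = H s + n, n ~ N(0, sigma2 I),
    by enumerating all 2^Nt candidate vectors. Note the LLR depends on both
    the received vector y and the channel matrix (CSI) H."""
    nt = H.shape[1]
    cands = np.array(list(itertools.product([-1.0, 1.0], repeat=nt)))
    metrics = -np.sum((y - cands @ H.T) ** 2, axis=1) / (2.0 * sigma2)
    llrs = np.empty(nt)
    for k in range(nt):
        pos = metrics[cands[:, k] > 0]
        neg = metrics[cands[:, k] < 0]
        # stable log-sum-exp over each hypothesis set
        llrs[k] = (np.log(np.exp(pos - pos.max()).sum()) + pos.max()
                   - np.log(np.exp(neg - neg.max()).sum()) - neg.max())
    return llrs

H = np.array([[1.0, 0.2], [0.1, 1.0]])
s = np.array([1.0, -1.0])
y = H @ s                      # noiseless receive, for illustration
llrs = bpsk_bit_llrs(y, H, sigma2=0.1)
```

The sign of each LLR gives the hard decision, and its magnitude the reliability that detectors such as VBLAST can exploit when ordering cancellation.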
Geetha, Chinnaraj; Manjula, Puttabasappa
2014-03-01
The aim of the present study was to evaluate the use of the envelope difference index (EDI) and log-likelihood ratio (LLR) to quantify the independent and interactive effects of wide dynamic range compression, digital noise reduction and directionality, and to carry out self-rated quality measures. A recorded sentence embedded in speech spectrum noise at +5 dB signal to noise ratio was presented to a four channel digital hearing aid and the output was recorded with different combinations of algorithms at 30, 45 and 70 dB HL levels of presentation through a 2 cc coupler. EDI and LLR were obtained in comparison with the original signal using MATLAB software. In addition, thirty participants with normal hearing sensitivity rated the output on the loudness and clarity parameters of quality. The results revealed that the temporal changes at the output are independent of the number of algorithms activated together in a hearing aid. However, at a higher level of presentation, temporal cues are better preserved if all of these algorithms are deactivated. The spectral components of speech tend to be affected by the presentation level. The results also indicate the importance of quality rating, as this helps in judging whether the spectral and/or temporal deviations created in the hearing aid are desirable or not. PMID:26557357
Pastor, José V; Arrègle, Jean; García, José M; Zapata, L Daniel
2007-02-20
A methodology for processing images of diesel sprays under different experimental situations is presented. The new approach has been developed for cases where the background does not follow a Gaussian distribution but a positive bias appears. In such cases, the lognormal and the gamma probability density functions have been considered for the background digital level distributions. Two different algorithms have been compared with the standard log-likelihood ratio test (LRT): a threshold defined from the cumulative probability density function of the background shows a sensitive improvement, but the best results are obtained with modified versions of the LRT algorithm adapted to non-Gaussian cases. PMID:17279134
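The simpler of the compared approaches — a threshold set from the cumulative distribution of a positively biased, lognormal background — can be sketched in a few lines. The digital levels below are synthetic stand-ins for real spray images, and the 1e-3 exceedance probability is an assumed choice:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
background = rng.lognormal(mean=3.0, sigma=0.25, size=100_000)  # positively biased DLs
spray = rng.lognormal(mean=4.5, sigma=0.2, size=10_000)         # brighter spray pixels

# Fit the lognormal background in log space, then threshold at the digital
# level a background pixel exceeds with probability 1e-3.
log_bg = np.log(background)
mu, s = log_bg.mean(), log_bg.std()
threshold = np.exp(mu + s * NormalDist().inv_cdf(0.999))

false_positive_rate = (background > threshold).mean()
detection_rate = (spray > threshold).mean()
```

A likelihood-ratio-test segmentation would instead compare background and spray models pixel by pixel; the abstract reports that an LRT adapted to the non-Gaussian background performs best.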
KUPPER, Lawrence L.
2012-01-01
A common goal in environmental epidemiologic studies is to undertake logistic regression modeling to associate a continuous measure of exposure with binary disease status, adjusting for covariates. A frequent complication is that exposure may only be measurable indirectly, through a collection of subject-specific variables assumed associated with it. Motivated by a specific study to investigate the association between lung function and exposure to metal working fluids, we focus on a multiplicative-lognormal structural measurement error scenario and approaches to address it when external validation data are available. Conceptually, we emphasize the case in which true untransformed exposure is of interest in modeling disease status, but measurement error is additive on the log scale and thus multiplicative on the raw scale. Methodologically, we favor a pseudo-likelihood (PL) approach that exhibits fewer computational problems than direct full maximum likelihood (ML) yet maintains consistency under the assumed models without necessitating small exposure effects and/or small measurement error assumptions. Such assumptions are required by computationally convenient alternative methods like regression calibration (RC) and ML based on probit approximations. We summarize simulations demonstrating considerable potential for bias in the latter two approaches, while supporting the use of PL across a variety of scenarios. We also provide accessible strategies for obtaining adjusted standard errors to accompany RC and PL estimates. PMID:24027381
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
Loaiza-Brito, Oscar; Martin, Johannes; Nilles, Hans Peter; Ratz, Michael
2005-12-02
Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-
MAIL LOG, program theory, volume 2
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.
Experiment 2030. EE-2 Temperature Log and Downhole Water Sample
Grigsby, Charles O.
1983-07-29
A temperature log and downhole water sample run were conducted in EE-2 on July 13, 1983. The temperature log was taken to show any changes which had occurred in the fracture-to-wellbore intersections as a result of the Experiment 2020 pumping and to locate fluid entries for taking the water sample. The water sample was requested primarily to determine the arsenic concentration in EE-2 fluids (see memo from C. Grigsby, June 28, 1983, concerning arsenic in EE-3 samples). The temperature log was run using the thermistor in the ESS-6 water sampler.
Core log: Valles caldera No. 2A, New Mexico
Starguist, V.L.
1988-01-01
Scientific core hole VC-2A was drilled into the western ring-fracture zone at Sulphur Springs in the Valles caldera, New Mexico. VC-2A, the second scientific core hole in the caldera, was cored through a faulted and brecciated sequence of intracauldron tuffs and volcaniclastic rocks to a depth of 528 m. As of November 1, 1986, the unequilibrated bottom-hole temperature was 212/degree/C. The rocks penetrated are intensely altered and host sub-ore grade stockwork molybdenite mineralization between 25 and 125 m. This report contains a detailed core log to aid researchers in their studies of the Valles caldera magma hydrothermal system. 3 refs., 2 figs.
A note on G2 log-aesthetic curves
NASA Astrophysics Data System (ADS)
Wo, Mei Seen; Gobithaasan R., U.; Miura, Kenjiro T.; Abbas, Muhammad
2015-12-01
Log-aesthetic curve (LAC) is a curve family composed of transcendental curves that includes the logarithmic spiral, clothoid, circle involute and Nielsen's spiral. These curves have linear logarithmic curvature graphs (LCGs) and are highly aesthetic. In order to implement G2 LAC in industrial design successfully, one needs guidance on whether a LAC segment satisfying given G2 Hermite data exists and is unique. This paper shows the existence and uniqueness of the solution for a single-segment G2 LAC. A LAC equation that incorporates both start and end curvatures, and the end tangential angle, is first derived. Then, the end points of the LAC segments are calculated using the derived LAC equation, which also represents the solution region of LAC given a set of G2 Hermite data. The derived function is investigated for existence and uniqueness. It is shown that the solution region is a curve that does not self-intersect anywhere; thus, the solution of a single-segment G2 LAC is always unique.
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1981-01-01
A complete X-ray survey was performed of the 8.2 steradians of the sky at galactic latitudes where the absolute value of b exceeds 20 deg, down to a limiting sensitivity of 3.1 x 10^-11 ergs/sq cm sec in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 were identified with extragalactic objects, and 7 remain unidentified. The log N - log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N - log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.
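The "Euclidean relationship" the survey tests is N(>S) ∝ S^(-3/2), which follows for sources of fixed luminosity distributed uniformly in flat space (flux S ∝ r^-2, counts N ∝ r^3). A quick numerical check under exactly those assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
# Equal-luminosity sources uniform in a unit sphere: r^3 is uniform in [0, 1].
r = rng.uniform(0.0, 1.0, 200_000) ** (1.0 / 3.0)
S = r ** -2.0                                    # inverse-square flux

# Cumulative source counts N(>S) on a grid of flux cuts (all cuts >= 1 here,
# away from the sample's faint-end truncation at S = 1)
cuts = np.logspace(0.5, 1.5, 10)
counts = np.array([(S > c).sum() for c in cuts])
slope = np.polyfit(np.log(cuts), np.log(counts), 1)[0]   # expect about -1.5
```

Departures of the measured slope from -3/2 are what signal cosmological evolution or spatial inhomogeneity in a real source population.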
Maximum likelihood topographic map formation.
Van Hulle, Marc M
2005-03-01
We introduce a new unsupervised learning algorithm for kernel-based topographic map formation of heteroscedastic gaussian mixtures that allows for a unified account of distortion error (vector quantization), log-likelihood, and Kullback-Leibler divergence. PMID:15802004
Assessment of geophysical logs from borehole USW G-2, Yucca Mountain, Nevada
Nelson, P.H.; Schimschal, U.
1993-05-01
The commercial logging contractors Western Atlas, Schlumberger, and Edcon obtained borehole geophysical logs at the site of a potential high-level nuclear waste repository at Yucca Mountain, Nevada. Drill hole USW-G2 was selected for this test of suitable logging tools and state-of-the-art logging technology from these commercial companies. Experience gained from analysis of existing core data and a variety of logs obtained earlier by Birdwell and Dresser Atlas served as a guide in choosing the logs to be obtained. Logs were obtained in the water-filled borehole in zeolitized tuff (saturated zone) and in the air-filled borehole, largely in unaltered welded tuff (unsaturated zone).
Drovandi, Christopher C; McCutchan, Roy A
2016-06-01
In this article we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel exact-approximate algorithm, which we refer to as alive SMC2. The advantages of this approach over competing methods are that it is naturally adaptive, it does not involve between-model proposals required in reversible jump Markov chain Monte Carlo, and does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series, and the cumulative number of prion disease cases in mule deer. PMID:26584211
Terwilliger, Thomas C.
2001-01-01
The recently developed technique of maximum-likelihood density modification [Terwilliger (2000), Acta Cryst. D56, 965–972] allows a calculation of phase probabilities based on the likelihood of the electron-density map to be carried out separately from the calculation of any prior phase probabilities. Here, it is shown that phase-probability distributions calculated from the map-likelihood function alone can be highly accurate and that they show minimal bias towards the phases used to initiate the calculation. Map-likelihood phase probabilities depend upon expected characteristics of the electron-density map, such as a defined solvent region and expected electron-density distributions within the solvent region and the region occupied by a macromolecule. In the simplest case, map-likelihood phase-probability distributions are largely based on the flatness of the solvent region. Though map-likelihood phases can be calculated without prior phase information, they are greatly enhanced by high-quality starting phases. This leads to the technique of prime-and-switch phasing for removing model bias. In prime-and-switch phasing, biased phases such as those from a model are used to prime or initiate map-likelihood phasing, then final phases are obtained from map-likelihood phasing alone. Map-likelihood phasing can be applied in cases with solvent content as low as 30%. Potential applications of map-likelihood phasing include unbiased phase calculation from molecular-replacement models, iterative model building, unbiased electron-density maps for cases where 2Fo − Fc or σA-weighted maps would currently be used, structure validation and ab initio phase determination from solvent masks, non-crystallographic symmetry or other knowledge about expected electron density. PMID:11717488
List-Mode Likelihood: EM Algorithm and Image Quality Estimation Demonstrated on 2-D PET
Barrett, Harrison H.
2010-01-01
Using a theory of list-mode maximum-likelihood (ML) source reconstruction presented recently by Barrett et al. [1], this paper formulates a corresponding expectation-maximization (EM) algorithm, as well as a method for estimating noise properties at the ML estimate. List-mode ML is of interest in cases where the dimensionality of the measurement space impedes a binning of the measurement data. It can be advantageous in cases where a better forward model can be obtained by including more measurement coordinates provided by a given detector. Different figures of merit for the detector performance can be computed from the Fisher information matrix (FIM). This paper uses the observed FIM, which requires a single data set, thus avoiding costly ensemble statistics. The proposed techniques are demonstrated for an idealized two-dimensional (2-D) positron emission tomography (PET) detector. We compute from simulation data the improved image quality obtained by including the time of flight of the coincident quanta. PMID:9688154
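The list-mode EM update, and the monotone climb of the list-mode log-likelihood it guarantees, can be sketched for a one-dimensional toy detector. Unit per-bin sensitivities are assumed, and the toy geometry is ours; the paper's 2-D PET model adds further measurement coordinates such as time of flight:

```python
import numpy as np

def listmode_em(P, iters=40):
    """List-mode ML-EM with unit per-bin sensitivity. P[i, j] is the
    probability density of recording event i's attributes given an emission
    from source bin j. Returns intensity estimates and the log-likelihood
    trace, which EM guarantees is nondecreasing."""
    n_events, n_bins = P.shape
    lam = np.ones(n_bins)
    lls = []
    for _ in range(iters):
        rate = P @ lam                           # expected density per recorded event
        lls.append(np.log(rate).sum() - lam.sum())
        lam = lam * (P.T @ (1.0 / rate))         # multiplicative EM update
    return lam, np.array(lls)

rng = np.random.default_rng(5)
# Two point-like sources at -1 and +1 with 3:1 intensity, Gaussian blur 0.5
src = rng.choice([-1.0, 1.0], p=[0.75, 0.25], size=3000)
events = src + rng.normal(0.0, 0.5, size=src.size)

centers = np.array([-1.0, 1.0])
P = np.exp(-(events[:, None] - centers[None, :]) ** 2 / (2 * 0.5 ** 2))
P /= np.sqrt(2 * np.pi) * 0.5

lam, lls = listmode_em(P)
```

No binning of the event attributes is ever performed: each recorded event contributes its own row of P, which is the point of the list-mode formulation.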
Functional conservation between mammalian MGRN1 and plant LOG2 ubiquitin ligases
Guerra, Damian D.; Pratelli, Réjane; Kraft, Edward; Callis, Judy; Pilot, Guillaume
2014-01-01
Plant LOSS OF GDU 2 (LOG2) and mammalian MAHOGUNIN RING FINGER 1 (MGRN1) proteins are RING-type E3 ligases sharing similarity N-terminal to the RING domain. Deletion of this region disrupts the interaction of LOG2 with the plant membrane protein GLUTAMINE DUMPER 1 (GDU1). Phylogenetic analysis identified two clades of LOG2/MGRN1-like proteins in vertebrates and plants. The ability of MGRN1 to functionally replace LOG2 was tested. MGRN1 ubiquitylates GDU1 in vitro and can partially substitute for LOG2 in the plant, partially restoring amino acid resistance to a GDU1-myc over-expression, log2-1 background. Altogether, these results suggest a conserved function for the N-terminal domain in evolution. Structured summary of protein interactions: GDU1 physically interacts with rnMGRN1 by two hybrid (1, 2); rnMGRN1 binds to GDU1 by pull down; GDU6 physically interacts with rnMGRN1 by two hybrid (1, 2); GDU3 physically interacts with hsMGRN1 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU7 by two hybrid (1, 2); GDU5 physically interacts with hsMGRN1 by two hybrid (1, 2); hsMGRN1 physically interacts with GDU1 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU3 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU5 by two hybrid (1, 2); hsMGRN1 physically interacts with GDU7 by two hybrid (1, 2); GDU6 physically interacts with hsMGRN1 by two hybrid (1, 2); LOG2 binds to GDU1 by pull down (1, 2); hsMGRN1 physically interacts with GDU4 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU2 by two hybrid (1, 2); GDU4 physically interacts with rnMGRN1 by two hybrid (1, 2); LOG2 physically interacts with GDU1 by two hybrid (1, 2, 3, 4); GDU2 physically interacts with hsMGRN1 by two hybrid (1, 2). PMID:24036454
In situ fluid typing and quantification with 1D and 2D NMR logging.
Sun, Boqin
2007-05-01
In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to data acquisition and inversion algorithm enhancement of NMR logging tools. T(2) distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T(2) peaks arising from different fluids with similar apparent T(2) relaxation times. Nevertheless, the shapes of T(2) distributions from different fluid components are often different and can be predetermined. Inversion with predetermined T(2) distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which results in proton population distribution as a function of T(2) relaxation time and fluid diffusion coefficient (or T(1) relaxation time). Since diffusion coefficients (or T(1) relaxation time) for different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We shall describe these techniques with field examples. PMID:17466778
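With the component T2 shapes predetermined, fluid component decomposition reduces the ill-posed spectrum inversion to a small linear solve for the volume fractions. A noiseless sketch (the T2 grid and lognormal-shaped component peaks below are invented for illustration; real log data would need a regularized, non-negative inversion):

```python
import numpy as np

t2 = np.logspace(-3, 1, 64)                      # T2 relaxation-time grid, seconds

def component_shape(center, width):
    """Lognormal-shaped T2 peak, normalized to unit area on the grid."""
    p = np.exp(-(np.log(t2) - np.log(center)) ** 2 / (2 * width ** 2))
    return p / p.sum()

water = component_shape(0.05, 0.4)               # assumed predetermined shapes
oil = component_shape(0.5, 0.3)

A = np.column_stack([water, oil])                # fixed component dictionary
true_vols = np.array([0.7, 0.3])
measured = A @ true_vols                         # composite T2 distribution

# Fluid typing: solve the small least-squares problem for volume fractions
vols, *_ = np.linalg.lstsq(A, measured, rcond=None)
```

This is why overlapping T2 peaks are less damaging here than in free-form inversion: the shapes disambiguate what the peak positions alone cannot.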
The Phylogenetic Likelihood Library
Flouri, T.; Izquierdo-Carrasco, F.; Darriba, D.; Aberer, A.J.; Nguyen, L.-T.; Minh, B.Q.; Von Haeseler, A.; Stamatakis, A.
2015-01-01
We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2–10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). PMID:25358969
Use of well logs to characterize fluid flow in the Maljamar CO/sub 2/ Pilot
Albright, J.C.
1984-09-01
The Maljamar CO/sub 2/ Pilot in Lea County, New Mexico, is a 5-acre inverted five spot. Two zones are being flooded. They are a Grayburg dolomitic sand at 3,700 feet and a San Andres dolomite at 4,050 feet. Two logging observation wells, completed with fiberglass casing through the section of interest, are located in line with the center injector and one of the corner producers. Nine months of freshwater injection in the center well was followed by nine months of brine. A series of induction logs monitored the passing of the fresh water/brine interface providing data for a preliminary characterization of flow in the zones. The brine also established a uniform salinity at the observation wells for saturation determination. Gamma emitting tracers were injected into each zone of the center well as part of a well-to-well tracer study. Frequent gamma ray logs were run in the observation wells to see whether the movement of the tracers could be detected and used to characterize water movement. The results were very encouraging and provided better vertical and time resolution than the induction logs. The numerous responding layers in each zone could be classified by tracer arrival times into only a few basic types. Injection of CO/sub 2/ and follow-up brine has been monitored with a series of induction and neutron logs to follow the changes in water and CO/sub 2/ saturation as the flood progressed.
NASA Astrophysics Data System (ADS)
He, Yi; Liwo, Adam; Scheraga, Harold A.
2015-12-01
Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much information as possible about the original biomolecular system in its all-atom representation, but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.
Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods.
McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando
2010-01-01
Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown, and whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u that appears in the plane strain model cannot be measured; it is unreliably computed from measured data and can be shown to be an order-one quantity in units of kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first-order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from the Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349
The Likelihood Function and Likelihood Statistics
NASA Astrophysics Data System (ADS)
Robinson, Edward L.
2016-01-01
The likelihood function is a necessary component of Bayesian statistics but not of frequentist statistics. The likelihood function can, however, serve as the foundation for an attractive variant of frequentist statistics sometimes called likelihood statistics. We will first discuss the definition and meaning of the likelihood function, giving some examples of its use and abuse - most notably in the so-called prosecutor's fallacy. Maximum likelihood estimation is the aspect of likelihood statistics familiar to most people. When data points are known to have Gaussian probability distributions, maximum likelihood parameter estimation leads directly to least-squares estimation. When the data points have non-Gaussian distributions, least-squares estimation is no longer appropriate. We will show how the maximum likelihood principle leads to logical alternatives to least squares estimation for non-Gaussian distributions, taking the Poisson distribution as an example. The likelihood ratio is the ratio of the likelihoods of, for example, two hypotheses or two parameters. Likelihood ratios can be treated much like un-normalized probability distributions, greatly extending the applicability and utility of likelihood statistics. Likelihood ratios are prone to the same complexities that afflict posterior probability distributions in Bayesian statistics. We will show how meaningful information can be extracted from likelihood ratios by the Laplace approximation, by marginalizing, or by Markov chain Monte Carlo sampling.
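The Poisson example mentioned above can be made concrete. For a model y_i ~ Poisson(a·x_i), setting the derivative of the Poisson log-likelihood to zero gives a different closed-form estimator of a than ordinary least squares does; the data below are made up for illustration.

```python
# Illustration (not from the paper): fitting y_i ~ Poisson(a * x_i).
# The maximum-likelihood estimator of a differs from the least-squares one,
# showing why least squares is inappropriate for non-Gaussian data.
import numpy as np

x = np.array([1.0, 2.0, 5.0, 10.0])
y = np.array([4, 1, 3, 9])           # observed counts

# Poisson log-likelihood (dropping the constant log y! term):
#   sum_i [ y_i * log(a x_i) - a x_i ]
# Setting d/da = 0 gives the MLE in closed form.
a_mle = y.sum() / x.sum()

# Ordinary least squares for the same model minimizes sum_i (y_i - a x_i)^2.
a_lsq = (x * y).sum() / (x * x).sum()

print(a_mle, a_lsq)   # the two estimators disagree for Poisson data
```

The MLE weights all counts equally relative to exposure (sum of counts over sum of exposures), whereas least squares implicitly gives large-x points more influence; for Gaussian data the two coincide in spirit, for Poisson data they do not.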
Rolling Deck to Repository (R2R): A Research Cruise Prototype Event Logging System
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Dorsk, A.
2009-12-01
Data gathered aboard research vessels coordinated by the University-National Oceanographic Laboratory System (UNOLS) represent an important component of the overall oceanographic data collection. The nascent Rolling Deck to Repository (R2R) project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. The ultimate vision of R2R is to assist in transforming the academic fleet into an integrated global observing system. One of the coordinated subprojects within the R2R project is development of a shipboard, scientific event logging system that would incorporate best practice guidelines, a metadata schema and new and existing applications to generate a scientific sampling event log in the form of a digital text file. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the subsequent integration of data sets from individual investigators. A prototype event logger application, based on ELOG, has been developed and tested and results and lessons learned from this development effort will be shared.
Rolling Deck to Repository (R2R): Research Cruise Event Logging System Update
NASA Astrophysics Data System (ADS)
Maffei, A. R.; Chandler, C. L.; Stolp, L.
2010-12-01
Data gathered aboard research vessels coordinated by the University-National Oceanographic Laboratory System (UNOLS) represent an important component of the overall oceanographic data collection. The Rolling Deck to Repository (R2R) project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. The ultimate vision of R2R is to assist in transforming the academic fleet into an integrated global observing system. One of the coordinated subprojects within the R2R project is the development of a shipboard scientific event logging system that incorporates best practice guidelines, controlled vocabularies, a cruise metadata schema, and a scientific event log. The event log application will use new and existing components to generate a digital text file with a fleet-wide agreed upon format. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the ingestion of these data into oceanographic repositories and subsequent integration of data sets from individual investigators. An event logger application, based on ELOG, has been developed and is being tested by scientists at sea. An important aspect of this project is the development of a controlled vocabulary and ontology for documenting the scientific sampling events that occur during a research cruise. The controlled vocabulary development is being influenced by similar efforts in other communities. In addition, a Web application is being developed for oceanographers to use to pre-configure the R2R event logger application with vocabulary terms and entry forms appropriate to the specific needs of their upcoming research cruise. Results and lessons learned from the R2R event logger
ERIC Educational Resources Information Center
Fennell, Mary L.; And Others
This document is part of a series of chapters described in SO 011 759. This chapter reports the results of Monte Carlo simulations designed to analyze problems of using maximum likelihood estimation (MLE: see SO 011 767) in research models which combine longitudinal and dynamic behavior data in studies of change. Four complications--censoring of…
Glenn, W.E.; Hulen, J.B.; Nielson, D.L.
1981-02-01
Utah State Geothermal Well 9-1 in the Roosevelt Hot Springs KGRA, Beaver County, Utah, has been donated by Phillips Petroleum Company for calibration and testing of well-logging equipment in the hot, corrosive geothermal environment. It is the second Calibration/Test Well (C/T-2) in the Geothermal Log Interpretation Program. A study of cuttings and well logs from Well C/T-2 was completed. This synthesis and data presentation contains most of the subsurface geologic information needed for a complete evaluation of geophysical logs acquired in this geothermal calibration/test well, C/T-2.
Emulation of petroleum well-logging D-T2 correlations on a standard benchtop spectrometer.
Mitchell, J; Fordham, E J
2011-10-01
An experimental protocol is described that allows two-dimensional (2D) nuclear magnetic resonance (NMR) correlations of apparent diffusion coefficient D(app) and effective transverse relaxation time T(2,eff) to be acquired on a bench-top spectrometer using pulsed field gradients (PFG) in such a manner as to emulate D(app)-T(2,eff) correlations acquired using a well-logging tool with a fixed field gradient (FFG). This technique allows laboratory-scale NMR measurements of liquid-saturated cored rock to be compared directly to logging data obtained from the well by virtue of providing a comparable acquisition protocol and data format, and hence consistent data processing. This direct comparison supports the interpretation of the well-logging data, including a quantitative determination of the oil/brine saturation. The D-T(2) pulse sequence described here uses two spin echoes (2SE) with a variable echo time to encode for diffusion. The diffusion and relaxation contributions to the signal decay are then deconvolved using a 2D numerical inversion. This measurement allows shorter relaxation time components to be probed than in conventional diffusion measurements. A brief discussion of the numerical inversion algorithms available for inverting these non-rectangular data is included. The PFG-2SE sequence described is well suited to laboratory-scale studies of porous media and short T(2) samples in general. PMID:21875819
A new inversion method for (T2, D) 2D NMR logging and fluid typing
NASA Astrophysics Data System (ADS)
Tan, Maojin; Zou, Youlong; Zhou, Cancan
2013-02-01
One-dimensional nuclear magnetic resonance (1D NMR) logging technology has some significant limitations in fluid typing. However, not only can two-dimensional nuclear magnetic resonance (2D NMR) provide some accurate porosity parameters, but it can also identify fluids more accurately than 1D NMR. In this paper, based on the relaxation mechanism of (T2, D) 2D NMR in a gradient magnetic field, a hybrid inversion method that combines least-squares-based QR decomposition (LSQR) and truncated singular value decomposition (TSVD) is examined in the 2D NMR inversion of various fluid models. The forward modeling and inversion tests are performed in detail with different acquisition parameters, such as magnetic field gradients (G) and echo spacing (TE) groups. The simulated results are discussed and described in detail, the influence of the above-mentioned observation parameters on the inversion accuracy is investigated and analyzed, and the observation parameters in multi-TE activation are optimized. Furthermore, the hybrid inversion can be applied to quantitatively determine the fluid saturation. To study the effects of noise level on the hybrid method and inversion results, numerical simulation experiments are performed using different signal-to-noise ratios (SNRs), and the effect of the SNR on fluid typing with three fluid models is discussed and analyzed in detail.
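A minimal sketch of the TSVD ingredient of the hybrid inversion, reduced to a 1D T2 problem for brevity (the 2D (T2, D) case adds a diffusion kernel but truncates singular components the same way). The grids, truncation heuristic, and test distribution below are assumptions for illustration, not the paper's settings.

```python
# Truncated-SVD (TSVD) inversion of a noisy NMR echo decay, 1D version.
import numpy as np

t = np.arange(1, 201) * 1e-3              # echo times, s (TE = 1 ms assumed)
T2 = np.logspace(-3, 0, 40)               # candidate T2 bins, s
K = np.exp(-t[:, None] / T2[None, :])     # exponential-decay kernel

# Assumed test distribution: single log-Gaussian peak near T2 = 0.1 s.
f_true = np.exp(-0.5 * ((np.log10(T2) + 1.0) / 0.2) ** 2)
rng = np.random.default_rng(1)
d = K @ f_true + 0.01 * rng.standard_normal(t.size)   # noisy echo decay

# TSVD: discard singular components below a threshold (a heuristic here,
# assumed to be set from the data SNR) so noise is not amplified.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
k = int((s > 1e-3 * s[0]).sum())
f_est = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])

print(k, np.linalg.norm(K @ f_est - d))
```

The truncation level k trades resolution against noise amplification, which is why the paper pairs TSVD with LSQR and studies the inversion at several SNRs.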
Goodall, John
2012-05-21
Log files are typically semi-structured or unstructured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file based on regular expressions into structured data (JSON format); (3) Output: writes structured data into commonly used formats, including Redis (a database), standard output, and syslog.
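The Parser component described above can be sketched as a regular expression with named groups emitting JSON. The log format and field names here are hypothetical illustrations, not log-tool's actual configuration.

```python
# Sketch: regex-based parsing of an unstructured log line into JSON.
# The web-server-style format and field names are assumed for illustration.
import json
import re

LINE_RE = re.compile(
    r'(?P<host>\S+) - - \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\d+)'
)

def parse_line(line):
    """Parse one log line into a dict of named fields, or None on no match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

record = parse_line(
    '192.0.2.1 - - [21/May/2012:10:00:00 +0000] "GET / HTTP/1.1" 200 512'
)
print(json.dumps(record))
```

Routing the resulting JSON to a sink (Redis, stdout, syslog) is then a separate output stage, matching the three-component split in the description.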
The likelihood ratio and frequency of DQ2/DQ8 haplotypes in Iranian patients with celiac disease
Khosravi, Asghar; Mansouri, Masoume; Rostami-Nejad, Mohammad; Shahbazkhani, Bijan; Ekhlasi, Golnaz; Kalantari, Ebrahim
2016-01-01
Aim: The aim of this study was to evaluate the likelihood ratio and frequency of DQ2 and DQ8 in Iranian patients with celiac disease (CD). Background: HLA DQ2 and HLA DQ8 are important mediators in the development of celiac disease, but only a few studies, with small sample sizes, have evaluated the frequency of these haplotypes in the Iranian population. Patients and methods: In this cross-sectional study, 141 confirmed CD patients (73 male, 78 female) and 151 healthy controls were enrolled during 2013-2014. HLA DQ2/DQ8 haplotypes were determined in cases and controls using the PCR-SSP technique. Results: DQ2 and DQ8 were positive in 80% (n=111) and 49% (n=69) of CD patients, versus 36% (n=61) and 13% (n=21) of the control group, respectively. Moreover, 32% (n=45) of CD patients carried both haplotypes, compared with only 5.3% (n=8) of the control group. The positive likelihood ratios of DQ2 and DQ8 were 1.74 (CI: 1.4-2.1) and 2.6 (CI: 1.8-2.7), respectively. Conclusion: The frequency of DQ8 in our population is higher than that reported in European countries but close to frequencies found in South America and the Middle East, suggesting that the HLA pattern of Iranian CD patients resembles that of non-European rather than European patients. PMID:26744610
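For reference, the positive likelihood ratio used in studies like the one above is sensitivity / (1 - specificity). A sketch with hypothetical counts (not this study's data):

```python
# Positive likelihood ratio from a 2x2 diagnostic table.
# The counts below are hypothetical, not taken from the study.
def positive_likelihood_ratio(tp, fn, fp, tn):
    """LR+ = sensitivity / (1 - specificity)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity / (1.0 - specificity)

# Example: a marker positive in 80 of 100 patients and 30 of 100 controls.
print(positive_likelihood_ratio(tp=80, fn=20, fp=30, tn=70))
```

An LR+ near 1 means the marker barely shifts the odds of disease, which is why values such as 1.74 for DQ2 indicate a marker more useful for ruling out than ruling in disease.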
Augmented Likelihood Image Reconstruction.
Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M
2016-01-01
The presence of high-density objects remains an open problem in medical CT imaging. Data from projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim of reducing these artifacts by incorporating information about the shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The aforementioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporally appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction. PMID:26208310
2013-08-01
This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.
Paillet, Frederick L.; Morin, R.H.
1988-01-01
A complete suite of conventional geophysical well logs was obtained in the upper part of a 3220-m-deep borehole drilled into geothermally altered alluvial sediments on the southeastern edge of the Salton Sea. Geophysical logs obtained in the State 2-14 borehole indicate that neutron porosity, gamma-gamma, and deep-induction logs provide useful information on lithologic trends with depth. The natural gamma log contains almost continuous, high-frequency fluctuations that obscure lithologic trends and that may be related to recent radioisotope redistribution and departure from radiometric equilibrium. Acoustic transit time logs give unrealistically low in situ compressional velocities ranging from 1.8 to 3.0 km/s, whereas acoustic waveform logs indicate that sediment compressional velocities range from less than 3.0 km/s shallower than 1000 m in depth to almost 5.0 km/s at depths greater than 2000 m. Analyses indicate that most log values lie between two lithologic end points: an electrically conductive claystone with moderate neutron porosity, but no effective porosity, and an electrically nonconductive, fully cemented siltstone that has small but finite porosity. -from Authors
ERIC Educational Resources Information Center
Jones, S.; And Others
1997-01-01
Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…
Condition and fate of logged forests in the Brazilian Amazon
Asner, Gregory P.; Broadbent, Eben N.; Oliveira, Paulo J. C.; Keller, Michael; Knapp, David E.; Silva, José N. M.
2006-01-01
The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained. PMID:16901980
Model Fit after Pairwise Maximum Likelihood.
Barendse, M T; Ligtvoet, R; Timmerman, M E; Oort, F J
2016-01-01
Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations. PMID:27148136
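The pairwise-likelihood idea above, summing bivariate log-likelihoods instead of maximizing the full multivariate likelihood, can be sketched for the Gaussian case. The paper's discrete-data setting uses two-way contingency tables instead; this continuous illustration only shows the composite-likelihood construction.

```python
# Pairwise (composite) log-likelihood: sum the log-likelihoods of all
# bivariate margins. Data and model are a simple Gaussian illustration.
import numpy as np
from itertools import combinations
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
X = rng.multivariate_normal(np.zeros(3), Sigma, size=200)

def pairwise_loglik(X, Sigma):
    """Sum of bivariate Gaussian log-likelihoods over all variable pairs."""
    total = 0.0
    for i, j in combinations(range(X.shape[1]), 2):
        sub = Sigma[np.ix_([i, j], [i, j])]
        total += multivariate_normal(np.zeros(2), sub).logpdf(X[:, [i, j]]).sum()
    return total

# The generating covariance should score better than a mismatched one.
print(pairwise_loglik(X, Sigma) > pairwise_loglik(X, np.eye(3)))
```

Each pair contributes an easy low-dimensional term, which is the computational point of PML: the cost grows with the number of pairs rather than with the dimensionality of the full joint response pattern.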
LIKELIHOOD OF THE POWER SPECTRUM IN COSMOLOGICAL PARAMETER ESTIMATION
Sun, Lei; Wang, Qiao; Zhan, Hu
2013-11-01
The likelihood function is a crucial element of parameter estimation. In analyses of galaxy overdensities and weak lensing shear, one often approximates the likelihood of the power spectrum with a Gaussian distribution. The posterior probability derived from such a likelihood deviates considerably from the exact posterior on the largest scales probed by any survey, where the central limit theorem does not apply. We show that various forms of Gaussian likelihoods can have a significant impact on the estimation of the primordial non-Gaussianity parameter f_NL from the galaxy angular power spectrum. The Gaussian plus log-normal likelihood, which has been applied successfully in analyses of the cosmic microwave background, outperforms the Gaussian likelihoods. Nevertheless, even if the exact likelihood of the power spectrum is used, the estimated parameters may be still biased. As such, the likelihoods and estimators need to be thoroughly examined for potential systematic errors.
Burns, K.L.
1987-07-01
Patterns in reflected sonic intensity recognized during examination of televiewer logs of basement gneiss at the Hot Dry Rock Site, Fenton Hill, New Mexico, are due to geological fractures and foliations and to incipient breakouts. These features are obscured by artifacts caused by wellbore ellipticity, tool off-centering, and tool oscillations. An interactive method, developed for extraction of the structural features (fractures and foliations), uses human perception as a pattern detector and a chi-square test of harmonic form as a pattern discriminator. From imagery of GT-2, 733 structures were recovered. The acceptance rate of the discriminator was 54%. Despite these positive results, the general conclusion of this study is that intensity-mode imagery from Fenton Hill is not directly invertible for geological information because of the complexity of the televiewer imaging process. Developing a forward model of the intensity-imaging process, or converting to caliper-mode imagery, or doing both, will be necessary for high-fidelity feature extraction from televiewer data.
Preliminary observations and logs of BARB 1 and BARB 2: komatiites from the Tjakastad site
NASA Astrophysics Data System (ADS)
Coetzee, Grace; Arndt, Nicholas; Wilson, Allan
2013-04-01
The BARB 1 and BARB 2 cores intersect a suite of komatiite flows and komatiitic basalts as well as fragmental rocks of the Komati Formation of the Onverwacht Group, Barberton Greenstone Belt. The cores give important and previously unattainable information on the structures, textures and contact relationships between individual komatiite flows and different lithological units within the flows. BARB 1 was drilled at -48° on a 5° azimuth to a depth of 419.9 m. This core contains a unique volcanic tumulus succession in the stratigraphically lower 100 m and the rest of the core consists of about 59 flows of spinifex-textured komatiite (1-3 m thick), massive komatiite (0.5-10 m thick), komatiitic basalt (1-9 m thick) and a single basalt layer (10 m thick), intruded by gabbro (0.5-2 m thick) and a single dolerite dyke (18 m thick). BARB 2, approximately 50 m from BARB 1 and parallel to it, was drilled at -45° on an 8° azimuth to a depth of 431.5 m. This core contains approximately 39 flows of komatiite (0.5-10 m thick) and komatiitic basalt (2-23 m thick) which contain possible selvages of pillows. Basalt flows are more numerous (0.3-4 m thick) in BARB 2 whilst gabbro (0.6-7 m thick) is less prevalent. The dolerite dyke observed in BARB 1 does not occur in BARB 2. As the Barberton strata young towards the east, the cores intersected the stratigraphy in a reverse sequence. The cores were drilled such that there exists a 141 m overlap in stratigraphy between them. The section 141 m from the base of BARB 1 should theoretically correlate with the top 141 m of BARB 2. However, this overlap is not evident in the core or in the core logs. A single gabbro layer appears to be lithologically correlatable between both holes. There is no apparent correlation between the pattern of the komatiite flows leading to an initial conclusion that the komatiite flows were not laterally extensive or changed laterally in form over short distances. In both cores the proportion of komatiitic
CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs
Olea, R.A.
2004-01-01
The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams. © 2004 Elsevier Ltd. All rights reserved.
2012-05-21
Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g. intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file based on regular expressions into structured data (JSON format); (3) Output: writes structured data to commonly used formats, including Redis (a database), standard output, and syslog.
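The parse step described above can be sketched with the standard library; this is a minimal illustration, not log-tool's actual configuration, and the regular expression and field names are assumptions for a syslog-style line:

```python
import json
import re

# Hypothetical pattern for a syslog-style line; log-tool's real regexes
# are configuration-driven and not given in the abstract.
LINE_RE = re.compile(
    r"(?P<timestamp>\w{3}\s+\d+\s[\d:]+)\s"
    r"(?P<host>\S+)\s"
    r"(?P<process>[\w/.-]+):\s"
    r"(?P<message>.*)"
)

def parse_line(line):
    """Parse one log line into a dict of named fields, or None if it
    does not match the expected structure."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

record = parse_line("Mar  4 12:01:22 webhost sshd: Accepted publickey for alice")
print(json.dumps(record))  # structured output, one JSON object per line
```

In a pipeline like log-tool's, the JSON objects would then be routed to Redis, standard output, or syslog by the output stage.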
Lyle, W.D.
1989-03-07
This patent describes a method for conducting a gravimetry survey of an earth formation, comprising the steps of: continuously traversing the formation with a gravity logging tool having at least two piezoelectric force transducers mounted at spaced-apart positions within the tool, exciting the piezoelectric transducers to vibrate at a characteristic resonant frequency, measuring the periods of vibration of the piezoelectric transducers as the logging tool continuously traverses the formation, the periods of vibration changing in response to the force exerted on the piezoelectric transducer by the acceleration of gravity and acceleration due to tool motion along the formation, and determining the difference in the measured periods of vibration of the piezoelectric transducers, compensated for temperature, by relating force to the periods of vibration within the formation.
Tibshirani, R.J.
1984-12-01
In this work, we extend the idea of local averaging to likelihood-based regression models. One application is in the class of generalized linear models (Nelder and Wedderburn, 1972). We enlarge this class by replacing the covariate form xβ with an unspecified smooth function s(x). This function is estimated from the data by a technique we call Local Likelihood Estimation - a type of local averaging. Multiple covariates are incorporated through a forward stepwise algorithm. In a number of real data examples, the local likelihood technique proves to be effective in uncovering non-linear dependencies. Finally, we give some asymptotic results for local likelihood estimates and provide some methods for inference.
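The core idea can be illustrated in the simplest (locally constant, Bernoulli) case: estimate s(x0) by maximizing a kernel-weighted log-likelihood, whose maximizer is the logit of the weighted mean of the responses. This is a sketch under that simplifying assumption; the paper's method fits richer local models and multiple covariates:

```python
import math

def local_logit(x0, xs, ys, bandwidth):
    """Locally constant logistic estimate of s(x0): maximize the
    Gaussian-kernel-weighted Bernoulli log-likelihood.  For a constant
    local model the maximizer is logit of the weighted response mean."""
    weights = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in xs]
    p = sum(w * y for w, y in zip(weights, ys)) / sum(weights)
    p = min(max(p, 1e-12), 1 - 1e-12)  # guard against 0/1 boundary
    return math.log(p / (1 - p))
```

With all weights equal this reduces to the global intercept-only MLE, which is the sense in which local likelihood generalizes local averaging.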
Inductive resistivity logging in steel-cased boreholes. SBIR Phase 2 progress report
Wilt, M.
1999-07-01
SBIR Phase 2 project 40145-97-I calls for the design and construction of a prototype inductive logging device to measure formation resistivity from within a steel-cased borehole. The SCIL (Steel Casing Induction Logger) tool is intended for reservoir characterization and process monitoring in an oil field environment. This report summarizes findings from the initial project period. In this phase, bench model measurements were made to test casing compensation schemes, numerical models were calculated to optimize the tool configuration and associated formation sensitivity and the preliminary design of the tool was completed. The bench tests constitute fundamental research on determining the characteristics of steel well casing and on developing means of separating the effects of the casing and the formation. This technology is crucial to the success of the project and significant progress has been made towards the goal of recovering the formation resistivity from inside the casing. Next, a series of sensitivity and tool configuration studies have been completed through partner Dr. David Alumbaugh at Sandia National Laboratories. These numerical results help to optimize the tool configuration and allow one to calculate the expected formation sensitivity. These models are preliminary to data interpretation software to be developed in the next project period. The initial hardware design of the tool has been completed, and ordering parts has begun for later manufacture and assembly. The tool, which is designed for maximum flexibility of deployment, will have a powerful transmitter, an array of three component sensors and sufficient dynamic range to operate in standard oil field steel-cased boreholes.
McGee, Steven
2002-01-01
Likelihood ratios are one of the best measures of diagnostic accuracy, although they are seldom used, because interpreting them requires a calculator to convert back and forth between “probability” and “odds” of disease. This article describes a simpler method of interpreting likelihood ratios, one that avoids calculators, nomograms, and conversions to “odds” of disease. Several examples illustrate how the clinician can use this method to refine diagnostic decisions at the bedside.
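The calculation the article's shortcut replaces is the standard odds-form Bayes update; a minimal sketch (function name is illustrative):

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Exact Bayes update that bedside shortcuts approximate:
    convert probability to odds, multiply by the LR, convert back."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A pre-test probability of 25% and an LR of 10 yield about 77%.
print(round(post_test_probability(0.25, 10.0), 2))  # → 0.77
```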
Clow, G.D.; Saltus, R.W.; Waddington, E.D.
1996-01-01
We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.
Addendum 2: Logs of monitor wells drilled May 1988 through December 1992
Stout, J.; Qualheim, B.; McPherrin, R.; Barber, K.; Hedegaard, R.; McConihe, W.; Miller, T.
1993-11-01
The logs in this addendum were plotted in a new format by the same software package (LOGGER by Rockware, Denver, CO) that was used in the original publication. The scale remains the same, 1 inch = 15 feet. The header is totally automated with a subheading indexing the well-construction symbols. Geophysical curves are labeled in their respective channels, and percentage core recovery is plotted in a histogram. Lithologic symbols are plotted to scale in a channel similar to previous logs. The lithologic description also has been automated to assure consistency in terminology. Descriptions are more extensive and are referenced by leader lines to the lithologic symbol. Additional figures included for this Addendum are: a plot of all the monitoring well locations at the LLNL Main site and a plot detailing the gasoline spill area well locations in the vicinity of Building 403.
Whiley, Phillip J.; Parsons, Michael T.; Leary, Jennifer; Tucker, Kathy; Warwick, Linda; Dopita, Belinda; Thorne, Heather; Lakhani, Sunil R.; Goldgar, David E.; Brown, Melissa A.; Spurdle, Amanda B.
2014-01-01
Rare exonic, non-truncating variants in known cancer susceptibility genes such as BRCA1 and BRCA2 are problematic for genetic counseling and clinical management of relevant families. This study used multifactorial likelihood analysis and/or bioinformatically-directed mRNA assays to assess pathogenicity of 19 BRCA1 or BRCA2 variants identified following patient referral to clinical genetic services. Two variants were considered to be pathogenic (Class 5). BRCA1:c.4484G>C (p.Arg1495Thr) was shown to result in aberrant mRNA transcripts predicted to encode truncated proteins. The BRCA1:c.122A>G (p.His41Arg) RING-domain variant was found from multifactorial likelihood analysis to have a posterior probability of pathogenicity of 0.995, a result consistent with existing protein functional assay data indicating lost BARD1 binding and ubiquitin ligase activity. Of the remaining variants, seven were determined to be not clinically significant (Class 1), nine were likely not pathogenic (Class 2), and one was uncertain (Class 3). These results have implications for genetic counseling and medical management of families carrying these specific variants. They also provide additional multifactorial likelihood variant classifications as reference to evaluate the sensitivity and specificity of bioinformatic prediction tools and/or functional assay data in future studies. PMID:24489791
NASA Technical Reports Server (NTRS)
Peters, B. C., Jr.; Walker, H. F.
1975-01-01
A general iterative procedure is given for determining the consistent maximum likelihood estimates of normal distributions. In addition, a local maximum of the log-likelihood function, Newton's method, a method of scoring, and modifications of these procedures are discussed.
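For a one-parameter example of the scoring iteration mentioned above, consider the Bernoulli log-likelihood in the logit parameterization; this is a sketch, not the paper's normal-distribution procedure, and for this canonical parameterization the expected and observed information coincide, so Fisher scoring equals Newton's method:

```python
import math

def mle_logit_scoring(ys, theta=0.0, tol=1e-10, max_iter=50):
    """Fisher scoring for Bernoulli data with p = sigmoid(theta).
    Iterates theta <- theta + score / information; the fixed point
    is logit(mean(ys))."""
    n = len(ys)
    s = sum(ys)
    for _ in range(max_iter):
        p = 1.0 / (1.0 + math.exp(-theta))
        score = s - n * p             # d logL / d theta
        info = n * p * (1.0 - p)      # Fisher information
        step = score / info
        theta += step
        if abs(step) < tol:
            break
    return theta
```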
Grid Logging: Best Practices Guide
Tierney, Brian L; Tierney, Brian L; Gunter, Dan
2008-04-01
The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logging (Python) is being used. A custom logging API or even printf could also be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed, log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
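The five recommendations can be sketched with Python's standard logging module; the field names and logger name below are illustrative, not the guide's mandated schema:

```python
import logging
from datetime import datetime, timezone

# Sketch of the best practices: ISO-8601 high-resolution UTC timestamp,
# explicit level, and consistent key=value identifiers, emitted as
# newline-delimited ASCII text.
class UTCMicrosecondFormatter(logging.Formatter):
    def formatTime(self, record, datefmt=None):
        ts = datetime.fromtimestamp(record.created, tz=timezone.utc)
        return ts.isoformat(timespec="microseconds")

FMT = "ts=%(asctime)s level=%(levelname)s event=%(name)s %(message)s"

handler = logging.StreamHandler()
handler.setFormatter(UTCMicrosecondFormatter(FMT))
log = logging.getLogger("gridftp.transfer")   # category per component
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("job.id=42 host=node01 status=START")  # global + local identifiers
```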
Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data
King, K.C.
1979-01-01
The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well. The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map). Lithologic data are summarized from the sample descriptions of Smith and others (1976). Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool. Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
Atwood, J.W.; Killpack, T.J.; Glenn, W.E.; Nutter, C.
1980-11-01
WELLOG is a software system that has been developed to plot digitized well log data in a manner suitable for analysis. Multiple logs can be plotted side by side for correlation analysis, and up to three logs can be plotted on a cross plot. Data entry, editing, and modification functions are also provided by the program. Digitizing is accomplished by a TEKTRONIX 4954 (on-line) digitizing tablet, and plotting is done on a TEKTRONIX 4014 graphics terminal, a STATOS 42 electrostatic plotter, or a CALCOMP pen plotter using a device independent plotting system. This program (WELLOG.REV2) is not as system-dependent as the former version (WELLOG.REV1). The user must supply a program to digitize the data and supply subroutines to interface the program with file manipulation and plotting routines of their system. One major improvement is the use of an on-line digitizing system whereby the program accesses disk files rather than reading the data from tape. In REV2 the merge file has been automated such that the file is initialized automatically upon creation and also delete protected. The randomly spaced data capabilities have been greatly improved allowing the averaging and cross plotting of the data. Routines have been added which allow all of the cross plots excepting the Z-plot to be printed on a line printer. Dresser Atlas' A-K plot has also been added. The program is almost completely self-contained needing only a few interfacing and system subroutines.
Collett, T.S.
1999-01-01
The JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well project was designed to investigate the occurrence of in situ natural gas hydrate in the Mallik area of the Mackenzie Delta of Canada. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas-hydrate-bearing sediments. Downhole logging tool strings deployed in the Mallik 2L-38 well included the Schlumberger Platform Express with a high resolution laterolog, Array Induction Imager Tool, Dipole Shear Sonic Imager, and a Fullbore Formation MicroImager. The downhole log data obtained from the log- and core-inferred gas-hydrate-bearing sedimentary interval (897.25-1109.5 m log depth) in the Mallik 2L-38 well is depicted in a series of well displays. Also shown are numerous reservoir parameters, including gas hydrate saturation and sediment porosity log traces, calculated from available downhole well-log and core data. The gas hydrate accumulation delineated by the Mallik 2L-38 well has been determined to contain as much as 4.15 × 10^9 m3 of gas in the 1 km2 area surrounding the drill site.
An O(log^2 N) parallel algorithm for computing the eigenvalues of a symmetric tridiagonal matrix
NASA Technical Reports Server (NTRS)
Swarztrauber, Paul N.
1989-01-01
An O(log^2 N) parallel algorithm is presented for computing the eigenvalues of a symmetric tridiagonal matrix using a parallel algorithm for computing the zeros of the characteristic polynomial. The method is based on a quadratic recurrence in which the characteristic polynomial is constructed on a binary tree from polynomials whose degree doubles at each level. Intervals that contain exactly one zero are determined by the zeros of polynomials at the previous level which ensures that different processors compute different zeros. The exact behavior of the polynomials at the interval endpoints is used to eliminate the usual problems induced by finite precision arithmetic.
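The serial building block behind eigenvalue isolation for symmetric tridiagonal matrices is the Sturm-sequence count of eigenvalues below a point; a minimal sketch (this shows only the interval-isolation idea, not the paper's binary-tree polynomial construction or its parallel complexity):

```python
def count_below(diag, off, x):
    """Number of eigenvalues of the symmetric tridiagonal matrix with
    diagonal `diag` and off-diagonal `off` that are less than x, via
    the classic Sturm-sequence (LDL^T pivot sign count) recurrence."""
    count = 0
    q = 1.0
    for i, d in enumerate(diag):
        b2 = off[i - 1] ** 2 if i > 0 else 0.0
        # Guard an exactly-zero pivot with a tiny substitute value.
        q = (d - x) - (b2 / q if q != 0.0 else b2 / 1e-300)
        if q < 0.0:
            count += 1
    return count

# Eigenvalues of diag [2,2,2], off-diag [-1,-1] are 2-√2, 2, 2+√2.
print(count_below([2.0, 2.0, 2.0], [-1.0, -1.0], 2.5))  # → 2
```

Repeated bisection with this count brackets each eigenvalue in its own interval, which is what guarantees different processors compute different zeros.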
Likelihood and clinical trials.
Hill, G; Forbes, W; Kozak, J; MacNeill, I
2000-03-01
The history of the application of statistical theory to the analysis of clinical trials is reviewed. The current orthodoxy is a somewhat illogical hybrid of the original theory of significance tests of Edgeworth, Karl Pearson, and Fisher, and the subsequent decision theory approach of Neyman, Egon Pearson, and Wald. This hegemony is under threat from Bayesian statisticians. A third approach is that of likelihood, stemming from the work of Fisher and Barnard. This approach is illustrated using hypothetical data from the Lancet articles by Bradford Hill, which introduced clinicians to statistical theory. PMID:10760630
Likelihood Analysis for Mega Pixel Maps
NASA Technical Reports Server (NTRS)
Kogut, Alan J.
1999-01-01
The derivation of cosmological parameters from astrophysical data sets routinely involves operations counts which scale as O(N^3), where N is the number of data points. Currently planned missions, including MAP and Planck, will generate sky maps with N_d = 10^6 or more pixels. Simple "brute force" analysis, applied to such mega-pixel data, would require years of computing even on the fastest computers. We describe an algorithm which allows estimation of the likelihood function in the direct pixel basis. The algorithm uses a conjugate gradient approach to evaluate χ2 and a geometric approximation to evaluate the determinant. Monte Carlo simulations provide a correction to the determinant, yielding an unbiased estimate of the likelihood surface in an arbitrary region surrounding the likelihood peak. The algorithm requires O(N_d^(3/2)) operations and O(N_d) storage for each likelihood evaluation, and allows for significant parallel computation.
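The conjugate gradient step avoids forming the inverse covariance matrix: χ2 = d^T C^{-1} d is obtained by solving C x = d using only matrix-vector products with C. A minimal sketch (the paper's O(N_d^(3/2)) cost additionally assumes a fast method for applying C; here apply_C is an arbitrary callable):

```python
import numpy as np

def chi2_cg(apply_C, d, tol=1e-10, max_iter=1000):
    """Evaluate chi^2 = d^T C^{-1} d via conjugate-gradient solution
    of C x = d for symmetric positive-definite C, given only a
    function that applies C to a vector."""
    x = np.zeros_like(d)
    r = d - apply_C(x)          # residual
    p = r.copy()                # search direction
    rs = r @ r
    for _ in range(max_iter):
        Cp = apply_C(p)
        alpha = rs / (p @ Cp)
        x += alpha * p
        r -= alpha * Cp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d @ x

# Diagonal C = diag(2, 4), d = (2, 2): chi^2 = 4/2 + 4/4 = 3.
print(chi2_cg(lambda v: np.array([2.0, 4.0]) * v, np.array([2.0, 2.0])))
```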
Likelihood of meeting the EU limit values for NO 2 and PM 10 concentrations in the Netherlands
NASA Astrophysics Data System (ADS)
Velders, Guus J. M.; Diederen, Hub S. M. A.
In 2007, the European limit values for annual average nitrogen dioxide (NO 2) concentration and for daily average particulate matter (PM 10) concentration were exceeded along motorways and city streets in the Netherlands. While the road length along which the exceedance occurred is uncertain, model calculations show that the NO 2 concentration was likely to have been exceeded (chance >66%) along about 300 km and PM 10 concentration along about 75 km. In addition, the limit values were exceeded 'about as likely as not' (chance 33-66%) along a total of 1000 km for NO 2 and 1600 km for PM 10. PM 10 and NO 2 concentrations must be below the limit values everywhere in Europe, ultimately by 2011 and 2015, respectively. Since estimates of future local concentrations have an uncertainty of about 15-20%, no absolute statements can be made as to whether concentrations will be below the limit values within the specified time. Model calculations accounting for the effects of current and proposed national and European legislation, and using average meteorology for large-scale and local traffic contributions show strong decreases in likely limit value exceedances in the Netherlands. However, limit value exceedances are still possible (chance >33%) along about 350 km for PM 10 by 2011, and about 150 km for NO 2, by 2015. These possible exceedances depend not only on the uncertainties and on national and European policies and their effectiveness, but also on contributions by specific additional local measures. The Netherlands Government has proposed a plan, which includes local measures to meet the limit values everywhere, in time. Although not assessed here due to their specific character, such local measures could reduce exceedances. As the effects of local measures and estimates of concentrations are uncertain, continuous monitoring - possibly together with additional measures - will be needed to adhere to the limit values.
Real Time Mud Gas Logging During Drilling of DFDP-2B
NASA Astrophysics Data System (ADS)
Mathewson, L. A.; Toy, V.; Menzies, C. D.; Zimmer, M.; Erzinger, J.; Niedermann, S.; Cox, S.
2015-12-01
The Deep Fault Drilling Project (DFDP) aims to improve our understanding of the Alpine Fault Zone, a tectonically active mature fault system in New Zealand known to rupture in large events, by deep scientific drilling. The borehole DFDP-2B approached the Alpine Fault at depth, reaching a final depth of 892 m (820 m true vertical depth). Online gas analysis (OLGA) while drilling tracked changes in the composition of gases extracted from the circulating drill mud. The composition of fluids from fault zones can provide information about their origins, flow rates and -paths, fluid-rock interactions along these paths, and the permeability structure of the faulted rock mass. Apart from an atmospheric input, the gases in drilling mud derive from the pore space of rock, crushed at the drill bit, and from permeable layers intersected by the borehole. The rapid formation of mud wall cake seals the borehole from further fluid inflow, hence formation-derived gases enter mostly at the depth of the drill bit. OLGA analyses N2, O2, Ar, CO2, CH4, He, and H2 on a mass spectrometer, hydrocarbons CH4, C2H6, C3H8, i-C4H10, and n-C4H10 on a gas chromatograph, and Rn using a Lucas cell detector. Gas was sampled for offline analyses on noble gas and stable isotopes to complement the OLGA dataset. The principal formation-derived gases found in drilling mud during drilling of DFDP-2 were CO2 and CH4, with smaller components of H2 and He. High radon activity is interpreted to reflect intervals of active fluid flow through highly fractured and faulted rock. 3He/4He values in many samples were extremely air-contaminated, i.e. there was almost no excess of non-atmospheric He. The 3He/4He values measured at 236 m and 610 m, which are the only analyses with uncertainties <100%, are very similar to those measured in hot springs along the Alpine Fault, e.g. Fox River (0.64 Ra), Copland (0.42 Ra), Lower Wanganui (0.81 Ra). We will compare these data to those gathered using OLGA and discuss the
Hertel, Thomas W.; Ramankutty, Navin; Baldos, Uris Lantz C.
2014-01-01
There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing. PMID:25201962
2010-01-01
Background The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of a phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Results Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA) together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for parameters setting, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs on 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. Conclusions The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics into a single software will allow rigorous optimization of each heuristic as well as a meaningful comparison of performances among these algorithms. MetaPIGA v2
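The two information criteria used for automated substitution-model selection are simple functions of the maximized log-likelihood; a minimal sketch (lower values indicate the preferred model):

```python
import math

def aic(log_likelihood, k):
    """Akaike information criterion: 2k - 2 ln L, where k is the
    number of free model parameters."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian information criterion: k ln(n) - 2 ln L, where n is
    the sample size (e.g. alignment sites)."""
    return k * math.log(n) - 2 * log_likelihood

# With equal fit, BIC penalizes extra parameters more heavily than AIC
# whenever ln(n) > 2, i.e. n > e^2 ≈ 7.4:
print(aic(-1234.5, 10), bic(-1234.5, 10, 1000))
```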
Jerling, M; Merlé, Y; Mentré, F; Mallet, A
1994-01-01
Therapeutic drug monitoring data for nortriptyline (674 analyses from 578 patients) were evaluated with the nonparametric maximum likelihood (NPML) method in order to determine the population kinetic parameters of this drug and their relation to age, body weight and duration of treatment. Clearance of nortriptyline during monotherapy exhibited a large interindividual variability and a skewed distribution. A small, separate fraction with a very high clearance, constituting between 0.5% and 2% of the population, was seen in both men and women. This may be explained by the recent discovery of subjects with multiple copies of the gene encoding the cytochrome-P450-enzyme CYP2D6, which catalyses the hydroxylation of nortriptyline. However, erratic compliance with the prescription may also add to this finding. A separate distribution of low clearance values with a frequency corresponding to that of poor metabolizers of CYP2D6 (circa 7% in Caucasian populations) could not be detected. Concomitant therapy with drugs that inhibit CYP2D6 resulted in a major increase in the plasma nortriptyline concentrations. This was caused by a decrease in nortriptyline clearance, whereas the volume of distribution was unchanged. The demographic factors age and body weight had a minor influence on the clearance of nortriptyline which was also unaffected by the duration of treatment. PMID:7893588
NASA Astrophysics Data System (ADS)
Suh, Youngjoo; Kim, Hoirin
2014-12-01
In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
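The scoring rule being modified is a weighted sum of frame-level log-likelihoods across speaker models; a minimal sketch (in the paper the weights come from a discriminatively trained acoustic-phonetic classifier, whereas here they are supplied directly, and the GMM evaluation itself is omitted):

```python
def weighted_score(frame_loglikes, weights):
    """Utterance-level score as a weighted sum of per-frame
    log-likelihoods under one speaker's model."""
    return sum(w * ll for w, ll in zip(weights, frame_loglikes))

def identify(frame_loglikes_per_speaker, weights):
    """Pick the speaker whose weighted log-likelihood is largest.
    `frame_loglikes_per_speaker` maps speaker id -> per-frame scores."""
    scores = {spk: weighted_score(lls, weights)
              for spk, lls in frame_loglikes_per_speaker.items()}
    return max(scores, key=scores.get)

frames = {"A": [-1.0, -10.0], "B": [-5.0, -2.0]}
print(identify(frames, [1.0, 1.0]))  # → B  (uniform weights: baseline ML)
print(identify(frames, [1.0, 0.1]))  # → A  (down-weighting frame 2 flips it)
```

Uniform weights recover the maximum-likelihood baseline; the paper's contribution is choosing the weights discriminatively.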
Cetron, Martin
2016-03-18
Since May 2015, when Zika virus, a flavivirus transmitted primarily by Aedes aegypti mosquitoes, was reported in Brazil, the virus has rapidly spread across the Region of the Americas and the Caribbean. The association between maternal Zika virus infection and adverse fetal and reproductive outcomes, including microcephaly, prompted CDC to issue a Level 2 alert travel notice* for the 37 countries and U.S. territories (at the national and territorial level) that have reported recent Zika virus transmission as of March 11, 2016. In addition to mosquito bite precautions for all travelers, CDC advises that pregnant women postpone travel to affected countries and U.S. territories. Within a nation's borders, ecologic characteristics, which determine the distribution of mosquito vectors, can vary considerably. CDC conducted a spatial analysis, focusing on the probability of occurrence of Ae. aegypti, to support the demarcation for subnational travel alerts. Based on results of this analysis, travel that is limited to elevations higher than 2,000 m (6,562 ft) above sea level is considered to have minimal (approximately 1%) likelihood for mosquito-borne Zika virus transmission, even within countries reporting active transmission. Women who are pregnant should avoid travel to elevations <2,000 m in countries with active Zika virus transmission. PMID:26985965
SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping
NASA Technical Reports Server (NTRS)
Cowart, Hugh S.; Scott, David W.
2014-01-01
A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - A log keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment and notifications features similar to those found in Social Networking Systems (SNS), b) Cross-Log Communication via Social Techniques - A concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees, and c) Communications Dashboard (CommDash) - A MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses potential for using Cross-Log Communications in both MCC-H and/or POIC environments, and considers other ways for synergizing console applications.
Doughty, Christine; Takeuchi, Shinji; Amano, Kenji; Shimo, Michito; Tsang, Chin-Fu
2004-10-04
In the flowing fluid electric conductivity (FEC) logging method, wellbore fluid is replaced with de-ionized water, following which FEC profiles in the wellbore are measured at a series of times while the well is pumped at a constant rate. Locations where fluid enters the wellbore show peaks in the FEC logs, which may be analyzed to infer inflow strengths and salinities of permeable features intersected by the wellbore. In multi-rate flowing FEC logging, the flowing FEC logging method is repeated using two or more pumping rates, which enables the transmissivities and inherent pressure heads of these features to be estimated as well. We perform multi-rate FEC logging on a deep borehole in fractured granitic rock, using three different pumping rates. Results identify 19 hydraulically conducting fractures and indicate that transmissivity, pressure head, and salinity vary significantly among them. By using three pumping rates rather than the minimum number of two, we obtain an internal consistency check on the analysis that provides a measure of the uncertainty of the results. Good comparisons against static FEC profiles and against independent chemical, geological, and hydrogeological data have further enhanced confidence in the results of the multi-rate flowing FEC logging method.
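The multi-rate idea can be illustrated with a minimal sketch. Assuming a Darcy-like inflow model q = T·(h_f − h_w) for each conducting fracture, measurements at two pumping rates give a linear system in the transmissivity-like coefficient T and the inherent head h_f; a third rate over-determines the system, which is the internal consistency check the abstract mentions. All numbers and the linear inflow model below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical data for ONE conducting fracture: inflow strengths q (m^3/s)
# inferred from FEC peak growth at three pumping rates, and the drawn-down
# wellbore head h_w (m) during each test. Illustrative numbers only.
q = np.array([1.2e-5, 2.0e-5, 3.1e-5])
h_w = np.array([98.0, 95.0, 91.0])

# Inflow model q = T*(h_f - h_w), linear in the unknowns (a, T) with
# a = T*h_f, so fit q = a - T*h_w by least squares; three rates
# over-determine the 2-unknown system (the consistency check).
A = np.column_stack([np.ones_like(h_w), -h_w])
(a, T), *_ = np.linalg.lstsq(A, q, rcond=None)
h_f = a / T   # inherent pressure head of the fracture

print(f"transmissivity-like coefficient T = {T:.3e}, fracture head h_f = {h_f:.1f} m")
```

With consistent data the least-squares residual is near zero; a large residual for a fracture would flag an inconsistency among the three tests.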
Maximum-likelihood density modification
Terwilliger, Thomas C.
2000-01-01
A likelihood-based approach to density modification is developed that can be applied to a wide variety of cases where some information about the electron density at various points in the unit cell is available. The key to the approach consists of developing likelihood functions that represent the probability that a particular value of electron density is consistent with prior expectations for the electron density at that point in the unit cell. These likelihood functions are then combined with likelihood functions based on experimental observations and with others containing any prior knowledge about structure factors to form a combined likelihood function for each structure factor. A simple and general approach to maximizing the combined likelihood function is developed. It is found that this likelihood-based approach yields greater phase improvement in model and real test cases than either conventional solvent flattening and histogram matching or a recent reciprocal-space solvent-flattening procedure [Terwilliger (1999), Acta Cryst. D55, 1863-1871]. PMID:10944333
Toyooka, Kiminori; Sato, Mayuko; Wakazaki, Mayumi; Matsuoka, Ken
2016-03-01
We developed a wide-range and high-resolution transmission electron microscope acquisition system and obtained giga-pixel images of tobacco BY-2 cells during the log and stationary phases of cell growth. We demonstrated that the distribution and ultrastructure of compartments involved in membrane traffic (i.e., Golgi apparatus, multivesicular body, and vesicle cluster) change during the log-to-stationary transition. Mitochondria, peroxisomes, and plastids were also enumerated. Electron densities of mitochondria and peroxisomes were altered during the growth-phase shift, while their numbers were reduced by nearly half. Plastid structure dramatically changed from atypical to spherical with starch granules. Nearly the same number of plastids was observed in both log and stationary phases. These results indicate that mechanisms regulating organelle populations differ from organelle to organelle. PMID:26855065
Likelihood alarm displays. [for human operator
NASA Technical Reports Server (NTRS)
Sorkin, Robert D.; Kantowitz, Barry H.; Kantowitz, Susan C.
1988-01-01
In a likelihood alarm display (LAD) information about event likelihood is computed by an automated monitoring system and encoded into an alerting signal for the human operator. Operator performance within a dual-task paradigm was evaluated with two LADs: a color-coded visual alarm and a linguistically coded synthetic speech alarm. The operator's primary task was one of tracking; the secondary task was to monitor a four-element numerical display and determine whether the data arose from a 'signal' or 'no-signal' condition. A simulated 'intelligent' monitoring system alerted the operator to the likelihood of a signal. The results indicated that (1) automated monitoring systems can improve performance on primary and secondary tasks; (2) LADs can improve the allocation of attention among tasks and provide information integrated into operator decisions; and (3) LADs do not necessarily add to the operator's attentional load.
ERIC Educational Resources Information Center
Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.
2014-01-01
The authors analyze the effectiveness of the R² and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…
A hybrid likelihood algorithm for risk modelling.
Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D
1995-03-01
The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
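The L = L1 + L0 partitioning described above can be sketched for a toy cohort: L1 sums the log hazard at each event, and L0 subtracts the integral of each person's hazard over their time at risk. The constant-hazard model with a linear excess relative risk in dose below is a generic illustration, not EPI-CURE's actual parameterization.

```python
import numpy as np

# Hypothetical constant-hazard model: hazard = lam0 * (1 + beta * dose).
lam0, beta = 0.001, 0.05

# (dose, follow-up in years, tumour observed?) for each cohort member
cohort = [(10.0, 30.0, True), (0.0, 40.0, False),
          (25.0, 20.0, True), (5.0, 35.0, False)]

def log_likelihood(lam0, beta, cohort):
    L1 = 0.0  # contribution of the 'events' (tumours)
    L0 = 0.0  # contribution of the event-free periods of observation
    for dose, years, tumour in cohort:
        h = lam0 * (1.0 + beta * dose)   # person's hazard rate
        if tumour:
            L1 += np.log(h)
        L0 -= h * years                  # integral of the hazard over time at risk
    return L1 + L0

print(log_likelihood(lam0, beta, cohort))
```

The cost of L1 scales with the number of events only; the paper's point is that L0, naively a sum over every person-period, can be reduced so that its cost is also independent of cohort size.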
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex
2012-01-01
LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
The dud-alternative effect in likelihood judgment.
Windschitl, Paul D; Chambers, John R
2004-01-01
The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged likelihood of a focal outcome. This dud-alternative effect was detected for judgments involving uncertainty about trivia facts and stochastic events. Nonnumeric likelihood measures and betting measures reliably detected the effect, but numeric likelihood measures did not. Time pressure increased the magnitude of the effect. The results were consistent with a contrast-effect account: The inclusion of duds increases the perceived strength of the evidence for the focal outcome, thereby affecting its judged likelihood. PMID:14736307
High-temperature scintillation properties of orthorhombic Gd2Si2O7 aiming at well logging
NASA Astrophysics Data System (ADS)
Tsubota, Youichi; Kaneko, Junichi H.; Higuchi, Mikio; Nishiyama, Shusuke; Ishibashi, Hiroyuki
2015-06-01
Scintillation and luminescence properties of orthorhombic Gd2Si2O7:Ce (GPS:Ce) single-crystal scintillators were investigated for temperatures ranging from room temperature (RT) to 573 K. Orthorhombic GPS crystals were grown by using a top-seeded solution growth (TSSG) method. The scintillation light yield of the orthorhombic GPS at RT was ∼2.9 times higher than that of Gd2SiO5:Ce (GSO). The light yield values of the orthorhombic GPS (Ce = 2.5%) were almost unchanged for temperatures ranging from RT to 523 K, and at 523 K, were higher than twice the light yield of GSO at RT. These GPS scintillators are expected to contribute to oil exploration at greater depths.
NASA Astrophysics Data System (ADS)
Shang, Yilun
2016-08-01
The complexity of a network crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.
Selective Logging, Fire, and Biomass in Amazonia
NASA Technical Reports Server (NTRS)
Houghton, R. A.
1999-01-01
Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.
Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst. Phys. /Argonne /Barcelona, IFAE /Baylor U. /INFN, Bologna /Bologna U. /Brandeis U. /UC, Davis /UCLA /UC, San Diego /UC, Santa Barbara
2005-12-01
This report describes a measurement of the top quark mass, M_top, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (ttbar) pairs in proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb^-1. They use the ttbar candidates in the "lepton+jets" decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M_top per unit phase-space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using the 63 ttbar candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2 +2.6/-2.4 (stat.) +/- 3.2 (syst.) GeV/c^2, or 173.2 +4.1/-4.0 GeV/c^2.
NASA Astrophysics Data System (ADS)
Reid, Beth A.
2013-06-01
This software computes likelihoods for the Luminous Red Galaxies (LRG) data from the Sloan Digital Sky Survey (SDSS). It includes a patch to the existing CAMB software (the February 2009 release) to calculate the theoretical LRG halo power spectrum for various models. The code is written in Fortran 90 and has been tested with the Intel Fortran 90 and GFortran compilers.
NASA Astrophysics Data System (ADS)
Cherkasova, T.; Timkin, T.; Savinova, O.
2015-02-01
Nanomineralogical investigation of ore minerals in metasomatites (garnet skarns) of the Ribny Log-2 gold ore in the Topolninsk ore deposit (Gorny Altai) revealed native silicon (Si) impurities of 1-5 nm within grains of native bismuth (Bi). Polished sections were examined using a Tescan Vega 3 scanning electron microscope (SEM) with an Oxford energy-dispersive spectrometer at the Department of Geology and Mineral Exploration, Institute of Natural Resources, Tomsk Polytechnic University.
Approximate maximum likelihood estimation of scanning observer templates
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Samuelson, Frank W.; Wunderlich, Adam; Popescu, Lucretiu M.; Eckstein, Miguel P.; Boone, John M.
2015-03-01
In localization tasks, an observer is asked to give the location of some target or feature of interest in an image. Scanning linear observer models incorporate the search implicit in this task through convolution of an observer template with the image being evaluated. Such models are becoming increasingly popular as predictors of human performance for validating medical imaging methodology. In addition to convolution, scanning models may utilize internal noise components to model inconsistencies in human observer responses. In this work, we build a probabilistic mathematical model of this process and show how it can, in principle, be used to obtain estimates of the observer template using maximum likelihood methods. The main difficulty of this approach is that a closed-form probability distribution for a maximal location response is not generally available in the presence of internal noise. However, for a given image we can generate an empirical distribution of maximal locations using Monte Carlo sampling. We show that this probability is well approximated by applying an exponential function to the scanning template output. We also evaluate log-likelihood functions on the basis of this approximate distribution. Using 1,000 trials of simulated data as a validation test set, we find that a plot of the approximate log-likelihood function along a single parameter related to the template profile achieves its maximum value near the true value used in the simulation. This finding holds regardless of whether the trials are correctly localized or not. In a second validation study evaluating a parameter related to the relative magnitude of internal noise, only the incorrectly localized trials produce a maximum in the approximate log-likelihood function that is near the true value of the parameter.
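A minimal sketch of the kind of approximation described here: applying an exponential (softmax-style) function to the template outputs, P(choose location j) ∝ exp(α·s_j), and scanning the approximate log-likelihood over the parameter α. The simulated data, the Gaussian internal-noise model, and the single parameter α are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scanning-template outputs s[trial, location], e.g. the
# cross-correlation of a template with each image at 50 candidate locations.
n_trials, n_locs = 1000, 50
s = rng.normal(size=(n_trials, n_locs))
s[:, 7] += 1.5                                  # location 7 holds the target on average

# Simulated observer: picks the max of template output plus internal noise.
internal_noise = rng.normal(size=s.shape)
responses = (s + internal_noise).argmax(axis=1)

def approx_log_likelihood(alpha, s, responses):
    """Softmax approximation: P(choose j) = exp(alpha*s_j) / sum_k exp(alpha*s_k)."""
    z = alpha * s
    z = z - z.max(axis=1, keepdims=True)        # numerical stability
    logp = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return float(logp[np.arange(len(responses)), responses].sum())

# The approximate log-likelihood peaks at a finite alpha, because internal
# noise keeps some responses away from the argmax of the template output.
alphas = np.linspace(0.1, 5.0, 50)
lls = [approx_log_likelihood(a, s, responses) for a in alphas]
alpha_hat = alphas[int(np.argmax(lls))]
print(alpha_hat)
```

Without internal noise the responses would equal argmax(s) and the likelihood would grow monotonically in α; the interior maximum only appears once responses deviate from the template's own maximum, mirroring the role internal noise plays in the paper.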
Well log and 2D seismic data character of the Wilcox Group in south-central Louisiana
Enomoto, Catherine B.
2014-01-01
The Wilcox Group is productive in updip areas of Texas and Louisiana from fluvial, deltaic, and near-shore marine shelf sandstones. The reported presence of porous sandstones at 29,000 feet within the Wilcox Group containing about 200 feet of gas in the Davy Jones 1 discovery well in the offshore Louisiana South Marsh Island area illustrates a sand-rich system developed during the Paleocene and early Eocene. This study describes some of the well log and reflection seismic data characteristics of the slope and basin-floor reservoirs with gas-discovery potential that may be in the area between the producing trend onshore Louisiana and the offshore discovery.
Mail LOG: Program operating instructions
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.
CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS
Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...
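The CeffT10 method's arithmetic can be sketched in a few lines, assuming the usual SWTR CT convention: CT = (reaction zone outlet residual) × (T10 contact time), and log inactivation credit = CT divided by the CT required per log. Every number below, including the per-log CT requirement, is a hypothetical placeholder rather than a value from the rule's tables.

```python
# Hypothetical inputs for an ozone contactor reaction zone:
c_eff = 0.25               # outlet ozone residual, mg/L
t10 = 4.0                  # T10 contact time (time for 10% tracer passage), minutes
ct_required_per_log = 0.48 # assumed table value for the target organism/temperature

ct = c_eff * t10                       # CT in mg*min/L
log_credit = ct / ct_required_per_log  # log inactivation credit
print(f"CT = {ct:.2f} mg*min/L -> {log_credit:.2f}-log inactivation credit")
```

In practice the per-log CT requirement is read from the rule's tables as a function of organism and water temperature, and segment credits are summed across reaction zones.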
Contrastive Pessimistic Likelihood Estimation for Semi-Supervised Classification.
Loog, Marco
2016-03-01
Improvement guarantees for semi-supervised classifiers can currently only be given under restrictive conditions on the data. We propose a general way to perform semi-supervised parameter estimation for likelihood-based classifiers for which, on the full training set, the estimates are never worse than the supervised solution in terms of the log-likelihood. We argue, moreover, that we may expect these solutions to really improve upon the supervised classifier in particular cases. In a worked-out example for LDA, we take it one step further and essentially prove that its semi-supervised version is strictly better than its supervised counterpart. The two new concepts that form the core of our estimation principle are contrast and pessimism. The former refers to the fact that our objective function takes the supervised estimates into account, enabling the semi-supervised solution to explicitly control the potential improvements over this estimate. The latter refers to the fact that our estimates are conservative and therefore resilient to whatever form the true labeling of the unlabeled data takes on. Experiments demonstrate the improvements in terms of both the log-likelihood and the classification error rate on independent test sets. PMID:27046491
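The two ingredients named in the abstract, contrast and pessimism, can be shown in a one-parameter toy: estimating a class prior p with labeled counts plus unlabeled points whose soft labels are chosen adversarially. This is an illustrative reduction, not the paper's LDA case; all counts are made up.

```python
import numpy as np

# Toy contrastive pessimistic estimate of a single parameter: the class-1
# prior p of a two-class problem.
n1, n0 = 30, 10          # labeled examples per class
n_unlabeled = 60

p_sup = n1 / (n1 + n0)   # supervised maximum likelihood estimate

def pessimistic_contrast(p):
    d1 = np.log(p) - np.log(p_sup)          # per-point gain if labeled class 1
    d0 = np.log(1 - p) - np.log(1 - p_sup)  # per-point gain if labeled class 0
    contrast = n1 * d1 + n0 * d0            # labeled part of LL(p) - LL(p_sup)
    # Pessimism: each unlabeled point takes the adversarial (worst-case) soft
    # label, i.e. whichever endpoint contribution is smaller.
    return contrast + n_unlabeled * min(d1, d0)

grid = np.linspace(0.01, 0.99, 981)
values = [pessimistic_contrast(p) for p in grid]
p_cple = grid[int(np.argmax(values))]
print(p_cple, max(values))
```

In this one-parameter toy the maximized contrast is zero at the supervised estimate, which exhibits the "never worse" guarantee in its degenerate form; the paper's contribution is that for richer models such as LDA the semi-supervised solution can be strictly better, not merely no worse.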
Scandurra, D; Albertini, F; van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A
2016-02-01
Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam and at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within +/- 1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field. PMID:26767316
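The pass-rate metric used above (percentage of voxels in the log-file dose distribution within +/-1% of the nominal dose) is straightforward to compute on a voxel grid. The sketch below uses synthetic dose arrays with made-up error levels, not clinical data.

```python
import numpy as np

# Hypothetical nominal (planned) and log-file-reconstructed dose grids in Gy;
# the "reconstruction" here is just the nominal dose with small random errors.
rng = np.random.default_rng(1)
nominal = np.full((20, 20, 20), 2.0)
reconstructed = nominal * (1.0 + rng.normal(0.0, 0.004, nominal.shape))

# Pass criterion from the abstract: voxel dose within +/-1% of nominal.
mask = nominal > 0                      # restrict to voxels receiving dose
rel_err = np.abs(reconstructed[mask] - nominal[mask]) / nominal[mask]
pass_rate = 100.0 * float(np.mean(rel_err <= 0.01))
print(f"{pass_rate:.1f}% of voxels within +/-1% of nominal dose")
```

A per-field pass rate computed this way can then be correlated against machine parameters such as spot position error, as the authors do for beam position accuracy.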
NASA Astrophysics Data System (ADS)
Higashinaka, Ryuji; Asano, Takuya; Nakashima, Takuya; Fushiya, Kengo; Mizuguchi, Yoshikazu; Miura, Osuke; Matsuda, Tatsuma D.; Aoki, Yuji
2015-02-01
The low-temperature properties of CeOBiS2 single crystals are studied by electrical resistivity, magnetization, and specific heat measurements. Ce 4f-electrons are found to be in a well-localized state split by crystalline-electric-field (CEF) effects. The CEF ground state is a pure Jz = ±1/2 doublet, and excited doublets are located far above. At low temperatures in zero field, we observe pronounced -log T divergence in the specific heat, revealing the presence of quantum critical fluctuations of 4f magnetic moments near a quantum critical point (QCP). Considering that CeOBiS2 is a nonmetal, this phenomenon cannot be attributed to the competition between Kondo and the Ruderman-Kittel-Kasuya-Yosida (RKKY) interactions as in numerous f-electron-based strongly correlated metals, indicating an unconventional mechanism. We suggest that CeOBiS2 is the first material found to be located at a QCP among geometrically frustrated nonmetallic magnets.
Postfire logging in riparian areas.
Reeves, Gordon H; Bisson, Peter A; Rieman, Bruce E; Benda, Lee E
2006-08-01
We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend to rebound relatively quickly, usually within a decade after a wildfire. Additionally, fire and subsequent erosion events contribute wood and coarse sediment that can create and maintain productive aquatic habitats over time. The potential effects of postfire logging in riparian areas depend on the landscape context and disturbance history of a site; however, available evidence suggests two key management implications: (1) fire in riparian areas creates conditions that may not require intervention to sustain the long-term productivity of the aquatic network and (2) protection of burned riparian areas gives priority to what is left rather than what is removed. Research is needed to determine how postfire logging in riparian areas has affected the spread of invasive species and the vulnerability of upland forests to insect and disease outbreaks and how postfire logging will affect the frequency and behavior of future fires. The effectiveness of using postfire logging to restore desired riparian structure and function is therefore unproven, but such projects are gaining interest with the departure of forest conditions from those that existed prior to timber harvest, fire suppression, and climate change. In the absence of reliable information about the potential consequences of postfire timber harvest, we conclude that providing postfire riparian zones with the same environmental protections they received before they burned is justified ecologically. Without a commitment to monitor management experiments, the effects of postfire riparian logging will remain unknown and highly contentious. PMID:16922216
Rivenes, A S
2004-12-08
The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document will detail the maximum observed throughputs using the same test suite that was used for the Sun tests. A detailed description of the testing performed along with an analysis of bottlenecks encountered will be made. Issues related to Oracle and Linux will also be detailed and some recommendations based on the findings.
6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT
Anderson, David W.
1993-12-15
Two logs of EE-3A were performed during the last couple of weeks. The first, a temperature/casing-collar locator (CCL) log, took place on Friday, December 10, 1993. The second, a caliper log, was done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13, 1993.
Ehrmann, Brett J.; Anderson, Robert M.; Piatt, Gretchen A.; Funnell, Martha M.; Rashid, Hira; Shedden, Kerby; Douyon, Liselle
2014-01-01
Purpose The purpose of this pilot study is to investigate the utility of, and areas of refinement for, digital photography as an educational tool for food logging in obese patients with type 2 diabetes (T2DM). Methods Thirty-three patients aged 18-70 with T2DM, BMI at least 30 kg/m2, and A1C 7.5-9% were recruited from an endocrinology clinic and randomized to a week of food logging using a digital camera (DC) or paper diary (PD), crossing over for week two. Patients then viewed a presentation about dietary effects on blood glucose, using patient DC and blood glucose entries. Outcomes of adherence (based on number of weekly entries), changes in mean blood glucose and frequency of blood glucose checks, and patient satisfaction were compared between methods. Patient feedback on the DC intervention and presentation was also analyzed. Results Thirty patients completed the study. Adherence was identical across methods. The mean difference in number of entries was not significant between methods. This difference increased and neared statistical significance (favoring DC) among patients who were adherent for at least one week (21 entries, with 2 entries per day for 5 of 7 days, n=25). Mean blood glucose did not significantly decrease in either method. Patient satisfaction was similar between interventions. Feedback indicated concerns over photograph accuracy, forgetting to use the cameras, and embarrassment using them in public. Conclusion Though comparable to PD in adherence, blood glucose changes, and patient satisfaction in this pilot trial, patient feedback suggested specific areas of refinement to maximize utility of DC-based food logging as an educational tool in T2DM. PMID:24168836
Priiatkina, S N
2002-05-01
For mapping nonlinked interacting genes relative to marker loci, log-likelihood functions were derived that permit estimation of recombination fractions by solving the ML equations on the basis of F2 data under various types of interaction. In some cases the recombination fraction estimates are obtained in analytical form, while in others they are numerically calculated from concrete experimental data. With the same type of epistasis, the log-likelihood functions were shown to differ depending on the functional role (suppression or epistasis) of the mapped gene. Methods for testing the correspondence of the model and the recombination fraction estimates to the experimental data are discussed. In ambiguous cases, analysis of the behavior of linked markers makes it possible to differentiate gene interaction from distorted single-locus segregation, which at some forms of interaction imitates the corresponding phenotypic ratios. PMID:12068553
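The maximum-likelihood step the abstract describes can be sketched numerically. The sketch below fits a recombination fraction r by grid search over a multinomial log-likelihood; the four classes and their probabilities ((1-r)/2, (1-r)/2, r/2, r/2) are an illustrative backcross model, not the specific epistatic F2 models of the paper, and the counts are invented.

```python
import numpy as np

# Hypothetical phenotype class counts: two parental classes, two recombinant classes.
counts = np.array([45, 45, 5, 5])

def log_lik(r, counts):
    # Multinomial log-likelihood under a simple backcross recombination model.
    p = np.array([(1 - r) / 2, (1 - r) / 2, r / 2, r / 2])
    return np.sum(counts * np.log(p))

# Grid search stands in for solving the ML equations analytically.
r_grid = np.arange(0.001, 0.5, 0.0001)
r_hat = r_grid[np.argmax([log_lik(r, counts) for r in r_grid])]
# For this model the ML solution is recombinants/total = 10/100 = 0.1
```

For the paper's epistatic models, only the class-probability function `p(r)` changes; the maximization itself is the same.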
Acoustic paramagnetic logging tool
Vail, III, William B.
1988-01-01
New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which in general makes it possible to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.
NASA Technical Reports Server (NTRS)
Balch, D. T.; Lombardi, J.
1985-01-01
A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor Hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, 3 on the BLACK HAWK rotor and 6 on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine if the performance advantages that could be obtained from the use of advanced geometry tips in a main rotor only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.
Walsh, David O; Turner, Peter
2014-05-27
Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.
4. Log chicken house (far left foreground), log bunkhouse (far ...
4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Maximum Likelihood Estimation in Generalized Rasch Models.
ERIC Educational Resources Information Center
de Leeuw, Jan; Verhelst, Norman
1986-01-01
Maximum likelihood procedures are presented for a general model to unify the various models and techniques that have been proposed for item analysis. Unconditional maximum likelihood estimation, proposed by Wright and Haberman, and conditional maximum likelihood estimation, proposed by Rasch and Andersen, are shown as important special cases. (JAZ)
Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.
Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin
2016-01-01
"Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425
Taylor, J.A.; Taylor, K.G.
1987-02-03
A method is described of logging earth formations traversed by a well bore and utilizing a logging tool having a neutron source and a short spaced and a long spaced thermal neutron detector which produce an independent response as a function of depth of the logging tool in a well bore. The method comprises: moving the logging tool through a well bore to locate a section of the earth formations which has minimum porosity and obtaining measurement responses from each of the long and short spaced detectors; normalizing the responses of the long and short spaced detectors by matching the sensitivity of response of the long spaced detector to the sensitivity of response of the short spaced detector for an earth formation which has minimum porosity so that the normalized responses track one another in an earth formation which has minimum porosity; and moving the tool over the length of the well bore to be surveyed while recording the normalized responses of the long and short spaced neutron detectors as a function of depth.
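The normalization step the patent describes — matching the long-spaced detector's sensitivity to the short-spaced detector over a minimum-porosity interval — can be sketched as a linear calibration. The detector counts below are assumed, illustrative values; the real procedure operates on logged responses versus depth.

```python
import numpy as np

# Toy detector responses over a minimum-porosity calibration interval (assumed data).
short = np.array([100.0, 102.0, 98.0, 101.0, 99.0])   # short-spaced counts
long_ = 0.5 * short + 20.0                            # long-spaced counts (different sensitivity)

# Fit the linear sensitivity relationship long ~ gain*short + offset, then invert it
# so the normalized long-spaced response tracks the short-spaced one.
gain, offset = np.polyfit(short, long_, 1)
long_normalized = (long_ - offset) / gain
```

Once normalized this way, the two curves track one another in minimum-porosity formations, and any separation elsewhere in the surveyed interval carries the porosity information.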
ERIC Educational Resources Information Center
Butler, Kevin
2010-01-01
A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…
ERIC Educational Resources Information Center
Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina
2010-01-01
The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…
2007-04-04
A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
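The same parse-and-filter idea can be sketched in a few lines. This assumes the standard semicolon-delimited PBS accounting record layout (`date time;record-type;id;key=value key=value ...`); the sample field names are illustrative, and this is a sketch rather than the module's actual API.

```python
from datetime import datetime

def parse_pbs_line(line):
    # PBS accounting records: "MM/DD/YYYY HH:MM:SS;TYPE;JOBID;key=value key=value ..."
    ts, rtype, jobid, rest = line.strip().split(";", 3)
    attrs = dict(kv.split("=", 1) for kv in rest.split() if "=" in kv)
    return {"time": datetime.strptime(ts, "%m/%d/%Y %H:%M:%S"),
            "type": rtype, "id": jobid, "attrs": attrs}

def filter_records(lines, record_type=None, since=None):
    # Yield parsed records, optionally filtered on record type and date-time.
    for line in lines:
        rec = parse_pbs_line(line)
        if record_type and rec["type"] != record_type:
            continue
        if since and rec["time"] < since:
            continue
        yield rec
```

A record type of `E`, for example, marks a job-end accounting record, which is where resource-usage fields normally appear.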
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)
Coal-log pipeline system development
Liu, H.
1991-12-01
Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)
Maximum likelihood density modification by pattern recognition of structural motifs
Terwilliger, Thomas C.
2004-04-13
An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log likelihood of a set of structure factors {F_h} using a local log-likelihood function ln[p(ρ(x)|PROT)p_PROT(x) + p(ρ(x)|SOLV)p_SOLV(x) + p(ρ(x)|H)p_H(x)], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) is the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x; and p(ρ(x)|H) is the probability distribution for the electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.
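Written out in standard notation, and assuming (as is usual in density-modification likelihood methods) that the map likelihood sums this local term over grid points x, the objective being maximized is:

```latex
\mathrm{LL}(\rho) \;=\; \sum_{x}\,
\ln\Bigl[\,
  p_{\mathrm{PROT}}(x)\, p\bigl(\rho(x)\mid \mathrm{PROT}\bigr)
  \;+\; p_{\mathrm{SOLV}}(x)\, p\bigl(\rho(x)\mid \mathrm{SOLV}\bigr)
  \;+\; p_{H}(x)\, p\bigl(\rho(x)\mid H\bigr)
\Bigr]
```

Each grid point contributes a mixture over the three hypotheses (protein, solvent, known motif), weighted by the prior probability of each hypothesis at that point.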
Log-Concavity and Strong Log-Concavity: a review
Saumard, Adrien; Wellner, Jon A.
2016-01-01
We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
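The convolution-preservation result has a simple discrete illustration: a sequence (a_k) is log-concave when a_k² ≥ a_{k-1}a_{k+1}, binomial pmfs are log-concave, and the theorem says their convolution (the pmf of the sum of the two variables) is log-concave too. The specific binomial parameters below are arbitrary choices for the check.

```python
import numpy as np
from math import comb

def is_log_concave(a):
    # Discrete log-concavity: a_k^2 >= a_{k-1} * a_{k+1} (small tolerance for float error).
    a = np.asarray(a, dtype=float)
    return bool(np.all(a[1:-1] ** 2 >= a[:-2] * a[2:] - 1e-12))

p = np.array([comb(4, k) * 0.5 ** 4 for k in range(5)])                   # Binomial(4, 0.5)
q = np.array([comb(3, k) * 0.3 ** k * 0.7 ** (3 - k) for k in range(4)])  # Binomial(3, 0.3)
conv = np.convolve(p, q)   # pmf of the sum of the two independent variables
```

Running `is_log_concave` on `p`, `q`, and `conv` confirms the preservation property in this instance.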
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
Energy saving and endurance log for a log building
Dolata, G.
1987-03-17
A log is described for construction of a log building which comprises: an elongated peeled log of substantially uniform diameter along its length with parallel end faces, a bottom surface of the log having a concave surface configuration centered on a diametrical line of the log, a rounded top surface directly opposite from the concave bottom surface which mates with a concave surface of another log when placed upon the rounded top surface, a vertically extending longitudinal slot in the top surface of the log that extends the length of the log, a vertically extending longitudinal slot along at least one side of the log with the slot extending vertically substantially parallel with the diametrical line with the slot being formed outwardly of the concave surface, the log including at least one butt end, the butt end including an end slot along the diametrical line which extends from a top of the log down through the butt end to the concave surface; and the butt includes at least one short, longitudinally extending arcuate groove near an outer surface of the log which extends from a line juxtaposed the end slot down to at least one longitudinal slot in the log.
Pitts, R.W.
1980-02-19
A well logging system includes a logging tool adapted to be passed through a borehole traversing an earth formation. The logging tool contains a sensor sensing a condition of the earth formation and providing electrical pulses corresponding in number and peak amplitude to the sensed condition. A first electrical pulse from the sensor occurring during each predetermined time period of a plurality of predetermined time periods, is stretched and then converted to parallel digital signals. A register receives the parallel digital signals and provides a serial digital signal in response to the shift pulses. A network provides an electrical synchronization pulse each time period prior to the occurrence of the shift pulses. A light emitting diode converts the synchronization pulses and the serial digital signals to corresponding light pulses. A cable including a fiber optic conductor transmits the light pulses uphole to the surface. Surface electronics includes a light-to-electrical converter for providing corresponding electrical pulses in accordance with the light pulses, so that the light-to-electrical converter provides a synchronization pulse followed by a serial digital signal each time period. Another circuit provides a set of shift pulses in response to the synchronizing pulse from the light-to-electrical converter, and an output circuit provides parallel output digital signals corresponding to the sensed condition in accordance with the shift pulses and the serial digital signals from the light-to-electrical converter.
West, H.I. Jr.; Glasgow, J.E.
1983-12-01
The feasibility of using oil-well bore-hole logging techniques for assaying the carbonate content of a potential shot site has been investigated. The procedure makes use of the detection of the 4439-keV γ ray of carbon and the 6130-keV γ ray of oxygen produced by the inelastic scattering of 14-MeV neutrons in the bore-hole medium. For the needs of the containment program, a sensitivity of detection of CO2 to less than or equal to 0.02 weight fraction must be achieved. Laboratory measurements indicate that only with considerable development effort could a tool be made that would come close to achieving this goal.
NASA Astrophysics Data System (ADS)
Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.
2015-12-01
The Bengal Basin, with its prolific gas-rich province, provides needed energy to Bangladesh, and the present energy situation demands more hydrocarbon exploration. Only 'Semutang' has been discovered in the high-amplitude structures; the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt, which contains some major thrust faults that have strongly breached the reservoir zone. The major objectives of this research are interpretation of gas horizons and faults, followed by velocity modeling and structural and property modeling to obtain reservoir properties; the faults and reservoir heterogeneities need to be properly identified. 3D modeling is widely used to reveal subsurface structure in faulted zones, where planning and development drilling is a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults and to map the structure at reservoir level. Variance attributes were used to identify faults, and a velocity model was built for domain conversion. Synthetics were prepared from the two wells where sonic and density logs are available. The well-to-seismic tie at the reservoir zone shows a good match with a Direct Hydrocarbon Indicator on the seismic section. Vsh, porosity, water saturation, and permeability have been calculated, and various cross-plots among the porosity logs are shown. Structural modeling was used to define zones and layering in accordance with minimum sand thickness. The fault model shows the possible fault network responsible for several dry wells. Facies models were constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models were prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells in other parts of the field and to observe heterogeneities in the reservoir. An average porosity map for each gas zone was constructed. The outcomes of the research
12. Upstream view showing thelower log pond log chute in ...
12. Upstream view showing thelower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY
Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation
NASA Astrophysics Data System (ADS)
Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.
2015-11-01
We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
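The signal-to-noise eigenvector compression the abstract identifies as optimal can be sketched with toy covariances. The idea: eigenmodes of N^(-1/2) S N^(-1/2) with small eigenvalue carry little signal relative to noise and can be dropped; in the retained basis the total covariance becomes diagonal, which is what makes the compressed likelihood cheap. The matrices below are random stand-ins, not WMAP covariances.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
S = A @ A.T / n                       # toy signal covariance (symmetric positive definite)
N = np.diag(1.0 + rng.random(n))      # toy diagonal noise covariance

# Signal-to-noise eigenbasis: eigendecompose the noise-whitened signal covariance.
N_inv_half = np.diag(1.0 / np.sqrt(np.diag(N)))
lam, V = np.linalg.eigh(N_inv_half @ S @ N_inv_half)

keep = lam > 0.1                      # S/N threshold (illustrative choice)
B = (N_inv_half @ V)[:, keep]         # compression matrix: d_compressed = B.T @ d

# In the compressed basis the total covariance is diagonal, diag(lam_keep + 1).
C = B.T @ (S + N) @ B
```

With real data, the threshold trades compression ratio against the information loss quoted in the abstract (e.g. the 6836-pixel to 3102-mode reduction).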
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Agricultural Employment of H-2A Aliens in the United States, 75 FR 6884 (Feb. 12, 2010). The effect of... rulemaking (NPRM). 74 FR 45906 (Sept. 4, 2009). After considering comments from the public on the subject... errors and ambiguities. J. Plain Language The Department drafted this rule in plain language....
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.; Goodrich, J.H.; Kumar, R.M.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on Feburary 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO/sub 2/ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log versus core derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations and production data confirmed the validity of oil saturation determinations. Residual oil saturation, for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.
NASA Astrophysics Data System (ADS)
Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.
2015-12-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington) but a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
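The maximum-likelihood fit and exceedance extrapolation can be sketched in a few lines: for a log-normal, the ML estimates are simply the mean and standard deviation of log(x), and the probability of exceeding a threshold follows from the normal survival function in log space. The storm maxima below are invented illustrative values, not the 1957-2012 -Dst record.

```python
import numpy as np
from math import erfc, sqrt, log

# Toy -Dst storm maxima in nT (assumed, illustrative values only).
x = np.array([120.0, 90.0, 150.0, 210.0, 330.0, 95.0, 180.0, 250.0, 140.0, 110.0])

# ML fit of a log-normal: mu and sigma are the mean and std of log(x).
mu, sigma = np.log(x).mean(), np.log(x).std()

def exceedance_prob(threshold):
    """P(X > threshold) under the fitted log-normal (normal tail in log space)."""
    z = (log(threshold) - mu) / (sigma * sqrt(2.0))
    return 0.5 * erfc(z)

p850 = exceedance_prob(850.0)   # per-storm probability of a Carrington-class maximum
```

Multiplying this per-storm probability by the observed storm rate gives an events-per-century estimate of the kind quoted in the abstract; the bootstrap confidence limits come from refitting resampled datasets.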
SedLog: A shareware program for drawing graphic logs and log data manipulation
NASA Astrophysics Data System (ADS)
Zervas, Dimitrios; Nichols, Gary J.; Hall, Robert; Smyth, Helen R.; Lüthje, Charlotta; Murtagh, Fionn
2009-10-01
SedLog is a free multi-platform software package for creating graphic sediment logs providing an intuitive graphical user interface. The graphic sediment logs generated by SedLog can be exported as PDF, Scalable Vector Graphics (SVG), or JPEG for use by other drawing applications or for publications. Log data can be imported and exported in Comma Separated Values (CSV) format. The logs can also be printed to any paper size the user wants. Zoom In, Zoom Out, Fit page, Fit Height and Fit Width facilities are also provided to enable the user to customise the workspace size.
Continuous gravity gradient logging
Fitch, J.L.; Lyle, W.D. Jr.
1986-07-29
A method is described for conducting a gravimetry survey of an earth formation, comprising the steps of: (a) continuously traversing the earth formation with a gravity logging tool having a column of fluid within the tool, (b) measuring a first pressure difference along a first interval within the column of fluid, (c) measuring a second pressure difference along a second interval within the column of fluid, (d) differencing the first and second pressure differences to determine the gravity gradient along the earth formation between the first and second intervals.
Oliver, R.L.; Lindblom, S.R.; Covell, J.R.
1991-02-01
The Rocky Mountain 1 (RM1) Underground Coal Gasification (UCG) site consisted of two different module configurations: the controlled retracting injection point (CRIP) and elongated linked well (ELW) configurations. The postburn coring of the RM1 UCG site was designed in two phases to fulfill seven objectives outlined in Western Research Institute`s Annual Project Plan for 1989 (Western Research Institute 1989). The seven objectives were to (1) delineate the areal extent of the cavities, (2) identify the extent of roof collapse, (3) obtain samples of all major cavity rock types, (4) characterize outflow channels and cavity stratigraphy, (5) characterize the area near CRIP points and ignition points, (6) further define the structural geology of the site, and (7) identify the vertical positioning of the horizontal process wells within the coal seam. Phase 1 of the coring was completed during the summer of 1989 and served to partially accomplish all seven objectives. A detailed description of Phase 1 results was presented in a separate report (Lindblom et al. 1990). Phase 2, completed during the summer of 1990, was designed to complete the seven objectives; more specifically, to further define the areal extent and location of the cavities, to evaluate the outflow channels for both modules, and to further characterize the structural geology in the ELW module area.
Weekly Log Record Sort (WLSORT).
ERIC Educational Resources Information Center
Foote, Thomas
Computer routines to sort the weekly log records submitted by teachers participating in the Southwest Regional Laboratory's communications skills monitoring program are described. Written in Univac FORTRAN V, Weekly Log Record Sort (WLSORT) sorts log records on magnetic tape to enable subsequent computer programs to interpret the input data by…
NASA Astrophysics Data System (ADS)
Isyaku, Aminu A.; Rust, Derek; Teeuw, Richard; Whitworth, Malcolm
2016-09-01
Structural and stratigraphic mapping within the Bornu Basin in northeast Nigeria was commonly carried out using traditional field geological methods. However, such traditional approaches remain inadequate in this semi-arid region, characterised by topographically flat areas and a lack of continuous bedrock outcrops, which are mostly concealed beneath sand cover. Previous studies in the north-eastern part of the basin, carried out using ditch cuttings from a few wells and disconnected seismic data, were largely inadequate, and the resulting stratigraphic analyses were more often generalised. This paper presents an integrated structural and stratigraphic study of the basin using combined subsurface geophysical datasets. The Combined Log Pattern (CLP) method is a well log analysis technique that utilises various well log data, including gamma ray, resistivity, bulk density and sonic logs, to identify lithology and stratigraphic boundaries of subsurface formations. This method is applied to constrain the subsurface stratigraphy of the north-eastern part of the Bornu Basin bordering Lake Chad. In addition to qualitative combined well log analysis, the time-depth relationship of the sonic log and seismic data was quantitatively determined by tying a well to an intersecting seismic section to validate the stratigraphic facies horizons identified. Four well log facies and their environments of deposition were characterised from the combined well log analysis of the different log types. The analysis shows that Cretaceous basement structural features controlled the deposition of the overlying formations in the basin. In the absence of intact core data, the shallower wells were found to have bottomed over subsurface horst features, while deeper wells penetrated the basal facies contained mainly within the grabens. Main subsurface structural lineaments in the area include NW-SE, NE-SW and NNW-SSE trending faults, which mainly formed the horst and graben features. Some stratigraphic formations
NASA Astrophysics Data System (ADS)
Bloshanskaya, S. K.; Bloshanskii, I. L.; Y Roslova, T.
1998-06-01
For an arbitrary open set \Omega\subset I^2=[0,1)^2 and an arbitrary function f\in L\log^+L\log^+\log^+L(I^2) such that f=0 on \Omega, the double Fourier series of f with respect to the trigonometric system \Psi=\mathscr{E} and the Walsh-Paley system \Psi=W is shown to converge to zero (over rectangles) almost everywhere on \Omega. Thus, it is proved that generalized localization almost everywhere holds on arbitrary open subsets of the square I^2 for the double trigonometric Fourier series and the Walsh-Fourier series of functions in the class L\log^+L\log^+\log^+L (in the case of summation over rectangles). It is also established that such localization breaks down on arbitrary sets that are not dense in I^2, in the classes \Phi_\Psi(L)(I^2) for the orthonormal system \Psi=\mathscr{E} and an arbitrary function such that \Phi_{\mathscr{E}}(u)=o(u\log^+\log^+u) as u\to\infty, or for \Phi_W(u)=u(\log^+\log^+u)^{1-\varepsilon}, 0<\varepsilon<1.
Schweder, Tore
2003-12-01
Maximum likelihood estimates of abundance are obtained from repeated photographic surveys of a closed stratified population with naturally marked and unmarked individuals. Capture intensities are assumed log-linear in stratum, year, and season. In the chosen model, an approximate confidence distribution for total abundance of bowhead whales, with an accompanying likelihood reduced of nuisance parameters, is found from a parametric bootstrap experiment. The confidence distribution depends on the assumed study protocol. A confidence distribution that is exact (except for the effect of discreteness) is found by conditioning in the unstratified case without unmarked individuals. PMID:14969476
NASA Astrophysics Data System (ADS)
Halabe, Udaya B.; Pyakurel, Sandeep
2007-03-01
There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.
Salvage logging, ecosystem processes, and biodiversity conservation.
Lindenmayer, D B; Noss, R F
2006-08-01
We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case. PMID:16922212
Refining clinical diagnosis with likelihood ratios.
Grimes, David A; Schulz, Kenneth F
Likelihood ratios can refine clinical diagnosis on the basis of signs and symptoms; however, they are underused for patients' care. A likelihood ratio is the percentage of ill people with a given test result divided by the percentage of well individuals with the same result. Ideally, abnormal test results should be much more typical in ill individuals than in those who are well (high likelihood ratio), and normal test results should be more frequent in well people than in sick people (low likelihood ratio). Likelihood ratios near unity have little effect on decision-making; by contrast, high or low ratios can greatly shift the clinician's estimate of the probability of disease. Likelihood ratios can be calculated not only for dichotomous (positive or negative) tests but also for tests with multiple levels of results, such as creatine kinase or ventilation-perfusion scans. When combined with an accurate clinical diagnosis, likelihood ratios from ancillary tests improve diagnostic accuracy in a synergistic manner. PMID:15850636
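The definitions above translate directly into the standard formulas LR+ = sensitivity / (1 - specificity) and LR- = (1 - sensitivity) / specificity, with the probability shift computed through odds. The test characteristics below are hypothetical numbers chosen for illustration.

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pretest_prob, lr):
    """Probability -> odds, multiply by the likelihood ratio, convert back."""
    pre_odds = pretest_prob / (1.0 - pretest_prob)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Hypothetical test: 90% sensitive, 80% specific
lr_pos, lr_neg = likelihood_ratios(0.90, 0.80)   # LR+ = 4.5, LR- = 0.125
p = post_test_probability(0.30, lr_pos)          # pretest 30% -> about 66%
```

A likelihood ratio near 1 leaves the pretest probability essentially unchanged, which is the "little effect on decision-making" case the abstract describes.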
Coal log pipeline: Development status of the first commercial system
Marrero, T.R.
1996-12-31
The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at pipeline exit, safety, and environmental friendliness. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% of the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. The coal logs at their destination come out of the pipeline onto a moving conveyor which transports the logs to a crusher or stock pile. Coal logs are crushed to match the size of existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with and without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90°C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of the coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than approximately 80 km. CLP is much more economical than a coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.
Swift, T.E.; Goodrich, J.H.; Kumar, R.M.; McCoy, R.L.; Wilhelm, M.H.; Glascock, M.R.
1982-01-01
The coring, logging and testing of Bennett Ranch Unit well No. 310 was a cooperative effort between Texas Pacific, owner of the well, and Gruy Federal, Inc. The requirements of the contract, which are summarized in Enclosure 1, Appendix A, include drilling and coring activities. The pressure-coring and associated logging and testing programs in selected wells are intended to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO₂ injection might have a high probability of success. This report presents detailed information on the first such project. This project demonstrates the usefulness of integrating pressure core, log and production data to realistically evaluate a reservoir for carbon dioxide flooding. The engineering of tests and analysis of such experimental data requires original thinking, but the reliability of the results is higher than that of data derived from conventional tests.
Model Fit after Pairwise Maximum Likelihood
Barendse, M. T.; Ligtvoet, R.; Timmerman, M. E.; Oort, F. J.
2016-01-01
Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations. PMID:27148136
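The computational trick the abstract describes, summing bivariate log-likelihoods over all variable pairs instead of evaluating the full multivariate likelihood, can be sketched generically. The `pair_prob` callback below is a hypothetical interface standing in for whatever bivariate model (e.g. one derived from polychoric correlations) is being fit; it is not the authors' implementation.

```python
import numpy as np
from itertools import combinations

def pairwise_loglik(data, pair_prob):
    """Composite (pairwise) log-likelihood: sum over variable pairs (i, j)
    of the log model probability of each observed bivariate response.
    `pair_prob(i, j, yi, yj)` must return P(Y_i = yi, Y_j = yj)."""
    total = 0.0
    n_obs, n_var = data.shape
    for i, j in combinations(range(n_var), 2):
        for row in data:
            total += np.log(pair_prob(i, j, row[i], row[j]))
    return total

# Toy check: three independent fair binary items, so every bivariate
# pattern has probability 0.25 regardless of the pair.
data = np.array([[0, 1, 1],
                 [1, 0, 1]])
ll = pairwise_loglik(data, lambda i, j, yi, yj: 0.25)
# 3 pairs x 2 observations, each contributing log(0.25)
```

The payoff is that each term involves only a two-way contingency table, so the cost grows with the number of pairs rather than with the size of the full multivariate response space.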
Smith, H. D. Jr.; Schultz, W. E.
1985-01-08
This invention relates to an improved method for determining the oil saturation of subsurface earth formations in the vicinity of a well borehole. High energy neutrons irradiate the subsurface earth formations and gamma rays caused by inelastic scatter with the subsurface earth formation constituent materials are measured. For a chosen borehole depth, gamma ray logs are taken in different situations: first, with the formation fluid water and oil mixture in an undisturbed state; second, after flushing the formation with alcohol to displace the formation water and oil mixture; and, finally, after flushing the alcohol from the formation with water to obtain a measurement with no oil in the formation. The gamma ray measurements obtained are then used to determine the oil saturation without requiring knowledge of the porosity of the earth formation, borehole conditions or formation type. When the original oil content of the formation is at a naturally flushed, or residual, oil saturation, the present invention may be used to determine the residual oil saturation.
Maximum-Likelihood Detection Of Noncoherent CPM
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Simon, Marvin K.
1993-01-01
Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, the structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, the complexity of which depends on N.
3. Log bunkhouse (far left), log chicken house (left of ...
3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Quasi-likelihood for Spatial Point Processes
Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus
2014-01-01
Summary Fitting regression models for intensity functions of spatial point processes is of great interest in ecological and epidemiological studies of association between spatially referenced events and geographical or environmental covariates. When Cox or cluster process models are used to accommodate clustering not accounted for by the available covariates, likelihood based inference becomes computationally cumbersome due to the complicated nature of the likelihood function and the associated score function. It is therefore of interest to consider alternative more easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation which in practice is solved numerically. The derivation of the optimal estimating function has close similarities to the derivation of quasi-likelihood for standard data sets. The approximate solution is further equivalent to a quasi-likelihood score for binary spatial data. We therefore use the term quasi-likelihood for our optimal estimating function approach. We demonstrate in a simulation study and a data example that our quasi-likelihood method for spatial point processes is both statistically and computationally efficient. PMID:26041970
Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon
NASA Astrophysics Data System (ADS)
Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin
2014-04-01
The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.
Logging concessions enable illegal logging crisis in the Peruvian Amazon.
Finer, Matt; Jenkins, Clinton N; Sky, Melissa A Blue; Pine, Justin
2014-01-01
The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552
Cox, Murray P.; Mendez, Fernando L.; Karafet, Tatiana M.; Pilkington, Maya Metni; Kingan, Sarah B.; Destro-Bisol, Giovanni; Strassmann, Beverly I.; Hammer, Michael F.
2008-01-01
A 2.4-kb stretch within the RRM2P4 region of the X chromosome, previously sequenced in a sample of 41 globally distributed humans, displayed both an ancient time to the most recent common ancestor (e.g., a TMRCA of ∼2 million years) and a basal clade composed entirely of Asian sequences. This pattern was interpreted to reflect a history of introgressive hybridization from archaic hominins (most likely Asian Homo erectus) into the anatomically modern human genome. Here, we address this hypothesis by resequencing the 2.4-kb RRM2P4 region in 131 African and 122 non-African individuals and by extending the length of sequence in a window of 16.5 kb encompassing the RRM2P4 pseudogene in a subset of 90 individuals. We find that both the ancient TMRCA and the skew in non-African representation in one of the basal clades are essentially limited to the central 2.4-kb region. We define a new summary statistic called the minimum clade proportion (pmc), which quantifies the proportion of individuals from a specified geographic region in each of the two basal clades of a binary gene tree, and then employ coalescent simulations to assess the likelihood of the observed central RRM2P4 genealogy under two alternative views of human evolutionary history: recent African replacement (RAR) and archaic admixture (AA). A molecular-clock-based TMRCA estimate of 2.33 million years is a statistical outlier under the RAR model; however, the large variance associated with this estimate makes it difficult to distinguish the predictions of the human origins models tested here. The pmc summary statistic, which has improved power with larger samples of chromosomes, yields values that are significantly unlikely under the RAR model and fit expectations better under a range of archaic admixture scenarios. PMID:18202385
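The minimum clade proportion (pmc) introduced above can be sketched as follows. The reading here, taking the smaller of the two basal clades' regional proportions, is a plausible interpretation of the abstract's one-sentence definition, not the paper's formal statistic, and `minimum_clade_proportion` is a hypothetical helper name.

```python
def minimum_clade_proportion(clade_a, clade_b, region):
    """pmc sketch: given the two basal clades of a binary gene tree as lists
    of geographic labels (one per sampled sequence), compute the proportion
    of sequences from `region` in each clade and return the smaller value."""
    prop_a = sum(1 for r in clade_a if r == region) / len(clade_a)
    prop_b = sum(1 for r in clade_b if r == region) / len(clade_b)
    return min(prop_a, prop_b)

# Toy genealogy: one basal clade is mostly Asian, the other mostly African
pmc = minimum_clade_proportion(["AFR", "ASN", "ASN", "ASN"],
                               ["AFR", "AFR", "AFR", "ASN"], "AFR")
```

Under a recent-African-replacement history both basal clades are expected to contain substantial African representation, so a very small pmc (as at RRM2P4) is the signal the coalescent simulations in the paper evaluate.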
Log interpretation of shaly sandstones
Baker, J.F.
1988-01-01
The determination of water saturation from electrical resistivity measurements to evaluate the potential of reservoirs is a fundamental tool of the oil industry. Shaly sandstones are difficult to evaluate because clays are conductive and they lower the resistivity of the rock. A review of shaly-sandstone research concerning "volume-of-shale" equations reveals three theoretical categories: (1) laminated clay equations, (2) dispersed clay equations, and (3) equations that assume that the effect of the clays on the conductivity measurement is directly related to water saturation. A new model for predicting the relative amounts of laminated and dispersed shales and accounting for their effects according to their abundance can be used for any sandstone, clean or shaly. Equations representing each of the three theoretical categories and the new equation were tested on cored Wilcox sandstones from two wells. Cores were analyzed to determine the volume and distribution of clays and to correlate porosity with the well logs.
Well Logging with Californium-252
Boulogne, A.R.
2003-01-06
Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.
Code of Federal Regulations, 2013 CFR
2013-10-01
... expressed in Coordinated Universal Time (UTC). (2) “ON WATCH” must be entered by the operator beginning a... until the claim or complaint has been satisfied or barred by statute limiting the time for filing suits... log by the operator's signature. (2) The date and time of making an entry must be shown opposite...
NASA Astrophysics Data System (ADS)
Schuchardt, A.; Morche, D.
2015-12-01
Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design to decipher the short- to long-term (c. 10⁻² to 10² years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria, the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are:
• to identify important triggers for LW transport from slopes into the channels
• to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements
• to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts
• to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and measuring bed load up- and downstream of log jams using a portable Helley
Boldman, K G; Van Vleck, L D
1991-12-01
Estimation of (co)variance components by derivative-free REML requires repeated evaluation of the log-likelihood function of the data. Gaussian elimination of the augmented mixed model coefficient matrix is often used to evaluate the likelihood function, but it can be costly for animal models with large coefficient matrices. This study investigated the use of a direct sparse matrix solver to obtain the log-likelihood function. The sparse matrix package SPARSPAK was used to reorder the mixed model equations once and then repeatedly to solve the equations by Cholesky factorization to generate the terms required to calculate the likelihood. The animal model used for comparison contained 19 fixed levels, 470 maternal permanent environmental effects, and 1586 direct and 1586 maternal genetic effects, resulting in a coefficient matrix of order 3661 with .3% nonzero elements after including numerator relationships. Compared with estimation via Gaussian elimination of the unordered system, utilization of SPARSPAK required 605 and 240 times less central processing unit time on mainframes and personal computers, respectively. The SPARSPAK package also required less memory and provided solutions for all effects in the model. PMID:1787202
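The repeated likelihood evaluations described above hinge on a log-determinant that falls out of the Cholesky factorization for free: log|C| = 2·Σ log L_ii, where C = LLᵀ. The dense-matrix sketch below illustrates only that identity; the actual gain from SPARSPAK comes from sparse reordering and factorization of the mixed model coefficient matrix, which this toy omits.

```python
import numpy as np

def logdet_via_cholesky(C):
    """log|C| from the Cholesky factor C = L @ L.T:
    log|C| = 2 * sum(log(diag(L))).
    Avoids forming the determinant directly, which over- or underflows
    for the large coefficient matrices of animal models."""
    L = np.linalg.cholesky(C)
    return 2.0 * np.sum(np.log(np.diag(L)))

C = np.array([[4.0, 2.0],
              [2.0, 3.0]])
ld = logdet_via_cholesky(C)   # det(C) = 4*3 - 2*2 = 8, so ld = log(8)
```

In a derivative-free REML search, this factorization is redone at each trial parameter value, which is why a solver that reorders once and refactors cheaply gives the large speedups reported.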
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.
2013-12-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process responses on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km²) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Dry-season logging, in particular, dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
A new approach for deriving pseudovelocity logs from resistivity logs
Dos Santos, W.L.B.; Ulrych, T.J.; De Lima, O.A.L.
1988-01-01
This paper describes a method of generating pseudovelocity logs using measurements of electrical resistivity. A theoretical relation between electrical resistivity and transit time, which is applicable to a wide range of lithologies, has been developed. The application of this relation, using a method that defines lithoresistivity zones as lithological intervals related to the same formation and showing small resistivity variations, has been tested in the Reconcavo sedimentary basin in Bahia, Brazil. A comparison of derived pseudovelocity logs with actual sonic logs for five wells shows the validity of the present approach.
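The abstract does not reproduce the authors' resistivity-transit-time relation, but the general shape of such a transform can be illustrated with the classical Faust (1953) equation, V = a·(Z·R)^(1/6) with V in ft/s, depth Z in ft, and resistivity R in ohm-m. This is a different, older published relation used here purely as a stand-in; the constant a ≈ 1948 and the example values are conventional, not from the paper.

```python
def faust_velocity(resistivity_ohmm, depth_ft, a=1948.0):
    """Classical Faust (1953) resistivity-to-velocity transform (ft/s).
    Illustrative stand-in for the paper's lithoresistivity-zone relation."""
    return a * (depth_ft * resistivity_ohmm) ** (1.0 / 6.0)

def pseudo_sonic_us_per_ft(velocity_ft_s):
    """Sonic logs record interval transit time in microseconds per foot,
    i.e. the reciprocal of velocity: dt = 1e6 / V."""
    return 1.0e6 / velocity_ft_s

# One pseudovelocity-log sample at a hypothetical depth
v = faust_velocity(resistivity_ohmm=10.0, depth_ft=6000.0)
dt = pseudo_sonic_us_per_ft(v)
```

Applying such a transform sample-by-sample down a resistivity log, with coefficients calibrated per lithoresistivity zone, yields the pseudovelocity log that the paper compares against actual sonic logs.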
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1982-01-01
An all-sky survey of X-ray sources was performed, complete to a limiting sensitivity of 3.1 × 10⁻¹¹ ergs/sq cm/s in the 2-10 keV band. The complete sample has allowed construction of luminosity functions based on a flux-limited sample for clusters of galaxies and active galactic nuclei. Integration of the best-fit luminosity functions indicates that clusters of galaxies contribute about 4% of the 2-10 keV DXRB, and active galactic nuclei about 20%. It is predicted that many of the objects seen in the deep survey should be local, relatively low luminosity active galactic nuclei and clusters of galaxies.
Likelihoods for fixed rank nomination networks.
Hoff, Peter; Fosdick, Bailey; Volfovsky, Alex; Stovel, Katherine
2013-12-01
Many studies that gather social network data use survey methods that lead to censored, missing, or otherwise incomplete information. For example, the popular fixed rank nomination (FRN) scheme, often used in studies of schools and businesses, asks study participants to nominate and rank at most a small number of contacts or friends, leaving the existence of other relations uncertain. However, most statistical models are formulated in terms of completely observed binary networks. Statistical analyses of FRN data with such models ignore the censored and ranked nature of the data and could potentially result in misleading statistical inference. To investigate this possibility, we compare Bayesian parameter estimates obtained from a likelihood for complete binary networks with those obtained from likelihoods that are derived from the FRN scheme, and therefore accommodate the ranked and censored nature of the data. We show analytically and via simulation that the binary likelihood can provide misleading inference, particularly for certain model parameters that relate network ties to characteristics of individuals and pairs of individuals. We also compare these different likelihoods in a data analysis of several adolescent social networks. For some of these networks, the parameter estimates from the binary and FRN likelihoods lead to different conclusions, indicating the importance of analyzing FRN data with a method that accounts for the FRN survey design. PMID:25110586
NASA Astrophysics Data System (ADS)
Major, G. A.; Fretwell, H. M.; Dugdale, S. B.; Alam, M. A.
1998-11-01
A novel method for reconstructing the Fermi surface from experimental two-dimensional angular correlation of positron annihilation radiation (2D-ACAR) projections is proposed. In this algorithm, the 3D electron momentum-density distribution is expanded in terms of a basis of wavelet-like functions. The parameters of the model, the wavelet coefficients, are determined by maximizing the likelihood function corresponding to the experimental data and the projections calculated from the model. In contrast to other expansions, in the case of that in terms of wavelets a relatively small number of model parameters are sufficient for representing the relevant parts of the 3D distribution, thus keeping computation times reasonably short. Unlike other reconstruction methods, this algorithm takes full account of the statistical information content of the data and therefore may help to reduce the amount of time needed for data acquisition. An additional advantage of wavelet expansion may be the possibility of retrieving the Fermi surface directly from the wavelet coefficients rather than indirectly using the reconstructed 3D distribution.
Bonner, B; Hutchings, L; Kasameyer, P
2006-06-14
We devise a strategy for analysis of Vp and Vs microearthquake tomography results in the Salton Sea geothermal field to identify important features of the geothermal reservoir. We first interpret rock properties in the State 2-14 borehole based upon logged core through the reservoir. We then interpret seismic recordings in the well (Daley et al., 1988) to develop the strategy. We hypothesize that mapping Poisson's ratio has two applications for the Salton Sea geothermal reservoir: (1) to map the top of the reservoir, and (2) as a diagnostic for permeable zones. Poisson's ratio can be obtained from Vp and Vs. In the State 2-14 borehole, Poisson's ratio calculated from large-scale averages (≈150 m) shows a monotonic decrease with depth to about 1300 m, at which point it increases with depth. Our model is that the monotonic decrease is due to compaction, and the increase below 1300 m is due to the rocks being hydrothermally altered. We hypothesize that we can map the depth to alteration by identifying the transition from decreasing to increasing values, and thus map the top of the reservoir, which is associated with a known increase in sulfite, chlorite, and epidote alteration that may be indicative of hydrothermal activity. We also observe (from the plots of Daley et al.) an anomalous drop in Poisson's ratio at a depth of about 900 m, within a sandstone formation. The sandstone has a P-wave velocity significantly higher than the siltstone above it, but a lower velocity in the lower half of the formation relative to the upper half. We interpret the relative decrease in velocity to be due to fracturing and chemical alteration caused by permeability. We conclude that using Vp and Vs tomography results to obtain images of Poisson's ratio has the potential to identify significant features in the geothermal reservoir in this geologic setting. Seismic attenuation tomography results (mapped as Qp and Qs) should also be useful for evaluating geothermal reservoirs, but that is not
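The conversion from Vp and Vs to Poisson's ratio used above follows from isotropic elasticity; a minimal sketch (the velocity values are illustrative, not taken from the survey):

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (isotropic elastic medium)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# For Vp/Vs = sqrt(3), a common crustal value, the ratio is 0.25; lower values
# would accompany the fractured, altered zones discussed in the abstract.
nu = poissons_ratio(3000.0, 3000.0 / 3 ** 0.5)
```

Applying this pointwise to co-located Vp and Vs tomography grids yields the Poisson's ratio image described.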
MAIL LOG, program summary and specifications
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The summary and specifications for obtaining the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
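The four modes described above map naturally onto simple operations over a keyed record store; a toy sketch (the class, method, and field names are invented for illustration, not from the SPADS implementation):

```python
class MailLog:
    """Toy sketch of the four MAIL LOG modes (all names invented)."""

    def __init__(self):
        self.records, self.archived = {}, {}

    def input(self, key, record):        # mode 1: put a new record in the data base
        self.records[key] = record

    def revise(self, key, **changes):    # mode 2: change an existing record
        self.records[key].update(changes)

    def search(self, **criteria):        # mode 3: find records matching all criteria
        return [k for k, r in self.records.items()
                if all(r.get(f) == v for f, v in criteria.items())]

    def archive(self, key):              # mode 4: store an existing record away
        self.archived[key] = self.records.pop(key)

db = MailLog()
db.input("001", {"sender": "LaRC", "status": "open"})
db.revise("001", status="closed")
print(db.search(status="closed"))        # -> ['001']
db.archive("001")
```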
Collaborative double robust targeted maximum likelihood estimation.
van der Laan, Mark J; Gruber, Susan
2010-01-01
Collaborative double robust targeted maximum likelihood estimators represent a fundamental further advance over standard targeted maximum likelihood estimators of a pathwise differentiable parameter of a data generating distribution in a semiparametric model, introduced in van der Laan, Rubin (2006). The targeted maximum likelihood approach involves fluctuating an initial estimate of a relevant factor (Q) of the density of the observed data, in order to make a bias/variance tradeoff targeted towards the parameter of interest. The fluctuation involves estimation of a nuisance parameter portion of the likelihood, g. TMLE has been shown to be consistent and asymptotically normally distributed (CAN) under regularity conditions, when either one of these two factors of the likelihood of the data is correctly specified, and it is semiparametric efficient if both are correctly specified. In this article we provide a template for applying collaborative targeted maximum likelihood estimation (C-TMLE) to the estimation of pathwise differentiable parameters in semi-parametric models. The procedure creates a sequence of candidate targeted maximum likelihood estimators based on an initial estimate for Q coupled with a succession of increasingly non-parametric estimates for g. In a departure from current state of the art nuisance parameter estimation, C-TMLE estimates of g are constructed based on a loss function for the targeted maximum likelihood estimator of the relevant factor Q that uses the nuisance parameter to carry out the fluctuation, instead of a loss function for the nuisance parameter itself. Likelihood-based cross-validation is used to select the best estimator among all candidate TMLE estimators of Q(0) in this sequence. A penalized-likelihood loss function for Q is suggested when the parameter of interest is borderline-identifiable. We present theoretical results for "collaborative double robustness," demonstrating that the collaborative targeted maximum
Enhanced carbon-oxygen log interpretations using supplemental log curves
Wyatt, D.F. Jr.; Jacobson, L.A.; Fox, P.
1994-12-31
Supplemental measurements from induced nuclear spectrometry tools are examined to demonstrate what additional information they provide about the well and reservoir conditions. Logs in shut-in wells from Indonesia provide examples of oxygen activation measurements showing cross-flow from one reservoir to another via open perforations. Leaking squeezed perforations were also observed. An example from Alaska shows radioactive scale build-up in the casing which spectral analysis identifies as a mixture of uranium and thorium salts. Another log, where the casing fluid was replaced with crude oil, demonstrates a technique for identifying cement channels. Logs from Nigeria comparing oil saturation estimates before and after a squeeze operation illustrate the effect of casing fluid flushing of the formation through open perforations. Understanding the diagnostic character of these curves leads to higher confidence in the overall log interpretation process.
Maximum likelihood clustering with dependent feature trees
NASA Technical Reports Server (NTRS)
Chittineni, C. B. (Principal Investigator)
1981-01-01
The decomposition of the mixture density of the data into its normal component densities is considered. The densities are approximated with first-order dependent feature trees using criteria of mutual information and distance measures. Expressions are presented for the criteria when the densities are Gaussian. By defining different types of nodes in a general dependent feature tree, maximum likelihood equations are developed for the estimation of parameters using fixed-point iterations. The field structure of the data is also taken into account in developing maximum likelihood equations. Experimental results from the processing of remotely sensed multispectral scanner imagery data are included.
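The simplest instance of such a decomposition, stripped of the dependent-feature-tree structure, is fitting a one-dimensional two-component normal mixture by fixed-point (EM) iteration; a sketch on synthetic data:

```python
import numpy as np

def em_gmm_1d(x, n_iter=200):
    """Decompose a 1-D mixture density into two normal components by EM."""
    mu = np.array([x.min(), x.max()], dtype=float)      # crude initialisation
    sigma = np.array([x.std(), x.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step (the fixed-point update): weights, means, standard deviations
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-9
    return w, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
w, mu, sigma = em_gmm_1d(x)
```

The recovered means land near -3 and 3 with roughly equal weights; the paper's contribution is doing this for multivariate densities with tree-structured feature dependence.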
Park, Chung Gyoo; Son, Joon-Ki; Lee, Byung-Ho; Cho, Jin Hoon; Ren, Yonglin
2014-12-01
Bursaphelenchus xylophilus, commonly known as the pinewood nematode, is a quarantine pest and is most often associated with beetles of the genus Monochamus, the pine sawyers, particularly Monochamus alternatus. Long-distance dispersal of the nematode and its vectors has led to widespread losses in pine forests. Two fumigation trials were conducted for treatment of logs naturally infested with both M. alternatus and B. xylophilus. The logs were treated with ethanedinitrile or metam sodium at low temperature (-7 to 25.7°C and -3.7 to 23.1°C) for 3-d exposures in winter and early spring. Fumigation with ethanedinitrile at concentrations of 48, 68, 97, and 158 g/m³ resulted in 34.6-58.3, 91.5-97.2, 100, and 100% mortality for M. alternatus and 88.4, 77.9, 96.4, and 98.0% mortality for B. xylophilus, respectively. With metam sodium fumigation at a dose rate of 1162 g/m³, 100% of M. alternatus and 97.4% of B. xylophilus were killed. These results suggest that 97 g/m³ of ethanedinitrile is adequate for complete control of M. alternatus in pine wood, while >158 g/m³ is required for eradication of B. xylophilus in low-temperature fumigation, with 158 g/m³ controlling >98% of B. xylophilus in winter or spring. Ethanedinitrile therefore has great potential for treatment of fresh pine logs to manage the nematodes and the vector insects at low temperature. PMID:26470069
Mohaghegh, S.; Balan, B.; Ameri, S.
1995-12-31
The ultimate test for any technique that claims permeability prediction from well log data is accurate and verifiable prediction of permeability for wells from which only the well log data are available. So far, all the available models and techniques have been tried on data that include both well logs and the corresponding permeability values; this approach is at best nothing more than linear or nonlinear curve fitting. The objective of this paper is to test the capability of the most promising of these techniques in independent prediction of permeability (where the corresponding permeability values are not available or have not been used in development of the model) in a heterogeneous formation. These techniques are "Multiple Regression" and "Virtual Measurements using Artificial Neural Networks." For the purposes of this study, several wells from a heterogeneous formation in West Virginia were selected, for which well log data and corresponding permeability values were available. One well was set aside, the techniques were applied to the data from the remaining wells, and a permeability model for the field was developed. The model was then applied to the well that had been separated from the rest of the data, and the results were compared. This approach tests the generalization power of each technique. The results show that although Multiple Regression provides acceptable results for wells that were used during model development (good curve fitting), it lacks consistent generalization capability, meaning that it does not perform as well on data it has not been exposed to (the data from the well that was put aside). On the other hand, the Virtual Measurement technique provides steady generalization power: it is able to perform the permeability prediction task even for wells with no prior exposure to their permeability profiles.
Franco-Pedroso, Javier; Ramos, Daniel; Gonzalez-Rodriguez, Joaquin
2016-01-01
In forensic science, trace evidence found at a crime scene and on a suspect has to be evaluated from the measurements performed on it, usually in the form of multivariate data (for example, several chemical compounds or physical characteristics). In order to assess the strength of that evidence, the likelihood ratio framework is being increasingly adopted. Several methods have been derived to obtain likelihood ratios directly from univariate or multivariate data by modelling both the variation appearing between observations (or features) coming from the same source (within-source variation) and that appearing between observations coming from different sources (between-source variation). In the widely used multivariate kernel likelihood ratio, the within-source distribution is assumed to be normally distributed and constant among different sources, and the between-source variation is modelled through a kernel density function (KDF). In order to better fit the observed distribution of the between-source variation, this paper presents a different approach in which a Gaussian mixture model (GMM) is used instead of a KDF. As will be shown, this approach provides better-calibrated likelihood ratios, as measured by the log-likelihood-ratio cost (Cllr), in experiments performed on freely available forensic datasets involving different types of trace evidence: inks, glass fragments and car paints. PMID:26901680
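The Cllr measure used above to score calibration has a simple closed form; a minimal sketch (the LR values fed in are invented toy numbers):

```python
import numpy as np

def cllr(lr_same, lr_diff):
    """Log-likelihood-ratio cost: average information loss of a set of LRs for
    same-source and different-source comparisons (lower is better)."""
    same = np.log2(1.0 + 1.0 / np.asarray(lr_same, dtype=float))
    diff = np.log2(1.0 + np.asarray(lr_diff, dtype=float))
    return 0.5 * (same.mean() + diff.mean())

# Uninformative LRs (always 1) score exactly 1 bit; well-oriented LRs score less:
print(cllr([1.0, 1.0], [1.0, 1.0]))              # -> 1.0
print(cllr([100.0, 50.0], [0.01, 0.02]) < 1.0)   # -> True
```

A miscalibrated system (for example, large LRs on different-source pairs) drives Cllr above 1, which is what the GMM-versus-KDF comparison in the paper is measuring.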
NASA Astrophysics Data System (ADS)
Fu, Qiang; Luk, Wai-Shing; Tao, Jun; Zeng, Xuan; Cai, Wei
In this paper, a novel intra-die spatial correlation extraction method referred to as MLEMTC (Maximum Likelihood Estimation for Multiple Test Chips) is presented. In the MLEMTC method, a joint likelihood function is formulated by multiplying the set of individual likelihood functions for all test chips. This joint likelihood function is then maximized to extract a unique group of parameter values of a single spatial correlation function, which can be used for statistical circuit analysis and design. Moreover, to deal with the purely random component and measurement error contained in measurement data, the spatial correlation function combined with the correlation of white noise is used in the extraction, which significantly improves the accuracy of the extraction results. Furthermore, an LU decomposition based technique is developed to calculate the log-determinant of the positive definite matrix within the likelihood function, which solves the numerical stability problem encountered in the direct calculation. Experimental results have shown that the proposed method is efficient and practical.
New materials for fireplace logs
NASA Technical Reports Server (NTRS)
Kieselback, D. J.; Smock, A. W.
1971-01-01
Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.
Growing local likelihood network: Emergence of communities
NASA Astrophysics Data System (ADS)
Chen, S.; Small, M.
2015-10-01
In many real situations, networks grow only via local interactions. New nodes are added to the growing network with information only pertaining to a small subset of existing nodes. Multilevel marketing, social networks, and disease models can all be depicted as growing networks based on local (network path-length) distance information. In these examples, all nodes whose distance from a chosen center is less than d form a subgraph. Hence, we grow networks with information only from these subgraphs. Moreover, we use a likelihood-based method, where at each step we modify the networks by changing their likelihood to be closer to the expected degree distribution. Combining the local information and the likelihood method, we grow networks that exhibit novel features. We discover that the likelihood method, over certain parameter ranges, can generate networks with highly modulated communities, even when global information is not available. Communities and clusters are abundant in real-life networks, and the method proposed here provides a natural mechanism for the emergence of communities in scale-free networks. In addition, the algorithmic implementation of network growth via local information is substantially faster than global methods and allows for the exploration of much larger networks.
Numerical likelihood analysis of cosmic ray anisotropies
Hojvat, Carlos; et al.
2003-07-02
A numerical likelihood approach to the determination of cosmic ray anisotropies is presented which offers many advantages over other approaches. It allows a wide range of statistically meaningful hypotheses to be compared even when full sky coverage is unavailable, can be readily extended in order to include measurement errors, and makes maximum unbiased use of all available information.
Efficient Bit-to-Symbol Likelihood Mappings
NASA Technical Reports Server (NTRS)
Moision, Bruce E.; Nakashima, Michael A.
2010-01-01
This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings, which represent a significant portion of the complexity of an error-correction-code decoder for high-order constellations. A recent implementation of the algorithm in hardware yielded an 8% reduction in overall area relative to the prior design.
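The symbol-to-bit direction of such a mapping can be illustrated for a Gray-labelled 4-PAM constellation under additive Gaussian noise; the constellation, labelling, and noise model here are illustrative stand-ins, not details of the innovation itself:

```python
import numpy as np

# Gray-labelled 4-PAM: bit pair -> amplitude level (illustrative mapping)
SYMBOLS = {(0, 0): -3.0, (0, 1): -1.0, (1, 1): 1.0, (1, 0): 3.0}

def bit_llrs(y, noise_var):
    """Exact symbol-to-bit likelihood mapping under AWGN with uniform priors:
    LLR_i = log( sum of symbol likelihoods with bit i = 0 )
          - log( sum of symbol likelihoods with bit i = 1 )."""
    lik = {bits: np.exp(-(y - s) ** 2 / (2.0 * noise_var))
           for bits, s in SYMBOLS.items()}
    return [np.log(sum(l for b, l in lik.items() if b[i] == 0) /
                   sum(l for b, l in lik.items() if b[i] == 1))
            for i in range(2)]

llr0, llr1 = bit_llrs(-2.9, noise_var=0.5)   # received near -3: both bits favour 0
```

Decoder hardware typically replaces the exp/log sums with max-log approximations and shared subexpressions; reductions of that kind are where the cited area savings come from.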
Woody, Michael S; Lewis, John H; Greenberg, Michael J; Goldman, Yale E; Ostap, E Michael
2016-07-26
We present MEMLET (MATLAB-enabled maximum-likelihood estimation tool), a simple-to-use and powerful program for utilizing maximum-likelihood estimation (MLE) for parameter estimation from data produced by single-molecule and other biophysical experiments. The program is written in MATLAB and includes a graphical user interface, making it simple to integrate into the existing workflows of many users without requiring programming knowledge. We give a comparison of MLE and other fitting techniques (e.g., histograms and cumulative frequency distributions), showing how MLE often outperforms other fitting methods. The program includes a variety of features. 1) MEMLET fits probability density functions (PDFs) for many common distributions (exponential, multiexponential, Gaussian, etc.), as well as user-specified PDFs without the need for binning. 2) It can take into account experimental limits on the size of the shortest or longest detectable event (i.e., instrument "dead time") when fitting to PDFs. The proper modification of the PDFs occurs automatically in the program and greatly increases the accuracy of fitting the rates and relative amplitudes in multicomponent exponential fits. 3) MEMLET offers model testing (i.e., single-exponential versus double-exponential) using the log-likelihood ratio technique, which shows whether additional fitting parameters are statistically justifiable. 4) Global fitting can be used to fit data sets from multiple experiments to a common model. 5) Confidence intervals can be determined via bootstrapping utilizing parallel computation to increase performance. Easy-to-follow tutorials show how these features can be used. This program packages all of these techniques into a simple-to-use and well-documented interface to increase the accessibility of MLE fitting. PMID:27463130
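The core MLE step, fitting a rate to dwell times by maximising the log-likelihood, can be sketched without MEMLET itself (the data are simulated; for a single exponential the optimum is analytically 1/mean, which makes the numerical fit easy to check):

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
dwell = rng.exponential(scale=2.0, size=1000)   # simulated dwell times, true rate 0.5

def neg_log_likelihood(rate):
    # exponential pdf: rate * exp(-rate * t); minimise the negative log-likelihood
    return -(len(dwell) * np.log(rate) - rate * dwell.sum())

fit = minimize_scalar(neg_log_likelihood, bounds=(1e-3, 10.0), method="bounded")
rate_hat = fit.x   # numerical MLE; agrees with the analytic MLE 1 / mean(dwell)
```

MEMLET layers dead-time-corrected PDFs, multi-component models, log-likelihood-ratio model tests, and bootstrap confidence intervals on top of exactly this kind of optimisation.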
Scientific LogAnalyzer: a web-based tool for analyses of server log files in psychological research.
Reips, Ulf-Dietrich; Stieger, Stefan
2004-05-01
Scientific LogAnalyzer is a platform-independent interactive Web service for the analysis of log files. Scientific LogAnalyzer offers several features not available in other log file analysis tools--for example, organizational criteria and computational algorithms suited to aid behavioral and social scientists. Scientific LogAnalyzer is highly flexible on the input side (unlimited types of log file formats), while strictly keeping a scientific output format. Features include (1) free definition of log file format, (2) searching and marking dependent on any combination of strings (necessary for identifying conditions in experiment data), (3) computation of response times, (4) detection of multiple sessions, (5) speedy analysis of large log files, (6) output in HTML and/or tab-delimited form, suitable for import into statistics software, and (7) a module for analyzing and visualizing drop-out. Several methodological features specifically needed in the analysis of data collected in Internet-based experiments have been implemented in the Web-based tool and are described in this article. A regression analysis with data from 44 log file analyses shows that the size of the log file and the domain name lookup are the two main factors determining the duration of an analysis. It is less than a minute for a standard experimental study with a 2 x 2 design, a dozen Web pages, and 48 participants (ca. 800 lines, including data from drop-outs). The current version of Scientific LogAnalyzer is freely available for small log files. Its Web address is http://genpsylab-logcrunsh.unizh.ch/. PMID:15354696
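The response-time computation mentioned above reduces to differencing timestamps per visitor; a minimal sketch for Common Log Format-style lines (the sample lines are fabricated, and this is not Scientific LogAnalyzer's implementation):

```python
import re
from datetime import datetime

LINE = re.compile(r"(\S+) \S+ \S+ \[([^\]]+)\]")

def response_times(lines):
    """Seconds between consecutive requests from the same host."""
    last, deltas = {}, []
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue   # skip malformed lines rather than fail
        host = m.group(1)
        ts = datetime.strptime(m.group(2), "%d/%b/%Y:%H:%M:%S")
        if host in last:
            deltas.append((ts - last[host]).total_seconds())
        last[host] = ts
    return deltas

log = ['1.2.3.4 - - [01/May/2004:10:00:00] "GET /page1"',
       '1.2.3.4 - - [01/May/2004:10:00:07] "GET /page2"']
print(response_times(log))   # -> [7.0]
```

Grouping by host (or by session cookie) before differencing is also how multiple-session detection and drop-out analysis start.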
Likelihood approaches for proportional likelihood ratio model with right-censored data.
Zhu, Hong
2014-06-30
Regression methods for survival data with right censoring have been extensively studied under semiparametric transformation models such as the Cox regression model and the proportional odds model. However, their practical application could be limited because of possible violation of model assumption or lack of ready interpretation for the regression coefficients in some cases. As an alternative, in this paper, the proportional likelihood ratio model introduced by Luo and Tsai is extended to flexibly model the relationship between survival outcome and covariates. This model has a natural connection with many important semiparametric models such as generalized linear model and density ratio model and is closely related to biased sampling problems. Compared with the semiparametric transformation model, the proportional likelihood ratio model is appealing and practical in many ways because of its model flexibility and quite direct clinical interpretation. We present two likelihood approaches for the estimation and inference on the target regression parameters under independent and dependent censoring assumptions. Based on a conditional likelihood approach using uncensored failure times, a numerically simple estimation procedure is developed by maximizing a pairwise pseudo-likelihood. We also develop a full likelihood approach, and the most efficient maximum likelihood estimator is obtained by a profile likelihood. Simulation studies are conducted to assess the finite-sample properties of the proposed estimators and compare the efficiency of the two likelihood approaches. An application to survival data for bone marrow transplantation patients of acute leukemia is provided to illustrate the proposed method and other approaches for handling non-proportionality. The relative merits of these methods are discussed in concluding remarks. PMID:24500821
Odic, Darko; Im, Hee Yeon; Eisinger, Robert; Ly, Ryan; Halberda, Justin
2016-06-01
A simple and popular psychophysical model, usually described as overlapping Gaussian tuning curves arranged along an ordered internal scale, is capable of accurately describing both human and nonhuman behavioral performance and neural coding in magnitude estimation, production, and reproduction tasks for most psychological dimensions (e.g., time, space, number, or brightness). This model traditionally includes two parameters that determine how a physical stimulus is transformed into a psychological magnitude: (1) an exponent that describes the compression or expansion of the physical signal into the relevant psychological scale (β), and (2) an estimate of the amount of inherent variability (often called internal noise) in the Gaussian activations along the psychological scale (σ). To date, linear slopes on log-log plots have traditionally been used to estimate β, and a completely separate method of averaging coefficients of variation has been used to estimate σ. We provide a respectful, yet critical, review of these traditional methods, and offer a tutorial on a maximum-likelihood estimation (MLE) and a Bayesian estimation method for estimating both β and σ [PsiMLE(β,σ)], coupled with free software that researchers can use to implement it without a background in MLE or Bayesian statistics (R-PsiMLE). We demonstrate the validity, reliability, efficiency, and flexibility of this method through a series of simulations and behavioral experiments, and find the new method to be superior to the traditional methods in all respects. PMID:25987306
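The traditional slope-on-log-log estimate of β criticised above is a one-liner on simulated power-law data (all values are synthetic, and this is the traditional method, not PsiMLE):

```python
import numpy as np

rng = np.random.default_rng(2)
stimulus = np.linspace(1.0, 100.0, 200)
beta_true, sigma = 0.8, 0.1
# power-law transduction with multiplicative log-normal noise (toy model)
response = stimulus ** beta_true * np.exp(rng.normal(0.0, sigma, stimulus.size))

# traditional estimator: slope of the regression line in log-log coordinates
beta_hat = np.polyfit(np.log(stimulus), np.log(response), 1)[0]
```

The MLE approach the paper advocates instead fits β and σ jointly in one likelihood, rather than estimating the slope here and the noise from coefficients of variation in a separate pass.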
Predicting reservoir wettability via well logs
NASA Astrophysics Data System (ADS)
Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang
2016-06-01
Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and piecewise power functions, capillary pressure curves are constructed from conventional logs and a nuclear magnetic resonance (NMR) log, respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radius obtained from the two kinds of capillary pressure curve is calculated to reflect wettability, and a quantitative relationship between the ratio and reservoir wettability is then established. Using low-permeability core sample capillary pressure curves, NMR T2 spectra and contact angle experimental data from the bottom of the Upper Triassic reservoirs in the western Ordos Basin, China, the two models for constructing capillary pressure curves and the predictive wettability model are calibrated. The wettability model is verified against the Amott wettability index and the saturation exponent from resistivity measurements; their determined wettability levels are comparable, indicating that the proposed model is quite reliable. The model also performs well in a field application. Thus, the wettability prediction model proposed in this paper provides an effective tool for formation evaluation, field development and the improvement of oil recovery.
Well Logging and Logging Analysis of UHP metamorphic Rocks in CCSD Main Hole (0-2000m)
NASA Astrophysics Data System (ADS)
Pan, H.; Niu, Y.; Wang, W.; Zhu, L.; Xu, D.; Wu, H.; Li, S.; Luo, M.
2004-12-01
CCSD logging engineering brings together many modern high technologies and employs various advanced logging tools to survey the borehole wall continuously, yielding physical, chemical, geometrical and other in-situ information along the borehole profile. Well logging is therefore one of the most important and pivotal technologies in the CCSD project. The main logging methods in CCSD-MH (0-2000 m) are laterolog (Rd, Rs), gamma ray (GR), natural gamma-ray spectrometry (U, Th, K), density (DEN), photoelectric factor (Pe), compensated neutron (CNL), multipole array acoustic (Vp, Vs, Vst), simultaneous acoustic-resistivity imaging (Star-II), temperature (T), magnetic susceptibility (MS), three-component borehole magnetic and redox potential logs, etc. The various metamorphic rocks can be classified from the logging curves, and their physical parameters can be acquired by analyzing the response characteristics of the different metamorphic rocks and by statistics. Logging cross-plots allow the clustering of the metamorphic rocks' physical properties to be studied. Five lithologic segments can be distinguished from the logging curves. The GR, Th, U and K values of segment 1 are lower than those of segments 3, 4 and 5 but higher than those of segment 2, while its DEN and Pe values are higher than those of segments 3, 4 and 5. The main rocks in segments 1-5 are eclogites, serpentinites, paragneiss, orthogneiss, and eclogites (containing silicon and muscovite), respectively. Eclogite generally contains rutile, silicon, muscovite and other minerals, which have obvious responses on the log curves. Mineralized layers (rutile, ilmenite, pyrite, etc.) can be well demarcated using the DEN, Pe and susceptibility log values; on a rutile mineralized layer, for example, the logging curves show distinctly high density and Pe. The key datum for the synthetic seismic record is wave impedance. In this paper, the AC and DEN curves are used to calculate the
cosmoabc: Likelihood-free inference for cosmology
NASA Astrophysics Data System (ADS)
Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.
2015-05-01
Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
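Rejection ABC, the simplest ancestor of the Population Monte Carlo scheme cosmoabc implements, fits in a few lines; the toy problem here (inferring a Gaussian mean) is illustrative, not a cosmological one:

```python
import numpy as np

rng = np.random.default_rng(3)
observed = rng.normal(5.0, 1.0, 200)      # "data" whose mean we pretend not to know

def simulate(mu):
    """Forward-simulate a mock catalog for parameter value mu."""
    return rng.normal(mu, 1.0, observed.size)

def distance(a, b):
    return abs(a.mean() - b.mean())       # distance between summary statistics

# Rejection ABC: draw from the prior, keep parameters whose mock data land
# close to the observation; the likelihood is never evaluated.
prior_draws = rng.uniform(0.0, 10.0, 20000)
posterior = np.array([mu for mu in prior_draws
                      if distance(simulate(mu), observed) < 0.05])
```

The accepted draws approximate the posterior over the mean; cosmoabc's Population Monte Carlo variant replaces the fixed tolerance with an adaptively shrinking one and adds importance-weighted resampling.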
Spectral likelihood expansions for Bayesian inference
NASA Astrophysics Data System (ADS)
Nagel, Joseph B.; Sudret, Bruno
2016-03-01
A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
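For a conjugate toy problem the whole scheme fits in a short script: expand a Gaussian likelihood in Hermite polynomials orthogonal under a standard-normal prior, then read the evidence and the posterior mean directly off the coefficients (the setup is an illustrative assumption, not an example from the paper):

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

y = 0.5                                        # observation; prior on theta is N(0, 1)
lik = lambda t: np.exp(-0.5 * (y - t) ** 2)    # unnormalised Gaussian likelihood

# Project the likelihood onto probabilists' Hermite polynomials He_k, which are
# orthogonal under the prior: <He_j, He_k> = k! * delta_jk.
nodes, weights = He.hermegauss(40)             # quadrature for weight exp(-t^2/2)
coeff = [np.sum(weights * lik(nodes) * He.hermeval(nodes, [0] * k + [1]))
         / (math.sqrt(2.0 * math.pi) * math.factorial(k))
         for k in range(6)]

evidence = coeff[0]                    # model evidence: integral of lik * prior
posterior_mean = coeff[1] / coeff[0]   # since He_1(t) = t; analytic value is y/2
```

No sampling is involved: once the spectral coefficients are known, moments follow semi-analytically, which is exactly the selling point over Markov chain Monte Carlo.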
Stepwise Signal Extraction via Marginal Likelihood
Du, Chao; Kao, Chu-Lan Michael
2015-01-01
This paper studies the estimation of stepwise signals. To determine the number and locations of change-points of the stepwise signal, we formulate a maximum marginal likelihood estimator, which can be computed with a quadratic cost using dynamic programming. We carry out an extensive investigation on the choice of the prior distribution and study the asymptotic properties of the maximum marginal likelihood estimator. We propose to treat each possible set of change-points equally and adopt an empirical Bayes approach to specify the prior distribution of segment parameters. A detailed simulation study is performed to compare the effectiveness of this method with other existing methods. We demonstrate our method on single-molecule enzyme reaction data and on DNA array CGH data. Our study shows that this method is applicable to a wide range of models and offers appealing results in practice. PMID:27212739
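A hedged sketch of the dynamic-programming idea, substituting a penalized least-squares criterion for the paper's marginal likelihood (the O(n²) recursion structure is the same):

```python
def stepwise_fit(y, penalty):
    # Optimal partition of y into constant segments, minimizing
    # total squared error + penalty * (number of segments).
    n = len(y)
    s, s2 = [0.0], [0.0]
    for v in y:                      # prefix sums -> O(1) segment cost
        s.append(s[-1] + v)
        s2.append(s2[-1] + v * v)
    def sse(i, j):                   # squared error of y[i:j] about its mean
        m = (s[j] - s[i]) / (j - i)
        return (s2[j] - s2[i]) - m * (s[j] - s[i])
    best = [0.0] * (n + 1)
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        best[j], back[j] = min(
            (best[i] + sse(i, j) + penalty, i) for i in range(j))
    cps, j = [], n                   # walk back to recover change-points
    while j > 0:
        j = back[j]
        if j:
            cps.append(j)
    return sorted(cps)

print(stepwise_fit([0.0] * 20 + [5.0] * 20, penalty=1.0))  # → [20]
```

The marginal likelihood criterion of the paper plugs into the same recursion: only the per-segment cost function changes.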
Lienkaemper, James J.; Williams, Patrick L.; Sickler, Robert R.; Fumal, Thomas E.
2005-01-01
This publication makes available a detailed trench log (sheets 1 and 2) of a 110-m trench we excavated in 2004 across a tectonic sag pond in the Hayward fault zone. Also included are revised stratigraphic unit descriptions from this fifth field season of subsurface investigation of the Hayward fault at Tyson's Lagoon (Tule Pond). Preliminary findings based on fieldwork done in 2000 have been published (Lienkaemper and others: data archive, 2002a; report, 2002b), as were the logs and data for 2001-2003 (Lienkaemper and others, 2003, L03). A continuous exposure of the geologic section across the entire pond made in 2004 (Fig. 1, 04A) has revealed some critical miscorrelations of units made in the original on-line version of L03; hence, users of these earlier trench data should use only the 2005 revised version 2.0 of L03 for correlation purposes. Lienkaemper, Williams, and Sickler interpreted the geology and logged the trenches. Fumal did most of the trench photography. The Hayward fault is recognized to be among the most hazardous in the United States (Working Group on California Earthquake Probabilities, 2003). Establishing a chronology of prehistoric or paleoearthquakes is of immediate use in resolving the likelihood of future large earthquakes on the Hayward fault. This document makes available geologic evidence for historical and prehistoric surface-rupturing earthquakes preserved at the site. A second, formal report on our conclusions based on these data is in preparation.
Likelihood-Based Climate Model Evaluation
NASA Technical Reports Server (NTRS)
Braverman, Amy; Cressie, Noel; Teixeira, Joao
2012-01-01
Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. The approach is to quantify the likelihood that a (summary statistic computed from a) set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, one can go further and obtain a posterior distribution of the model given the observations.
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2013-09-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high-magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2014-03-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
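The defining trick of quantile regression forests, leaves that store full empirical distributions rather than means, can be sketched with a forest of depth-one trees on a single predictor. This is far simpler than the QRF algorithm used in the study, and all data and names here are illustrative:

```python
import random
import statistics

def fit_stump(xs, ys):
    # Depth-one regression tree; each leaf keeps ALL of its y-values,
    # not just their mean -- the key idea behind quantile regression forests.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    best = None
    for k in range(1, len(xs)):
        thr = (xs[order[k - 1]] + xs[order[k]]) / 2.0
        left = [ys[i] for i in order[:k]]
        right = [ys[i] for i in order[k:]]
        ml, mr = statistics.fmean(left), statistics.fmean(right)
        sse = (sum((v - ml) ** 2 for v in left)
               + sum((v - mr) ** 2 for v in right))
        if best is None or sse < best[0]:
            best = (sse, thr, left, right)
    _, thr, left, right = best
    return thr, left, right

def qrf_fit(xs, ys, n_trees=50):
    forest = []
    for _ in range(n_trees):       # bootstrap resampling of cases
        idx = [random.randrange(len(xs)) for _ in range(len(xs))]
        forest.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    return forest

def qrf_predict(forest, x, q):
    pooled = []                    # empirical conditional distribution
    for thr, left, right in forest:
        pooled.extend(left if x <= thr else right)
    pooled.sort()
    return pooled[int(q * (len(pooled) - 1))]

# Toy "discharge -> sediment load" data with noise that grows with discharge
random.seed(7)
xs = [random.uniform(0.0, 10.0) for _ in range(200)]
ys = [2.0 * x + random.gauss(0.0, 0.2 + 0.4 * x) for x in xs]
forest = qrf_fit(xs, ys)
lo, hi = qrf_predict(forest, 8.0, 0.1), qrf_predict(forest, 8.0, 0.9)
print(round(lo, 1), round(hi, 1))
```

Because whole conditional distributions are retained, the same fitted forest yields any quantile, which is what lets QRF capture highly variable, hysteretic sediment responses that a single rating curve smooths away.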
Evaluating network models: A likelihood analysis
NASA Astrophysics Data System (ADS)
Wang, Wen-Qiang; Zhang, Qian-Ming; Zhou, Tao
2012-04-01
Many models have been put forward to mimic the evolution of real networked systems. A well-accepted way to judge their validity is to compare the modeling results with real networks with respect to several structural features. Even for a specific real network, we cannot fairly evaluate the goodness of different models, since there are many structural features yet no criterion for selecting and weighting them. Motivated by studies on link prediction algorithms, we propose a unified method to evaluate network models via comparison of the likelihoods of the currently observed network under different models, with the assumption that the higher the likelihood is, the more accurate the model is. We test our method on the real Internet at the Autonomous System (AS) level, and the results suggest that the Generalized Linear Preferential (GLP) model outperforms the Tel Aviv Network Generator (Tang), while both models are better than the Barabási-Albert (BA) and Erdős-Rényi (ER) models. Our method can further be applied to determine the optimal values of parameters that correspond to the maximal likelihood. The experiment indicates that the parameters obtained by our method better capture the characteristics of newly added nodes and links in the AS-level Internet than the original methods in the literature.
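For a model with tractable edge probabilities, the likelihood comparison reduces to evaluating the observed graph under each candidate. A minimal sketch for the Erdős-Rényi case (growth models such as BA or GLP require conditioning on the network's growth history, which the paper's method handles; the counts below are made up):

```python
import math

def er_loglik(n_nodes, n_edges, p):
    # Log-likelihood of an observed simple graph under Erdős-Rényi
    # G(n, p): every node pair is an edge independently with prob. p.
    pairs = n_nodes * (n_nodes - 1) // 2
    return n_edges * math.log(p) + (pairs - n_edges) * math.log(1.0 - p)

n, e = 100, 300                    # toy observed network
p_hat = e / (n * (n - 1) / 2)      # MLE of p given the observation
print(er_loglik(n, e, p_hat) > er_loglik(n, e, 0.2))  # → True
```

Maximizing this likelihood over a model's parameters is exactly the parameter-selection use mentioned at the end of the abstract.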
Maximum likelihood continuity mapping for fraud detection
Hogden, J.
1997-05-01
The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction, two important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real-valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
Likelihood methods for point processes with refractoriness.
Citi, Luca; Ba, Demba; Brown, Emery N; Barbieri, Riccardo
2014-02-01
Likelihood-based encoding models founded on point processes have received significant attention in the literature because of their ability to reveal the information encoded by spiking neural populations. We propose an approximation to the likelihood of a point-process model of neurons that holds under assumptions about the continuous time process that are physiologically reasonable for neural spike trains: the presence of a refractory period, the predictability of the conditional intensity function, and its integrability. These are properties that apply to a large class of point processes arising in applications other than neuroscience. The proposed approach has several advantages over conventional ones. In particular, one can use standard fitting procedures for generalized linear models based on iteratively reweighted least squares while improving the accuracy of the approximation to the likelihood and reducing bias in the estimation of the parameters of the underlying continuous-time model. As a result, the proposed approach can use a larger bin size to achieve the same accuracy as conventional approaches would with a smaller bin size. This is particularly important when analyzing neural data with high mean and instantaneous firing rates. We demonstrate these claims on simulated and real neural spiking activity. By allowing a substantive increase in the required bin size, our algorithm has the potential to lower the barrier to the use of point-process methods in an increasing number of applications. PMID:24206384
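The conventional discrete-time approximation that the paper improves upon treats each bin as an independent Poisson count; a toy sketch with a constant candidate rate (all numbers illustrative):

```python
import math

def binned_loglik(spikes, lam, dt):
    # Conventional discrete-time approximation to the point-process
    # log-likelihood: each bin contributes y*log(lam*dt) - lam*dt
    # (constant terms in the data are dropped).
    return sum(y * math.log(lam * dt) for y in spikes) - lam * dt * len(spikes)

# 1 s of a 20 Hz spike train in 1 ms bins (deterministic toy pattern)
dt = 0.001
spikes = [1 if i % 50 == 0 else 0 for i in range(1000)]   # 20 spikes
print(binned_loglik(spikes, 20.0, dt) > binned_loglik(spikes, 5.0, dt))  # → True
```

This baseline degrades as dt grows relative to the firing rate; the paper's refractoriness-aware approximation is what permits the larger bin sizes claimed in the abstract.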
Likelihood-based modification of experimental crystal structure electron density maps
Terwilliger, Thomas C.
2005-04-16
A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F.sub.h } is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F.sub.h.sup.OBS } if structure factor set {F.sub.h } was correct, and (2) the likelihood that an electron density map resulting from {F.sub.h } is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F.sub.h } is then adjusted to maximize the likelihood of {F.sub.h } for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.
Method for induced polarization logging
Vinegar, H.J.; Waxman, M.H.
1987-04-14
A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R0 to a survey current electrode associated with the tool.
Applying exclusion likelihoods from LHC searches to extended Higgs sectors
NASA Astrophysics Data System (ADS)
Bechtle, Philip; Heinemeyer, Sven; Stål, Oscar; Stefaniak, Tim; Weiglein, Georg
2015-09-01
LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full dataset. In addition to publishing an exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the search and the rate measurements of the SM-like Higgs in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org.
Maximal likelihood correspondence estimation for face recognition across pose.
Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang
2014-10-01
Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image matching-based correspondence learning methods: 1) failure to fully exploit face-specific structure information in correspondence estimation and 2) failure to learn personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on a maximal likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using a linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the use of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in a complex wild environment, i.e., the Labeled Faces in the Wild database. PMID:25163062
[Log-linear model used in the hybrid design of case-parents triad/control-mother dyad].
Peng, W J; Zhang, H; Li, Y X; Li, C Y; Yan, W R
2016-06-01
This study introduced the application of a log-linear model in the hybrid design of case-parents triad/control-mother dyad. Data related to the association between cleft lip with palate (CLP) and methylenetetrahydrofolate reductase (MTHFR) gene A1298C diversity were analyzed. A log-linear model based on likelihood ratio tests (LRTs) was used to analyze the relationships between mother and offspring genotypes and CLP. Data from our study showed that children of mothers carrying the CC genotype presented a lower risk of CLP, compared with children of mothers carrying the AA genotype, with S2=0.45 (95%CI: 0.26-0.79). Offspring carrying the AC genotype presented a lower risk of CLP, compared with offspring carrying the AA genotype, with R1=0.69 (95%CI: 0.48-0.97). However, no other types of relationships were found. The power of the hybrid design was greater than that of the case-parents study (0.86>0.78). The MTHFR A1298C polymorphism seemed to have played an important role in the etiology of cleft lip with palate. Data from the hybrid design and the log-linear model could help researchers to explore the effects of genotypes from both mothers and offspring. This study design presents stronger power than regular case-parents studies and is thus suitable for studies on the etiology of diseases in early life, such as birth defects. PMID:27346122
Using Web Logs in the Science Classroom
ERIC Educational Resources Information Center
Duplichan, Staycle C.
2009-01-01
As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…
Outdoor Education Student Log Book.
ERIC Educational Resources Information Center
Garbutt, Barbara; And Others.
A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…
ERIC Educational Resources Information Center
Miles, Donna
2001-01-01
In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…
Dual spectra well logging system
Nussbaum, T.W.
1982-09-07
A dual spectra well logging system includes a well logging tool which is adapted to pass through a bore hole in an earth formation. The well logging tool includes at least two sensors which sense at least one condition of the earth formation and provide corresponding pulse signals. A circuit connected to the sensors provides a combined pulse signal wherein the pulses of the pulse signal from one sensor have one polarity and the pulses of the pulse signal from the other sensor have the opposite polarity. A circuit applies the combined pulse signal to a well logging cable which conducts the combined pulse signal to the surface of the earth formation. Surface apparatus includes a network connected to the cable which provides control signals in accordance with the polarity of the pulses in the combined pulse signal. A network connected to the cable inverts the combined pulse signal and provides a combined pulse signal and an inverted combined pulse signal. A first switching network receiving the combined pulse signal passes the pulses derived from the pulses of the one polarity in accordance with the control signals to provide a first pulse signal, while a second switching network receiving the inverted combined pulse signal passes the pulses derived from the pulses of the opposite polarity in accordance with the control signals to provide a second pulse signal. An output network processes the two pulse signals to provide an indication of the earth's condition in accordance with the processed pulse signals.
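The polarity-keyed multiplexing idea can be sketched digitally. The patent describes analog switching networks; this is only the recover-by-sign logic, with made-up sample values:

```python
def demultiplex(combined):
    # Split the combined cable signal back into the two sensor pulse
    # trains: sensor A pulses were sent positive, sensor B negative.
    a = [v if v > 0 else 0 for v in combined]
    b = [-v if v < 0 else 0 for v in combined]
    return a, b

sig = [0, 3, 0, -2, 0, 0, 4, -1]
print(demultiplex(sig))  # → ([0, 3, 0, 0, 0, 0, 4, 0], [0, 0, 0, 2, 0, 0, 0, 1])
```

Encoding the two sensors on opposite polarities is what lets a single conductor carry both channels without timing coordination between them.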
Statistical log analysis made practical
Mitchell, W.K.; Nelson, R.J.
1991-06-01
This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.
2015-01-01
We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
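Accommodating a missing class observation amounts to marginalizing that individual's likelihood over class membership; a generic sketch in which the prior plays the role of the population sex ratio (the numbers are illustrative, not the paper's model):

```python
import math

def loglik_missing_class(per_class_loglik, log_prior):
    # Contribution of one individual whose class (e.g. sex) was not
    # observed: marginalize over classes,
    #   log L_i = log( sum_s pi_s * L_i(s) ),
    # computed with a log-sum-exp for numerical stability.
    terms = [lp + ll for lp, ll in zip(log_prior, per_class_loglik)]
    m = max(terms)
    return m + math.log(sum(math.exp(t - m) for t in terms))

# Equal prior sex ratio; class-conditional likelihoods 0.2 and 0.4
ll = loglik_missing_class([math.log(0.2), math.log(0.4)],
                          [math.log(0.5), math.log(0.5)])
print(round(math.exp(ll), 3))  # → 0.3
```

Individuals with known class contribute their class-conditional likelihood directly; only the 107 bears of unknown sex would be handled through this mixture term.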
West Pembina Nisku reef log-core correlations
McFadzean, T.B.
1983-01-01
A study was initiated to correlate log parameters and core data from West Pembina Nisku (D-2) Pinnacle reefs. The primary objective was to derive basic input data for making volumetric estimates of reserves and for providing initial input data for doing enhanced recovery model studies. A secondary objective was to determine if a set of log analysis parameters could be derived that would work universally in the many pinnacle reefs of the West Pembina area. The results of the study indicate that porosity log response in the West Pembina Nisku reefs deviates a fair amount from the standards used in log analysis chart books. A multilinear regression formula for determining porosity was derived which worked well for 18 wells studied, on which there was both log and core data. A brief description of the Nisku reef geology, a description of the methods used in doing the study, and a graphic presentation of the results are included.
A note on the asymptotic distribution of likelihood ratio tests to test variance components.
Visscher, Peter M
2006-08-01
When using maximum likelihood methods to estimate genetic and environmental components of (co)variance, it is common to test hypotheses using likelihood ratio tests, since such tests have desirable asymptotic properties. In particular, the standard likelihood ratio test statistic is assumed asymptotically to follow a χ2 distribution with degrees of freedom equal to the number of parameters tested. Using the relationship between least squares and maximum likelihood estimators for balanced designs, it is shown why the asymptotic distribution of the likelihood ratio test for variance components does not follow a χ2 distribution with degrees of freedom equal to the number of parameters tested when the null hypothesis is true. Instead, the distribution of the likelihood ratio test is a mixture of χ2 distributions with different degrees of freedom. Implications for testing variance components in twin designs and for quantitative trait loci mapping are discussed. The appropriate distribution of the likelihood ratio test statistic should be used in hypothesis testing and model selection. PMID:16899155
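For a single variance component tested at the boundary of the parameter space, the mixture result halves the naive p-value; a sketch using the closed-form survival function of a χ2 variable with 1 degree of freedom:

```python
import math

def chi2_1_sf(x):
    # Survival function of chi-squared with 1 d.f.: P(X > x) = erfc(sqrt(x/2))
    return math.erfc(math.sqrt(x / 2.0))

def boundary_lrt_pvalue(lrt):
    # Testing sigma^2 = 0 puts the null on the parameter-space boundary:
    # the LRT statistic then follows a 50:50 mixture of chi2 with 0 and
    # 1 d.f., so the naive 1-d.f. p-value is halved.
    return 0.5 * chi2_1_sf(lrt) if lrt > 0 else 1.0

print(round(chi2_1_sf(2.706), 3), round(boundary_lrt_pvalue(2.706), 3))  # → 0.1 0.05
```

Using the naive distribution is therefore conservative: a statistic that is nominally significant at 0.10 is actually significant at 0.05 under the correct mixture.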
Model-free linkage analysis using likelihoods
Curtis, D.; Sham, P.C.
1995-09-01
Misspecification of transmission model parameters can produce artifactually high lod scores at small recombination fractions and in multipoint analysis. To avoid this problem, we have tried to devise a test that aims to detect a genetic effect at a particular locus, rather than attempting to estimate the map position of a locus with specified effect. Maximizing likelihoods over transmission model parameters, as well as linkage parameters, can produce seriously biased parameter estimates and so yield tests that lack power for the detection of linkage. However, constraining the transmission model parameters to produce the correct population prevalence largely avoids this problem. For computational convenience, we recommend that the likelihoods under linkage and nonlinkage are independently maximized over a limited set of transmission models, ranging from Mendelian dominant to null effect and from null effect to Mendelian recessive. In order to test for a genetic effect at a given map position, the likelihood under linkage is maximized over admixture, the proportion of families linked. Application to simulated data for a wide range of transmission models in both affected sib pairs and pedigrees demonstrates that the new method is well behaved under the null hypothesis and provides a powerful test for linkage when it is present. This test requires no specification of transmission model parameters, apart from an approximate estimate of the population prevalence. It can be applied equally to sib pairs and pedigrees, and, since it does not diminish the lod score at test positions very close to a marker, it is suitable for application to multipoint data. 24 refs., 1 fig., 4 tabs.
Estimating the Likelihood of Extreme Seismogenic Tsunamis
NASA Astrophysics Data System (ADS)
Geist, E. L.
2011-12-01
Because of high levels of destruction to coastal communities and critical facilities from recent tsunamis, estimating the likelihood of extreme seismogenic tsunamis has gained increased attention. Seismogenic tsunami generating capacity is directly related to the scalar seismic moment of the earthquake. As such, earthquake size distributions and recurrence can inform the likelihood of tsunami occurrence. The probability of extreme tsunamis is dependent on how the right-hand tail of the earthquake size distribution is specified. As evidenced by the 2004 Sumatra-Andaman and 2011 Tohoku earthquakes, it is likely that there is insufficient historical information to estimate the maximum earthquake magnitude (Mmax) for any specific subduction zone. Mmax may in fact not be a useful concept for subduction zones of significant length. Earthquake size distributions with a soft corner moment appear more consistent with global observations. Estimating the likelihood of extreme local tsunami runup is complicated by the fact that there is significant uncertainty in the scaling relationship between seismic moment and maximum local tsunami runup. This uncertainty arises from variations in source parameters specific to tsunami generation and the near-shore hydrodynamic response. The primary source effect is how slip is distributed along the fault relative to the overlying water depth. For high slip beneath deep water, shoaling amplification of the tsunami increases substantially according to Green's Law, compared to an equivalent amount of slip beneath shallow water. Both stochastic slip models and dynamic rupture models of tsunamigenic earthquakes are explored in a probabilistic context. The nearshore hydrodynamic response includes attenuating mechanisms, such as wave breaking, and amplifying mechanisms, such as constructive interference of trapped and non-trapped modes. Probabilistic estimates of extreme tsunamis are therefore site specific, as indicated by significant variations
Modelling autoimmune rheumatic disease: a likelihood rationale.
Ulvestad, E
2003-07-01
Immunoglobulins (Igs) and autoantibodies are commonly tested in sera from patients with suspected rheumatic disease. To evaluate the clinical utility of the tests in combination, we investigated sera from 351 patients with autoimmune rheumatic disease (ARD), i.e. rheumatoid arthritis (RA), systemic lupus erythematosus (SLE) and Sjögren's syndrome (SS), and 96 patients with nonautoimmune rheumatic disease (NAD) (fibromyalgia, osteoarthritis, etc.). Antinuclear antibodies (ANA), rheumatoid factor (RF), antibodies against DNA and extractable nuclear antigens (anti-ENA), IgG, IgA and IgM were measured for all patients. Logistic regression analysis of the test results was used to calculate each patient's probability of belonging to the ARD or NAD group, as well as likelihood ratios for disease. Test accuracy was investigated using receiver-operating characteristic (ROC) plots and nonparametric ROC analysis. Neither concentrations of IgG, IgA, IgM, anti-DNA nor anti-ENA gave a significant effect on diagnostic outcome. Probabilities for disease and likelihood ratios calculated by combining RF and ANA performed significantly better at predicting ARD than utilization of the diagnostic tests in isolation (P < 0.001). At a cut-off level of P = 0.73 and likelihood ratio = 1, the logistic model gave a specificity of 93% and a sensitivity of 75% for the differentiation between ARD and NAD. When compared at the same level of specificity, ANA gave a sensitivity of 37% and RF a sensitivity of 56.6%. Dichotomizing ANA and RF as positive or negative did not reduce the performance characteristics of the model. Combining results obtained from serological analysis of ANA and RF according to this model will increase the diagnostic utility of the tests in rheumatological practice. PMID:12828565
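The combination of two dichotomized tests via logistic regression can be sketched with made-up counts (purely illustrative, not the paper's data; any GLM routine would do in place of the hand-rolled gradient ascent):

```python
import math

def fit_logistic(X, y, lr=0.1, steps=2000):
    # Plain stochastic gradient-ascent logistic regression.
    # w[0] is the intercept; w[1:] are the feature coefficients.
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(steps):
        for xi, yi in zip(X, y):
            z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            g = yi - p
            w[0] += lr * g
            for j, a in enumerate(xi):
                w[j + 1] += lr * g * a
    return w

def predict(w, xi):
    z = w[0] + sum(a * b for a, b in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic (ANA, RF) -> disease label data, mimicking the idea that the
# two tests combined separate ARD (1) from NAD (0) better than either
# alone.  Counts are invented for illustration.
data = ([([1, 1], 1)] * 60 + [([1, 0], 1)] * 20 + [([0, 1], 1)] * 15
        + [([0, 0], 1)] * 5 + [([1, 1], 0)] * 3 + [([1, 0], 0)] * 10
        + [([0, 1], 0)] * 12 + [([0, 0], 0)] * 75)
X = [d[0] for d in data]
y = [d[1] for d in data]
w = fit_logistic(X, y)
print(round(predict(w, [1, 1]), 2), round(predict(w, [0, 0]), 2))
```

The fitted probability for each (ANA, RF) pattern is the patient's probability of belonging to the disease group, and the ratio of the class-conditional densities at that pattern gives the likelihood ratio the abstract refers to.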
Intelligence's likelihood and evolutionary time frame
NASA Astrophysics Data System (ADS)
Bogonovich, Marc
2011-04-01
This paper outlines hypotheses relevant to the evolution of intelligent life and encephalization in the Phanerozoic. If general principles are inferable from patterns of Earth life, implications could be drawn for astrobiology. Many of the outlined hypotheses, relevant data, and associated evolutionary and ecological theory are not frequently cited in astrobiological journals. Thus opportunity exists to evaluate reviewed hypotheses with an astrobiological perspective. A quantitative method is presented for testing one of the reviewed hypotheses (hypothesis i; the diffusion hypothesis). Questions are presented throughout, which illustrate that the question of intelligent life's likelihood can be expressed as multiple, broadly ranging, more tractable questions.
Score-based likelihood ratios for handwriting evidence.
Hepler, Amanda B; Saunders, Christopher P; Davis, Linda J; Buscaglia, JoAnn
2012-06-10
Score-based approaches for computing forensic likelihood ratios are becoming more prevalent in the forensic literature. When two items of evidential value are entangled via a score function, several nuances arise when attempting to model the score behavior under the competing source-level propositions. Specific assumptions must be made in order to appropriately model the numerator and denominator probability distributions. This process is fairly straightforward for the numerator of the score-based likelihood ratio, entailing the generation of a database of scores obtained by pairing items of evidence from the same source. However, this process presents ambiguities for the denominator database generation - in particular, how best to generate a database of scores between two items of different sources. Many alternatives have appeared in the literature, three of which we will consider in detail. They differ in their approach to generating denominator databases, by pairing (1) the item of known source with randomly selected items from a relevant database; (2) the item of unknown source with randomly generated items from a relevant database; or (3) two randomly generated items. When the two items differ in type, perhaps one having higher information content, these three alternatives can produce very different denominator databases. While each of these alternatives has appeared in the literature, the decision of how to generate the denominator database is often made without calling attention to the subjective nature of this process. In this paper, we compare each of the three methods (and the resulting score-based likelihood ratios), which can be thought of as three distinct interpretations of the denominator proposition. Our goal in performing these comparisons is to illustrate the effect that subtle modifications of these propositions can have on inferences drawn from the evidence evaluation procedure. The study was performed using a data set composed of cursive writing
How much to trust the senses: likelihood learning.
Sato, Yoshiyuki; Kording, Konrad P
2014-01-01
Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
Maximum likelihood decoding of Reed Solomon Codes
Sudan, M.
1996-12-31
We present a randomized algorithm which takes as input n distinct points (x_i, y_i), i = 1, ..., n, from F × F (where F is a field) and integer parameters t and d, and returns a list of all univariate polynomials f over F in the variable x of degree at most d which agree with the given set of points in at least t places (i.e., y_i = f(x_i) for at least t values of i), provided t = Ω(√(nd)). The running time is bounded by a polynomial in n. This immediately provides a maximum likelihood decoding algorithm for Reed Solomon Codes, which works in a setting with a larger number of errors than any previously known algorithm. To the best of our knowledge, this is the first efficient (i.e., polynomial time bounded) algorithm which provides some maximum likelihood decoding for any efficient (i.e., constant or even polynomial rate) code.
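Sudan's algorithm itself is randomized and runs in polynomial time; as a toy illustration of the decoding problem it solves, a brute-force search over low-degree polynomials (feasible only for tiny fields; the point values and parameter names below are made up) recovers every polynomial agreeing with the received word in at least t places:

```python
from itertools import product

def list_decode_bruteforce(points, d, t, p):
    """Brute-force version of the list-decoding problem: return every
    polynomial over GF(p) of degree <= d that agrees with at least t of
    the given (x_i, y_i) points. Exponential in d; toy scale only."""
    results = []
    for coeffs in product(range(p), repeat=d + 1):  # c_0 + c_1*x + ... + c_d*x^d
        agree = sum(
            1 for x, y in points
            if sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p == y
        )
        if agree >= t:
            results.append(coeffs)
    return results

# Received word for f(x) = 2 + 3x over GF(5), with one corrupted position.
pts = [(0, 2), (1, 1), (2, 3), (3, 1), (4, 4)]  # (1, 1) is the error; f(1) = 0
codewords = list_decode_bruteforce(pts, d=1, t=4, p=5)  # -> [(2, 3)]
```

Since any two distinct degree-1 polynomials agree in at most one point, only the true codeword can match 4 of the 5 received symbols, so the list contains exactly one entry.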
Sensor registration using airlanes: maximum likelihood solution
NASA Astrophysics Data System (ADS)
Ong, Hwa-Tung
2004-01-01
In this contribution, the maximum likelihood estimation of sensor registration parameters, such as range, azimuth and elevation biases in radar measurements, using airlane information is proposed and studied. The motivation for using airlane information for sensor registration is that it is freely available as a source of reference and it provides an alternative to conventional techniques that rely on synchronised and correctly associated measurements from two or more sensors. In the paper, the problem is first formulated in terms of a measurement model that is a nonlinear function of the unknown target state and sensor parameters, plus sensor noise. A probabilistic model of the target state is developed based on airlane information. The maximum likelihood and also maximum a posteriori solutions are given. The Cramer-Rao lower bound is derived and simulation results are presented for the case of estimating the biases in radar range, azimuth and elevation measurements. The accuracy of the proposed method is compared against the Cramer-Rao lower bound and that of an existing two-sensor alignment method. It is concluded that sensor registration using airlane information is a feasible alternative to existing techniques.
Sensor registration using airlanes: maximum likelihood solution
NASA Astrophysics Data System (ADS)
Ong, Hwa-Tung
2003-12-01
In this contribution, the maximum likelihood estimation of sensor registration parameters, such as range, azimuth and elevation biases in radar measurements, using airlane information is proposed and studied. The motivation for using airlane information for sensor registration is that it is freely available as a source of reference and it provides an alternative to conventional techniques that rely on synchronised and correctly associated measurements from two or more sensors. In the paper, the problem is first formulated in terms of a measurement model that is a nonlinear function of the unknown target state and sensor parameters, plus sensor noise. A probabilistic model of the target state is developed based on airlane information. The maximum likelihood and also maximum a posteriori solutions are given. The Cramer-Rao lower bound is derived and simulation results are presented for the case of estimating the biases in radar range, azimuth and elevation measurements. The accuracy of the proposed method is compared against the Cramer-Rao lower bound and that of an existing two-sensor alignment method. It is concluded that sensor registration using airlane information is a feasible alternative to existing techniques.
CORA: Emission Line Fitting with Maximum Likelihood
NASA Astrophysics Data System (ADS)
Ness, Jan-Uwe; Wichmann, Rainer
2011-12-01
The advent of pipeline-processed data both from space- and ground-based observatories often obviates the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum, and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory, choosing the Ne IX triplet around 13.5 Å for analysis.
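CORA's fixed-point equations are not reproduced here; as a minimal sketch of the underlying principle (a Poisson log-likelihood maximized over a line amplitude; the Gaussian profile, bin grid, and all numeric values are illustrative assumptions, not CORA's actual interface):

```python
import math

def poisson_loglike(counts, model):
    """Poisson log-likelihood of observed counts given expected model counts.
    The constant log(k!) term is dropped; it does not affect the maximum."""
    return sum(k * math.log(mu) - mu for k, mu in zip(counts, model))

def fit_amplitude(counts, profile, background, lo=0.0, hi=50.0, steps=2000):
    """Grid-search the line amplitude A maximizing the Poisson log-likelihood
    of the model background + A * profile."""
    best_a, best_ll = lo, -math.inf
    for i in range(steps + 1):
        a = lo + (hi - lo) * i / steps
        ll = poisson_loglike(counts, [background + a * p for p in profile])
        if ll > best_ll:
            best_a, best_ll = a, ll
    return best_a

# Gaussian line profile on a coarse wavelength grid over a weak flat background.
grid = [i * 0.1 - 1.0 for i in range(21)]
profile = [math.exp(-0.5 * (x / 0.2) ** 2) for x in grid]
true_amp, bkg = 12.0, 0.5
counts = [bkg + true_amp * p for p in profile]  # noise-free "observed" counts
est = fit_amplitude(counts, profile, bkg)       # recovers the true amplitude
```

With noise-free counts the log-likelihood is strictly concave in the amplitude and peaks at the true value, so the grid search lands on it exactly; real low-count spectra would scatter the estimate around it.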
CORA - emission line fitting with Maximum Likelihood
NASA Astrophysics Data System (ADS)
Ness, J.-U.; Wichmann, R.
2002-07-01
The advent of pipeline-processed data both from space- and ground-based observatories often obviates the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum, and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory, choosing the Ne IX triplet around 13.5 Å for analysis.
Temperature compensated well logging tool
Riedesel, R.G.; Nussbaum, T.W.; Warren, W.F.
1984-01-24
A well logging tool adapted for use in a borehole traversing an earth formation includes at least one sensor sensing at least one characteristic of the earth formation and providing a corresponding characteristic signal. Another sensor senses the ambient temperature and provides a corresponding temperature signal. An output circuit provides a temperature compensated output signal corresponding to the sensed characteristic of the earth formation in accordance with the temperature signal and the characteristic signal.
Chemical logging of geothermal wells
Allen, C.A.; McAtee, R.E.
The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
Chemical logging of geothermal wells
Allen, Charles A.; McAtee, Richard E.
1981-01-01
The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
Audit Log for Forensic Photography
NASA Astrophysics Data System (ADS)
Neville, Timothy; Sorell, Matthew
We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
Developmental Changes in Children's Understanding of Future Likelihood and Uncertainty
ERIC Educational Resources Information Center
Lagattuta, Kristin Hansen; Sayfan, Liat
2011-01-01
Two measures assessed 4-10-year-olds' and adults' (N = 201) understanding of future likelihood and uncertainty. In one task, participants sequenced sets of event pictures varying by one physical dimension according to increasing future likelihood. In a separate task, participants rated characters' thoughts about the likelihood of future events,…
ERIC Educational Resources Information Center
Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike
2011-01-01
It has been known for some time that item response theory (IRT) models may exhibit a likelihood function of a respondent's ability which may have multiple modes, flat modes, or both. These conditions, often associated with guessing of multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…
TIOGA LAKE, HALL NATURAL AREA, LOG CABIN-SADDLEBAG, AND HORSE MEADOWS ROADLESS AREAS, CALIFORNIA.
Seitz, J.F.; Federspiel, F.E.
1984-01-01
Studies of the geology and mineral resources of the Tioga Lake, Hall Natural Area, Log Cabin-Saddlebag, and Horse Meadows Roadless Areas in California indicate that parts of the Log Cabin-Saddlebag and Hall Natural Roadless Areas have a substantiated resource potential for gold and (or) silver resources, and a probable potential for tungsten and molybdenum resources. Tioga Lake Roadless Area has little likelihood for the occurrence of mineral resources and the Horse Meadows Roadless Area has a probable potential for low-grade tungsten, gold, and (or) silver resources. The geologic terrane in the roadless areas precludes the occurrence of organic fuel resources.
Approximate maximum likelihood decoding of block codes
NASA Technical Reports Server (NTRS)
Greenberger, H. J.
1979-01-01
Approximate maximum likelihood decoding algorithms, based upon selecting a small set of candidate code words with the aid of the estimated probability of error of each received symbol, can give performance close to optimum with a reasonable amount of computation. By combining the best features of various algorithms and taking care to perform each step as efficiently as possible, a decoding scheme was developed which can decode codes which have better performance than those presently in use and yet not require an unreasonable amount of computation. The discussion of the details and tradeoffs of presently known efficient optimum and near optimum decoding algorithms leads, naturally, to the one which embodies the best features of all of them.
Groups, information theory, and Einstein's likelihood principle
NASA Astrophysics Data System (ADS)
Sicuro, Gabriele; Tempesta, Piergiulio
2016-04-01
We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.
Modelling default and likelihood reasoning as probabilistic
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
Groups, information theory, and Einstein's likelihood principle.
Sicuro, Gabriele; Tempesta, Piergiulio
2016-04-01
We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts. PMID:27176234
A Maximum-Likelihood Approach to Force-Field Calibration.
Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam
2015-09-28
A new approach to the calibration of the force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given in the form of the simulated conformations only, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution to the target function from this conformation. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. ( J. Phys. Chem. B 2012 , 116 , 6898 - 6907 ), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2
A maximum likelihood framework for protein design
Kleinman, Claudia L; Rodrigue, Nicolas; Bonnard, Cécile; Philippe, Hervé; Lartillot, Nicolas
2006-01-01
Background The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces shaping protein sequences, and
Avian responses to selective logging shaped by species traits and logging practices.
Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin
2015-06-01
Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673
Tucker Wireline Open Hole Wireline Logging
Milliken, M.
2002-05-23
The Tucker Wireline unit ran a suite of open hole logs right behind the RMOTC logging contractor for comparison purposes. The tools included Dual Laterolog, Phased Induction, BHC Sonic, and Density-Porosity.
Likelihood approach for evaluating bioequivalence of highly variable drugs
Du, Liping; Choi, Leena
2015-01-01
Bioequivalence (BE) is required for approving a generic drug. The Two-One-Sided-Test (TOST, or the 90% confidence interval approach) has been used as the mainstream methodology to test average BE (ABE) on pharmacokinetic parameters such as the area under the blood concentration-time curve and the peak concentration. However, for highly variable drugs (%CV > 30%), it is difficult to demonstrate ABE in a standard cross-over study with the typical number of subjects using the TOST due to lack of power. Recently, the US Food and Drug Administration and the European Medicines Agency recommended similar but not identical reference scaled average bioequivalence (RSABE) approaches to address this issue. Although the power is improved, the new approaches may not guarantee a high level of confidence for the true difference between two drugs at the ABE boundaries. It is also difficult for these approaches to address the issues of population BE (PBE) and individual BE (IBE). We advocate the use of a likelihood approach for representing and interpreting BE data as evidence. Using example data from a full replicate 2 × 4 cross-over study, we demonstrate how to present evidence using the profile likelihoods for the mean difference and standard deviation ratios of the two drugs for the pharmacokinetic parameters. With this approach, we present evidence for PBE and IBE as well as ABE within a unified framework. Our simulations show that the operating characteristics of the proposed likelihood approach are comparable with the RSABE approaches when the same criteria are applied. PMID:25408492
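The full replicate-design analysis is not reproduced here; as a toy sketch of the evidential idea (a normalized profile likelihood for a mean difference under a normal model, with the standard deviation profiled out; the function name and the paired log-differences below are illustrative, not the authors' data):

```python
def profile_likelihood_mean(data, mu):
    """Normalized profile likelihood L_p(mu) / L_p(mu_hat) for a normal mean,
    with the variance profiled out; equals (SS(mu_hat) / SS(mu)) ** (n / 2),
    where SS(m) is the sum of squared deviations from m."""
    n = len(data)
    mu_hat = sum(data) / n
    ss_hat = sum((x - mu_hat) ** 2 for x in data)
    ss_mu = sum((x - mu) ** 2 for x in data)
    return (ss_hat / ss_mu) ** (n / 2)

# Hypothetical paired log-differences (test minus reference formulation).
diffs = [0.05, -0.02, 0.10, 0.03, -0.01, 0.06, 0.04, 0.02]
# Evidence for "no mean difference" relative to the best-supported value:
lr_at_zero = profile_likelihood_mean(diffs, 0.0)
```

The curve equals 1 at the maximum-likelihood estimate and falls off on either side; plotting it over a grid of mean differences against the BE limits gives the kind of direct evidential display the abstract advocates.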
Maximum likelihood analysis of bubble incidence for mixed gas diving.
Tikuisis, P; Gault, K; Carrod, G
1990-03-01
The method of maximum likelihood has been applied to predict the incidence of bubbling in divers for both air and helium diving. Data were obtained from 108 air man-dives and 622 helium man-dives conducted experimentally in a hyperbaric chamber. Divers were monitored for bubbles using Doppler ultrasonics during the period from surfacing until approximately 2 h after surfacing. Bubble grades were recorded according to the K-M code, and the maximum value in the precordial region for each diver was used in the likelihood analysis. Prediction models were based on monoexponential gas kinetics using one and two parallel-compartment configurations. The model parameters were of three types: gas kinetics, gas potency, and compartment gain. When the potency of the gases was not distinguished, the risk criterion used was inherently based on the gas supersaturation ratio, otherwise it was based on the potential bubble volume. The two-compartment model gave a significantly better prediction than the one-compartment model only if the kinetics of nitrogen and helium were distinguished. A further significant improvement with the two-compartment model was obtained when the potency of the two gases was distinguished, thereby making the potential bubble volume criterion a better choice than the gas pressure criterion. The results suggest that when the method of maximum likelihood is applied for the prediction of the incidence of bubbling, more than one compartment should be used and if more than one is used consideration should be given to distinguishing the potencies of the inert gases. PMID:2181767
Limits on Log Odds Ratios for Unidimensional Item Response Theory Models
ERIC Educational Resources Information Center
Haberman, Shelby J.; Holland, Paul W.; Sinharay, Sandip
2007-01-01
Bounds are established for log odds ratios (log cross-product ratios) involving pairs of items for item response models. First, expressions for bounds on log odds ratios are provided for one-dimensional item response models in general. Then, explicit bounds are obtained for the Rasch model and the two-parameter logistic (2PL) model. Results are…
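The bounds themselves depend on the item response model; as a minimal sketch of the quantity being bounded (the empirical log cross-product ratio for a pair of dichotomous items; the 2x2 table counts are hypothetical):

```python
import math

def log_odds_ratio(n11, n10, n01, n00):
    """Log cross-product ratio for a 2x2 table over an item pair:
    log[(n11 * n00) / (n10 * n01)], where n11 counts examinees answering
    both items correctly, n00 both incorrectly, and so on."""
    return math.log((n11 * n00) / (n10 * n01))

# Hypothetical counts: both right, only item 1 right, only item 2 right, both wrong.
lor = log_odds_ratio(40, 10, 15, 35)
```

A one-dimensional model constrains how large this association can be for any item pair, which is what makes such bounds usable as a dimensionality check.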
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...
NASA Technical Reports Server (NTRS)
Collazo, Carlimar
2011-01-01
The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL based on protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction will be a computer program which analyzes the URL and distinguishes advertisement links from actual content links.
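In Python, for instance, the standard library's urllib.parse performs exactly this kind of breakdown (the sample URL is made up, and the domain extraction is a naive last-two-labels heuristic that ignores public-suffix rules):

```python
from urllib.parse import urlsplit, parse_qs

url = "https://www.example.com/ads/banner.html?campaign=42&size=728x90"
parts = urlsplit(url)

scheme = parts.scheme                               # the protocol, e.g. 'https'
host = parts.hostname                               # 'www.example.com'
domain = ".".join(host.split(".")[-2:])             # naive: 'example.com'
path = parts.path                                   # '/ads/banner.html'
query = parse_qs(parts.query)                       # {'campaign': ['42'], ...}
```

Grouping log entries by domain and path prefix (e.g. everything under an ad-serving path) is one simple way to separate advertisement fetches from content fetches during data reduction.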
Balloon logging with the inverted skyline
NASA Technical Reports Server (NTRS)
Mosher, C. F.
1975-01-01
There is a gap in aerial logging techniques that has to be filled. A simple, safe, sizeable system must be developed before aerial logging becomes effective and accepted in the logging industry. This paper presents such a system, designed on simple principles, with realistic cost and ecological benefits.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 5 2010-10-01 2010-10-01 false Station logs. 80.409 Section 80.409 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Station Documents § 80.409 Station logs. (a) General requirements. Logs must be established and properly maintained as follows:...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 5 2011-10-01 2011-10-01 false Station logs. 80.409 Section 80.409 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES STATIONS IN THE MARITIME SERVICES Station Documents § 80.409 Station logs. (a) General requirements. Logs must be established and properly maintained as follows:...
NASA Technical Reports Server (NTRS)
Hoffbeck, Joseph P.; Landgrebe, David A.
1994-01-01
Many analysis algorithms for high-dimensional remote sensing data require that the remotely sensed radiance spectra be transformed to approximate reflectance to allow comparison with a library of laboratory reflectance spectra. In maximum likelihood classification, however, the remotely sensed spectra are compared to training samples, thus a transformation to reflectance may or may not be helpful. The effect of several radiance-to-reflectance transformations on maximum likelihood classification accuracy is investigated in this paper. We show that the empirical line approach, LOWTRAN7, flat-field correction, single spectrum method, and internal average reflectance are all non-singular affine transformations, and that non-singular affine transformations have no effect on discriminant analysis feature extraction and maximum likelihood classification accuracy. (An affine transformation is a linear transformation with an optional offset.) Since the Atmosphere Removal Program (ATREM) and the log residue method are not affine transformations, experiments with Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data were conducted to determine the effect of these transformations on maximum likelihood classification accuracy. The average classification accuracy of the data transformed by ATREM and the log residue method was slightly less than the accuracy of the original radiance data. Since the radiance-to-reflectance transformations allow direct comparison of remotely sensed spectra with laboratory reflectance spectra, they can be quite useful in labeling the training samples required by maximum likelihood classification, but these transformations have only a slight effect or no effect at all on discriminant analysis and maximum likelihood classification accuracy.
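The invariance claim above can be checked numerically: refitting per-class Gaussian ML parameters after a non-singular affine transformation changes each class's log-likelihood by the same constant, so decisions are unchanged. The sketch below is a toy demonstration with synthetic 3-D "spectra", not the AVIRIS experiment.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic Gaussian classes in a 3-D radiance space, plus test spectra.
X0 = rng.normal([0.0, 0.0, 0.0], 1.0, (200, 3))
X1 = rng.normal([2.0, 1.0, -1.0], 1.5, (200, 3))
Xt = rng.normal([1.0, 0.5, -0.5], 1.2, (100, 3))

def ml_classify(train0, train1, test):
    """Gaussian maximum likelihood classification with per-class mean/covariance."""
    def loglik(X, t):
        mu = X.mean(0)
        S = np.cov(X, rowvar=False, bias=True)  # ML covariance estimate
        d = t - mu
        Sinv = np.linalg.inv(S)
        return -0.5 * (np.einsum('ij,jk,ik->i', d, Sinv, d)
                       + np.log(np.linalg.det(S)))
    return (loglik(train1, test) > loglik(train0, test)).astype(int)

labels = ml_classify(X0, X1, Xt)

# Non-singular affine transformation y = A x + b applied to all spectra.
A = np.array([[2.0, 0.3, 0.0], [0.0, 1.0, -0.5], [0.1, 0.0, 1.5]])
b = np.array([5.0, -2.0, 1.0])
labels_affine = ml_classify(X0 @ A.T + b, X1 @ A.T + b, Xt @ A.T + b)

print(np.array_equal(labels, labels_affine))
```

The Mahalanobis terms are exactly invariant, and the log-determinant shifts by 2 log|det A| for every class, so the comparison between classes is unaffected.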
Rayan, D Mark; Mohamad, Shariff Wan; Dorward, Leejiah; Aziz, Sheema Abdul; Clements, Gopalasamy Reuben; Christopher, Wong Chai Thiam; Traeholt, Carl; Magintan, David
2012-12-01
The endangered Asian tapir (Tapirus indicus) is threatened by large-scale habitat loss, forest fragmentation and increased hunting pressure. Conservation planning for this species, however, is hampered by a severe paucity of information on its ecology and population status. We present the first Asian tapir population density estimate from a camera trapping study targeting tigers in a selectively logged forest within Peninsular Malaysia using a spatially explicit capture-recapture maximum likelihood based framework. With a trap effort of 2496 nights, 17 individuals were identified corresponding to a density (standard error) estimate of 9.49 (2.55) adult tapirs/100 km². Although our results include several caveats, we believe that our density estimate still serves as an important baseline to facilitate the monitoring of tapir population trends in Peninsular Malaysia. Our study also highlights the potential of extracting vital ecological and population information for other cryptic individually identifiable animals from tiger-centric studies, especially with the use of a spatially explicit capture-recapture maximum likelihood based framework. PMID:23253368
Maximum-likelihood methods in wavefront sensing: stochastic models and likelihood functions
Barrett, Harrison H.; Dainty, Christopher; Lara, David
2008-01-01
Maximum-likelihood (ML) estimation in wavefront sensing requires careful attention to all noise sources and all factors that influence the sensor data. We present detailed probability density functions for the output of the image detector in a wavefront sensor, conditional not only on wavefront parameters but also on various nuisance parameters. Practical ways of dealing with nuisance parameters are described, and final expressions for likelihoods and Fisher information matrices are derived. The theory is illustrated by discussing Shack–Hartmann sensors, and computational requirements are discussed. Simulation results show that ML estimation can significantly increase the dynamic range of a Shack–Hartmann sensor with four detectors and that it can reduce the residual wavefront error when compared with traditional methods. PMID:17206255
Williamson, Ross S.; Sahani, Maneesh; Pillow, Jonathan W.
2015-01-01
Stimulus dimensionality-reduction methods in neuroscience seek to identify a low-dimensional space of stimulus features that affect a neuron’s probability of spiking. One popular method, known as maximally informative dimensions (MID), uses an information-theoretic quantity known as “single-spike information” to identify this space. Here we examine MID from a model-based perspective. We show that MID is a maximum-likelihood estimator for the parameters of a linear-nonlinear-Poisson (LNP) model, and that the empirical single-spike information corresponds to the normalized log-likelihood under a Poisson model. This equivalence implies that MID does not necessarily find maximally informative stimulus dimensions when spiking is not well described as Poisson. We provide several examples to illustrate this shortcoming, and derive a lower bound on the information lost when spiking is Bernoulli in discrete time bins. To overcome this limitation, we introduce model-based dimensionality reduction methods for neurons with non-Poisson firing statistics, and show that they can be framed equivalently in likelihood-based or information-theoretic terms. Finally, we show how to overcome practical limitations on the number of stimulus dimensions that MID can estimate by constraining the form of the non-parametric nonlinearity in an LNP model. We illustrate these methods with simulations and data from primate visual cortex. PMID:25831448
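The Poisson-likelihood view described above can be illustrated with a toy linear-nonlinear-Poisson fit. This sketch assumes an exponential nonlinearity and fits the filter by gradient ascent on the (concave) Poisson log-likelihood; it is a generic Poisson GLM illustration, not the MID estimator itself.

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated LNP neuron: spike counts are Poisson with rate exp(w . x).
X = rng.normal(0.0, 1.0, (500, 4))           # stimuli
w_true = np.array([0.5, -0.3, 0.2, 0.0])     # true linear filter
y = rng.poisson(np.exp(X @ w_true))          # spike counts

def loglik(w):
    """Poisson log-likelihood of the LNP model (y! constant dropped)."""
    eta = X @ w
    return np.sum(y * eta - np.exp(eta))

# Gradient ascent; the gradient of the Poisson log-likelihood is X^T (y - rate).
w = np.zeros(4)
for _ in range(2000):
    rate = np.exp(X @ w)
    w += 0.1 * X.T @ (y - rate) / len(y)

print(w.round(2))  # close to w_true
```

Maximizing this log-likelihood is the model-based counterpart of maximizing single-spike information when spiking really is Poisson.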
Accuracy of maximum likelihood estimates of a two-state model in single-molecule FRET
Gopich, Irina V.
2015-01-21
Photon sequences from single-molecule Förster resonance energy transfer (FRET) experiments can be analyzed using a maximum likelihood method. Parameters of the underlying kinetic model (FRET efficiencies of the states and transition rates between conformational states) are obtained by maximizing the appropriate likelihood function. In addition, the errors (uncertainties) of the extracted parameters can be obtained from the curvature of the likelihood function at the maximum. We study the standard deviations of the parameters of a two-state model obtained from photon sequences with recorded colors and arrival times. The standard deviations can be obtained analytically in a special case when the FRET efficiencies of the states are 0 and 1 and in the limiting cases of fast and slow conformational dynamics. These results are compared with the results of numerical simulations. The accuracy and, therefore, the ability to predict model parameters depend on how fast the transition rates are compared to the photon count rate. In the limit of slow transitions, the key parameters that determine the accuracy are the number of transitions between the states and the number of independent photon sequences. In the fast transition limit, the accuracy is determined by the small fraction of photons that are correlated with their neighbors. The relative standard deviation of the relaxation rate has a “chevron” shape as a function of the transition rate in the log-log scale. The location of the minimum of this function dramatically depends on how well the FRET efficiencies of the states are separated.
Logging while fishing technique results in substantial savings
Tollefsen, E.; Everett, M.
1996-12-01
During wireline logging operations, tools occasionally become stuck in the borehole and require fishing. A typical fishing job can take anywhere from 1.5 to 4 days. In the Gulf of Mexico, a fishing job can easily cost between $100,000 and $500,000. These costs result from nonproductive time during the fishing trip, the associated wiper trip and relogging the well. Logging while fishing (LWF) technology is a patented system capable of retrieving a stuck fish and completing the logging run during the same pipe descent. Completing logging operations using the LWF method saves time and money. The technique also provides well information where data may not otherwise have been obtained. Other benefits include reduced fishing time and an increased level of safety.
The logN-logS relationship of normal X-ray emitting galaxies
NASA Astrophysics Data System (ADS)
Tajer, M.; Trinchieri, G.; Wolter, A.; Campana, S.; Moretti, A.; Tagliaferri, G.
We have selected a flux limited serendipitous sample of galaxies from the cross-correlation of the BMW (Brera Multiscale Wavelet) ROSAT HRI and the LEDA (Lyon - Meudon Extragalactic Database) Catalogues. This sample is used to study the X-ray properties of normal galaxies in the local universe. We also find that the logN-logS distribution we derived for a serendipitous subsample, optically and X-ray flux limited, is consistent with the Euclidean slope in the flux range F_X(0.5-2) ~ 1.1-110 × 10^-14 erg cm^-2 s^-1. We further show that the same law is valid over 4 decades, from the bright sample derived from the RASS data to the very faint detections in deep XMM-Newton fields.
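The Euclidean slope quoted above follows from simple geometry: standard candles distributed uniformly in Euclidean space have flux S ∝ r⁻² and cumulative counts N(>S) ∝ r³, hence N(>S) ∝ S^(-3/2). A minimal simulation (a generic illustration, not the BMW-LEDA analysis) recovers the slope:

```python
import numpy as np

rng = np.random.default_rng(2)
# Uniform standard candles in the unit sphere: N(>S) should scale as S^(-3/2).
n = 100_000
r = rng.random(n) ** (1 / 3)      # radii uniform in volume
S = r ** -2.0                     # fluxes, all >= 1 (complete sample)

grid = np.logspace(0.3, 2.0, 20)  # fit well above the faint-end cutoff
N = np.array([(S > s).sum() for s in grid])
slope = np.polyfit(np.log10(grid), np.log10(N), 1)[0]
print(round(slope, 2))  # close to -1.5, the Euclidean value
```

Departures from -3/2 in real surveys signal evolution, cosmology, or incompleteness.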
Maximum likelihood versus likelihood-free quantum system identification in the atom maser
NASA Astrophysics Data System (ADS)
Catana, Catalin; Kypraios, Theodore; Guţă, Mădălin
2014-10-01
We consider the problem of estimating a dynamical parameter of a Markovian quantum open system (the atom maser), by performing continuous time measurements in the system's output (outgoing atoms). Two estimation methods are investigated and compared. Firstly, the maximum likelihood estimator (MLE) takes into account the full measurement data and is asymptotically optimal in terms of its mean square error. Secondly, the ‘likelihood-free’ method of approximate Bayesian computation (ABC) produces an approximation of the posterior distribution for a given set of summary statistics, by sampling trajectories at different parameter values and comparing them with the measurement data via chosen statistics. Building on previous results which showed that atom counts are poor statistics for certain values of the Rabi angle, we apply MLE to the full measurement data and estimate its Fisher information. We then select several correlation statistics such as waiting times, distribution of successive identical detections, and use them as input of the ABC algorithm. The resulting posterior distribution follows closely the data likelihood, showing that the selected statistics capture ‘most’ statistical information about the Rabi angle.
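The ABC idea used above can be reduced to a few lines: draw parameters from the prior, simulate data, and keep draws whose summary statistic lands near the observed one. The sketch below is a generic rejection-ABC toy (Gaussian mean, sample-mean summary), not the atom-maser model.

```python
import numpy as np

rng = np.random.default_rng(3)
# Observed data from a Gaussian with unknown mean; summary = sample mean.
data = rng.normal(1.7, 1.0, 100)
s_obs = data.mean()

# ABC rejection: keep theta when the simulated summary is within eps of s_obs.
eps, accepted = 0.05, []
for _ in range(20_000):
    theta = rng.uniform(-5, 5)                  # flat prior
    s_sim = rng.normal(theta, 1.0, 100).mean()  # simulate at theta
    if abs(s_sim - s_obs) < eps:
        accepted.append(theta)

posterior_mean = np.mean(accepted)
print(len(accepted), round(posterior_mean, 2))
```

The quality of the ABC posterior hinges on the summary statistic, which is exactly why the paper compares atom counts against correlation statistics such as waiting times.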
Dimension-independent likelihood-informed MCMC
Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.
2015-10-08
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems in order to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
The maximum likelihood dating of magnetostratigraphic sections
NASA Astrophysics Data System (ADS)
Man, Otakar
2011-04-01
In general, stratigraphic sections are dated by biostratigraphy and magnetic polarity stratigraphy (MPS) is subsequently used to improve the dating of specific section horizons or to correlate these horizons in different sections of similar age. This paper shows, however, that the identification of a record of a sufficient number of geomagnetic polarity reversals against a reference scale often does not require any complementary information. The deposition and possible subsequent erosion of the section is herein regarded as a stochastic process, whose discrete time increments are independent and normally distributed. This model enables the expression of the time dependence of the magnetic record of section increments in terms of probability. To date samples bracketing the geomagnetic polarity reversal horizons, their levels are combined with various sequences of successive polarity reversals drawn from the reference scale. Each particular combination gives rise to specific constraints on the unknown ages of the primary remanent magnetization of samples. The problem is solved by the constrained maximization of the likelihood function with respect to these ages and parameters of the model, and by subsequent maximization of this function over the set of possible combinations. A statistical test of the significance of this solution is given. The application of this algorithm to various published magnetostratigraphic sections that included nine or more polarity reversals gave satisfactory results. This possible self-sufficiency makes MPS less dependent on other dating techniques.
Targeted maximum likelihood estimation in safety analysis
Lendle, Samuel D.; Fireman, Bruce; van der Laan, Mark J.
2013-01-01
Objectives To compare the performance of a targeted maximum likelihood estimator (TMLE) and a collaborative TMLE (CTMLE) to other estimators in a drug safety analysis, including a regression-based estimator, propensity score (PS)–based estimators, and an alternate doubly robust (DR) estimator in a real example and simulations. Study Design and Setting The real data set is a subset of observational data from Kaiser Permanente Northern California formatted for use in active drug safety surveillance. Both the real and simulated data sets include potential confounders, a treatment variable indicating use of one of two antidiabetic treatments and an outcome variable indicating occurrence of an acute myocardial infarction (AMI). Results In the real data example, there is no difference in AMI rates between treatments. In simulations, the double robustness property is demonstrated: DR estimators are consistent if either the initial outcome regression or PS estimator is consistent, whereas other estimators are inconsistent if the initial estimator is not consistent. In simulations with near-positivity violations, CTMLE performs well relative to other estimators by adaptively estimating the PS. Conclusion Each of the DR estimators was consistent, and TMLE and CTMLE had the smallest mean squared error in simulations. PMID:23849159
Dimension-independent likelihood-informed MCMC
NASA Astrophysics Data System (ADS)
Cui, Tiangang; Law, Kody J. H.; Marzouk, Youssef M.
2016-01-01
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. This work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. Two distinct lines of research intersect in the methods developed here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Two nonlinear inverse problems are used to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
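The dimension-independence property can be illustrated with the preconditioned Crank-Nicolson (pCN) sampler, the prior-preserving baseline that function-space methods like DILI build on. The sketch below is that simpler baseline, not the DILI sampler: a Gaussian prior in d = 100 dimensions with a likelihood that observes only the first coefficient (an assumed toy setup).

```python
import numpy as np

rng = np.random.default_rng(4)
d, beta, sigma2, y = 100, 0.2, 1.0, 2.0

def neg_loglik(u):
    # Toy likelihood: observe only the first coefficient of the function.
    return (u[0] - y) ** 2 / (2 * sigma2)

u = np.zeros(d)
samples, accepts = [], 0
for _ in range(20_000):
    # pCN proposal preserves the N(0, I) prior, so the acceptance ratio
    # involves only the likelihood, independent of the dimension d.
    v = np.sqrt(1 - beta**2) * u + beta * rng.normal(size=d)
    if rng.random() < np.exp(neg_loglik(u) - neg_loglik(v)):
        u, accepts = v, accepts + 1
    samples.append(u[0])

post_mean = np.mean(samples[5000:])
print(round(post_mean, 2), accepts / 20_000)  # analytic posterior mean: y/(1+sigma2) = 1.0
```

DILI improves on this baseline by weighting the proposal with local Hessian information in the likelihood-informed subspace.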
Multiscale likelihood analysis and image reconstruction
NASA Astrophysics Data System (ADS)
Willett, Rebecca M.; Nowak, Robert D.
2003-11-01
The nonparametric multiscale polynomial and platelet methods presented here are powerful new tools for signal and image denoising and reconstruction. Unlike traditional wavelet-based multiscale methods, these methods are both well suited to processing Poisson or multinomial data and capable of preserving image edges. At the heart of these new methods lie multiscale signal decompositions based on polynomials in one dimension and multiscale image decompositions based on what the authors call platelets in two dimensions. Platelets are localized functions at various positions, scales and orientations that can produce highly accurate, piecewise linear approximations to images consisting of smooth regions separated by smooth boundaries. Polynomial and platelet-based maximum penalized likelihood methods for signal and image analysis are both tractable and computationally efficient. Polynomial methods offer near minimax convergence rates for broad classes of functions including Besov spaces. Upper bounds on the estimation error are derived using an information-theoretic risk bound based on squared Hellinger loss. Simulations establish the practical effectiveness of these methods in applications such as density estimation, medical imaging, and astronomy.
Disequilibrium mapping: Composite likelihood for pairwise disequilibrium
Devlin, B.; Roeder, K.; Risch, N.
1996-08-15
The pattern of linkage disequilibrium between a disease locus and a set of marker loci has been shown to be a useful tool for geneticists searching for disease genes. Several methods have been advanced to utilize the pairwise disequilibrium between the disease locus and each of a set of marker loci. However, none of the methods take into account the information from all pairs simultaneously while also modeling the variability in the disequilibrium values due to the evolutionary dynamics of the population. We propose a composite likelihood (CL) model that has these features when the physical distances between the marker loci are known or can be approximated. In this instance, and assuming that there is a single disease mutation, the CL model depends on only three parameters: the recombination fraction between the disease locus and an arbitrary marker locus, θ; the age of the mutation; and a variance parameter. When the CL is maximized over a grid of θ, it provides a graph that can direct the search for the disease locus. We also show how the CL model can be generalized to account for multiple disease mutations. Evolutionary simulations demonstrate the power of the analyses, as well as their potential weaknesses. Finally, we analyze the data from two mapped diseases, cystic fibrosis and diastrophic dysplasia, finding that the CL method performs well in both cases.
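The composite likelihood principle itself is simple: sum low-dimensional (here pairwise) log-likelihoods instead of evaluating the full joint likelihood. The sketch below demonstrates it on a generic statistical example, estimating a common correlation from all variable pairs of equicorrelated normals; it is not the population-genetic model of the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
# Equicorrelated 4-variate normals with unit variances and correlation rho.
rho_true, d, n = 0.5, 4, 2000
g = rng.normal(size=(n, 1))
X = np.sqrt(rho_true) * g + np.sqrt(1 - rho_true) * rng.normal(size=(n, d))

def pair_loglik(x, yv, rho):
    """Bivariate standard-normal log-likelihood, additive constants dropped."""
    q = (x**2 - 2 * rho * x * yv + yv**2) / (1 - rho**2)
    return np.sum(-0.5 * np.log(1 - rho**2) - 0.5 * q)

def composite_loglik(rho):
    # Sum the pairwise log-likelihoods over all d*(d-1)/2 variable pairs.
    return sum(pair_loglik(X[:, i], X[:, j], rho)
               for i in range(d) for j in range(i + 1, d))

grid = np.linspace(-0.9, 0.95, 400)
rho_hat = grid[np.argmax([composite_loglik(r) for r in grid])]
print(round(rho_hat, 2))  # close to rho_true
```

As in the paper, maximizing over a grid of the shared parameter yields a curve whose peak directs the search.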
Quantifying the likelihood of a continued hiatus in global warming
NASA Astrophysics Data System (ADS)
Roberts, C. D.; Palmer, M. D.; McNeall, D.; Collins, M.
2015-04-01
Since the end of the twentieth century, global mean surface temperature has not risen as rapidly as predicted by global climate models (GCMs). This discrepancy has become known as the global warming `hiatus’ and a variety of mechanisms have been proposed to explain the observed slowdown in warming. Focusing on internally generated variability, we use pre-industrial control simulations from an observationally constrained ensemble of GCMs and a statistical approach to evaluate the expected frequency and characteristics of variability-driven hiatus periods and their likelihood of future continuation. Given an expected forced warming trend of ~0.2 K per decade, our constrained ensemble of GCMs implies that the probability of a variability-driven 10-year hiatus is ~10%, but less than 1% for a 20-year hiatus. Although the absolute probability of a 20-year hiatus is small, the probability that an existing 15-year hiatus will continue another five years is much higher (up to 25%). Therefore, given the recognized contribution of internal climate variability to the reduced rate of global warming during the past 15 years, we should not be surprised if the current hiatus continues until the end of the decade. Following the termination of a variability-driven hiatus, we also show that there is an increased likelihood of accelerated global warming associated with release of heat from the sub-surface ocean and a reversal of the phase of decadal variability in the Pacific Ocean.
Likelihood of attending to the color word modulates Stroop interference.
Cho, Yang Seok; Choi, Jong Moon; Proctor, Robert W
2012-02-01
Three experiments investigated whether the Stroop color-naming effect is modulated by the likelihood of a color word capturing visual attention. In Experiment 1, a bar or a neutral word was presented at fixation as a color carrier, along with a color word randomly appearing in either an achromatic color (white in the main experiment, gray in a follow-up) or purple. Reduction of the Stroop effect (known as Stroop dilution) occurred when the color word was achromatic but not (or to a lesser extent) when it was in purple. In Experiment 2, the color of the color word remained constant throughout trial blocks, and Stroop dilution was equally evident when the word was always in purple and when it was always in white. In Experiment 3, a color bar was presented as the color carrier with both a color word and a neutral word. In this case, the Stroop effect was larger when the color word appeared in purple, and smaller when the neutral word appeared in purple, than when neither word did. These results imply that the extent to which processing of a color word occurs is determined by the likelihood of the word capturing attention. PMID:22147533
CMBFIT: Rapid WMAP likelihood calculations with normal parameters
NASA Astrophysics Data System (ADS)
Sandvik, Håvard B.; Tegmark, Max; Wang, Xiaomin; Zaldarriaga, Matias
2004-03-01
We present a method for ultrafast confrontation of the Wilkinson Microwave Anisotropy Probe (WMAP) cosmic microwave background observations with theoretical models, implemented as a publicly available software package called CMBFIT, useful for anyone wishing to measure cosmological parameters by combining WMAP with other observations. The method takes advantage of the underlying physics by transforming into a set of parameters where the WMAP likelihood surface is accurately fit by the exponential of a quartic or sextic polynomial. Building on previous physics-based approximations by Hu et al., Kosowsky et al., and Chu et al., it combines their speed with precision-cosmology-grade accuracy. A FORTRAN code for computing the WMAP likelihood for a given set of parameters is provided, precalibrated against CMBFAST, accurate to Δln L ≈ 0.05 over the entire 2σ region of the parameter space for 6-parameter "vanilla" ΛCDM models. We also provide 7-parameter fits including spatial curvature, gravitational waves and a running spectral index.
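The core idea, fitting a smooth log-likelihood surface with a low-order polynomial after a good reparametrization, is easy to demonstrate in one dimension. The sketch below uses an assumed stand-in ln L (a smooth, asymmetric function), not the WMAP likelihood, and checks that a quartic fit reaches the Δln L ≈ 0.05 accuracy quoted above.

```python
import numpy as np

# Skewed 1-D log-likelihood surface (a hypothetical stand-in for illustration).
theta = np.linspace(-2.0, 2.0, 200)
lnL = -0.5 * np.exp(theta)

# Fit ln L by a quartic polynomial and check the worst-case residual.
coeffs = np.polyfit(theta, lnL, deg=4)
resid = lnL - np.polyval(coeffs, theta)
print(round(np.abs(resid).max(), 3))  # max |Delta ln L| well under 0.05
```

Once the coefficients are tabulated, each likelihood evaluation costs only a polynomial evaluation, which is what makes the confrontation "ultrafast".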
Maximum-likelihood approach to strain imaging using ultrasound
Insana, M. F.; Cook, L. T.; Bilgen, M.; Chaturvedi, P.; Zhu, Y.
2009-01-01
A maximum-likelihood (ML) strategy for strain estimation is presented as a framework for designing and evaluating bioelasticity imaging systems. Concepts from continuum mechanics, signal analysis, and acoustic scattering are combined to develop a mathematical model of the ultrasonic waveforms used to form strain images. The model includes three-dimensional (3-D) object motion described by affine transformations, Rayleigh scattering from random media, and 3-D system response functions. The likelihood function for these waveforms is derived to express the Fisher information matrix and variance bounds for displacement and strain estimation. The ML estimator is a generalized cross correlator for pre- and post-compression echo waveforms that is realized by waveform warping and filtering prior to cross correlation and peak detection. Experiments involving soft tissue-like media show the ML estimator approaches the Cramér–Rao error bound for small scaling deformations: at 5 MHz and 1.2% compression, the predicted lower bound for displacement errors is 4.4 µm and the measured standard deviation is 5.7 µm. PMID:10738797
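The final stage of the estimator described above, peak detection on the cross-correlation of pre- and post-compression echoes, can be sketched in a few lines. This toy omits the warping and filtering steps (which handle scaling deformations) and recovers a pure integer-sample delay.

```python
import numpy as np

rng = np.random.default_rng(6)
# Pre-compression echo and a copy delayed by 7 samples (the "displacement").
pre = rng.normal(size=256)
shift = 7
post = np.zeros_like(pre)
post[shift:] = pre[:-shift]

# Cross-correlate and convert the peak index to a signed lag.
c = np.correlate(post, pre, mode="full")
lag = int(np.argmax(c)) - (len(pre) - 1)
print(lag)  # recovers the 7-sample displacement
```

The full ML estimator generalizes this by warping and filtering the waveforms first, so that the correlator remains near-optimal under scaling deformations rather than pure shifts.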
A Universal Logging System for LHCb Online
NASA Astrophysics Data System (ADS)
Nikolaidis, Fotis; Brarda, Loic; Garnier, Jean-Christophe; Neufeld, Niko
2011-12-01
A log is a recording of a system's activity, aimed at helping a system administrator trace back an attack, find the causes of a malfunction, and troubleshoot in general. The fact that logs are often the only information an administrator has about an incident makes the logging system a crucial part of an IT infrastructure. In large-scale infrastructures such as LHCb Online, where quite a few GB of logs are produced daily, it is impossible for a human to review all of these logs, and a great percentage of them is just "noise". Clearly, a more automated and sophisticated approach is needed. In this paper, we present a low-cost centralized logging system which allows us to do in-depth analysis of every log.
Planck 2013 results. XV. CMB power spectra and likelihood
NASA Astrophysics Data System (ADS)
Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Armitage-Caplan, C.; Arnaud, M.; Ashdown, M.; Atrio-Barandela, F.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bobin, J.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bridges, M.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Chiang, L.-Y.; Christensen, P. R.; Church, S.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Dunkley, J.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Gaier, T. C.; Galeotta, S.; Galli, S.; Ganga, K.; Giard, M.; Giardino, G.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D.; Helou, G.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jewell, J.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Knox, L.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lähteenmäki, A.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Laureijs, R. J.; Lawrence, C. R.; Le Jeune, M.; Leach, S.; Leahy, J. P.; Leonardi, R.; León-Tavares, J.; Lesgourgues, J.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.;
Macías-Pérez, J. F.; Maffei, B.; Maino, D.; Mandolesi, N.; Marinucci, D.; Maris, M.; Marshall, D. J.; Martin, P. G.; Martínez-González, E.; Masi, S.; Massardi, M.; Matarrese, S.; Matthai, F.; Mazzotta, P.; Meinhold, P. R.; Melchiorri, A.; Mendes, L.; Menegoni, E.; Mennella, A.; Migliaccio, M.; Millea, M.; Mitra, S.; Miville-Deschênes, M.-A.; Molinari, D.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; O'Dwyer, I. J.; Orieux, F.; Osborne, S.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Partridge, B.; Pasian, F.; Patanchon, G.; Paykari, P.; Perdereau, O.; Perotto, L.; Perrotta, F.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Popa, L.; Poutanen, T.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rahlin, A.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Ricciardi, S.; Riller, T.; Ringeval, C.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Roudier, G.; Rowan-Robinson, M.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Sanselme, L.; Santos, D.; Savini, G.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Starck, J.-L.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sureau, F.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Türler, M.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Varis, J.; Vielva, P.; Villa, F.; Vittorio, N.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; White, M.; White, S. D. M.; Yvon, D.; Zacchei, A.; Zonca, A.
2014-11-01
This paper presents the Planck 2013 likelihood, a complete statistical description of the two-point correlation function of the CMB temperature fluctuations that accounts for all known relevant uncertainties, both instrumental and astrophysical in nature. We use this likelihood to derive our best estimate of the CMB angular power spectrum from Planck over three decades in multipole moment, ℓ, covering 2 ≤ ℓ ≤ 2500. The main source of uncertainty at ℓ ≲ 1500 is cosmic variance. Uncertainties in small-scale foreground modelling and instrumental noise dominate the error budget at higher ℓs. For ℓ < 50, our likelihood exploits all Planck frequency channels from 30 to 353 GHz, separating the cosmological CMB signal from diffuse Galactic foregrounds through a physically motivated Bayesian component separation technique. At ℓ ≥ 50, we employ a correlated Gaussian likelihood approximation based on a fine-grained set of angular cross-spectra derived from multiple detector combinations between the 100, 143, and 217 GHz frequency channels, marginalising over power spectrum foreground templates. We validate our likelihood through an extensive suite of consistency tests, and assess the impact of residual foreground and instrumental uncertainties on the final cosmological parameters. We find good internal agreement among the high-ℓ cross-spectra with residuals below a few μK2 at ℓ ≲ 1000, in agreement with estimated calibration uncertainties. We compare our results with foreground-cleaned CMB maps derived from all Planck frequencies, as well as with cross-spectra derived from the 70 GHz Planck map, and find broad agreement in terms of spectrum residuals and cosmological parameters. We further show that the best-fit ΛCDM cosmology is in excellent agreement with preliminary PlanckEE and TE polarisation spectra. We find that the standard ΛCDM cosmology is well constrained by Planck from the measurements at ℓ ≲ 1500. One specific example is the
Maximum Likelihood Bayesian Averaging of Spatial Variability Models in Unsaturated Fractured Tuff
Ye, Ming; Neuman, Shlomo P.; Meyer, Philip D.
2004-05-25
Hydrologic analyses typically rely on a single conceptual-mathematical model. Yet hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Adopting only one of these may lead to statistical bias and underestimation of uncertainty. Bayesian Model Averaging (BMA) provides an optimal way to combine the predictions of several competing models and to assess their joint predictive uncertainty. However, it tends to be computationally demanding and relies heavily on prior information about model parameters. We apply a maximum likelihood (ML) version of BMA (MLBMA) to seven alternative variogram models of log air permeability data from single-hole pneumatic injection tests in six boreholes at the Apache Leap Research Site (ALRS) in central Arizona. Unbiased ML estimates of variogram and drift parameters are obtained using Adjoint State Maximum Likelihood Cross Validation in conjunction with Universal Kriging and Generalized Least Squares. Standard information criteria provide an ambiguous ranking of the models, which does not justify selecting one of them and discarding all others as is commonly done in practice. Instead, we eliminate some of the models based on their negligibly small posterior probabilities and use the rest to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. We then average these four projections, and associated kriging variances, using the posterior probability of each model as weight. Finally, we cross-validate the results by eliminating from consideration all data from one borehole at a time, repeating the above process, and comparing the predictive capability of MLBMA with that of each individual model. We find that MLBMA is superior to any individual geostatistical model of log permeability among those we consider at the ALRS.
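The MLBMA combination step (averaging the competing models' predictions and variances, weighted by posterior model probabilities) can be sketched numerically. The numbers below are invented for illustration, not ALRS permeability data; the variance formula is the standard within-plus-between BMA decomposition.

```python
import numpy as np

# Minimal sketch of Bayesian Model Averaging of kriging predictions.
# Values are illustrative, not the ALRS log-permeability data.
post = np.array([0.45, 0.30, 0.15, 0.10])      # posterior model probabilities (sum to 1)
mu   = np.array([-13.1, -12.8, -13.4, -12.9])  # per-model kriged prediction at one point
var  = np.array([0.20, 0.35, 0.25, 0.40])      # per-model kriging variance

mu_bma = np.sum(post * mu)                     # BMA mean: probability-weighted average
# BMA variance adds the between-model spread to the within-model variances
var_bma = np.sum(post * (var + (mu - mu_bma) ** 2))
```

Because the between-model term is non-negative, the BMA variance never understates the average within-model uncertainty, which is the point of combining models rather than selecting one.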
Two-stage recovery of amphibian assemblages following selective logging of tropical forests.
Adum, Gilbert Baase; Eichhorn, Markus Peter; Oduro, William; Ofori-Boateng, Caleb; Rödel, Mark-Oliver
2013-04-01
There is a lack of quantitative information on the effectiveness of selective-logging practices in ameliorating effects of logging on faunal communities. We conducted a large-scale replicated field study in 3 selectively logged moist semideciduous forests in West Africa at varying times after timber extraction to assess post-logging effects on amphibian assemblages. Specifically, we assessed whether the diversity, abundance, and assemblage composition of amphibians changed over time for forest-dependent species and those tolerant of forest disturbance. In 2009, we sampled amphibians in 3 forests (total of 48 study plots, each 2 ha) in southwestern Ghana. In each forest, we established plots in undisturbed forest, recently logged forest, and forest logged 10 and 20 years previously. Logging intensity was constant across sites with 3 trees/ha removed. Recently logged forests supported substantially more species than unlogged forests, due to an influx of disturbance-tolerant species after logging. Simultaneously, Simpson's index decreased, reflecting increased dominance of a few species. As time since logging increased, richness of disturbance-tolerant species decreased until, 10 years after logging, their composition was indistinguishable from that of unlogged forests. Simpson's index increased with time since logging and was indistinguishable from unlogged forest 20 years after logging. Forest specialists decreased after logging and recovered slowly. However, after 20 years amphibian assemblages had returned to a state indistinguishable from that of undisturbed forest in both abundance and composition. These results demonstrate that even with low-intensity logging (≤3 trees/ha) a minimum 20-year rotation of logging is required for effective conservation of amphibian assemblages in moist semideciduous forests. Furthermore, remnant patches of intact forests retained in the landscape and the presence of permanent brooks may aid in the effective recovery of amphibian
Leak checker data logging system
Gannon, J.C.; Payne, J.J.
1996-09-03
A portable, high-speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high-energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time. 18 figs.
NASA Astrophysics Data System (ADS)
Fei, Bennie; Eloff, Jan; Olivier, Martin; Venter, Hein
Network forensics involves capturing, recording and analysing network audit trails. A crucial part of network forensics is to gather evidence at the server level, proxy level and from other sources. A web proxy relays URL requests from clients to a server. Analysing web proxy logs can give unobtrusive insights into the browsing behaviour of computer users and provide an overview of the Internet usage in an organisation. More importantly, in terms of network forensics, it can aid in detecting anomalous browsing behaviour. This paper demonstrates the use of a self-organising map (SOM), a powerful data mining technique, in network forensics. In particular, it focuses on how a SOM can be used to analyse data gathered at the web proxy level.
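A toy sketch of the SOM technique on proxy-log-derived features: each user becomes a feature vector, a small grid of weight vectors is trained competitively, and users mapping to unusual units stand out. The features (request rate, transfer size, distinct hosts), map size and training schedule here are invented assumptions, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-user features: (requests/hour, kB/request, distinct hosts).
normal  = rng.normal([10.0, 50.0, 5.0],  [2.0, 10.0, 1.0], size=(40, 3))
anomaly = rng.normal([200.0, 5.0, 80.0], [20.0, 1.0, 5.0], size=(5, 3))
X = np.vstack([normal, anomaly])
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardise features

# 4x4 self-organising map: one weight vector per grid node.
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
W = rng.normal(size=(16, 3))

for t in range(2000):
    lr    = 0.5 * (1 - t / 2000)                  # decaying learning rate
    sigma = 2.0 * (1 - t / 2000) + 0.5            # decaying neighbourhood width
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
    d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)    # squared grid distance to BMU
    h = np.exp(-d2 / (2 * sigma ** 2))            # Gaussian neighbourhood function
    W += lr * h[:, None] * (x - W)                # pull neighbourhood toward x

# Map every user to its best-matching unit; anomalous browsing profiles
# tend to land on units far from the bulk of normal users.
bmus = np.argmin(((X[:, None, :] - W[None]) ** 2).sum(axis=2), axis=1)
```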
Leak checker data logging system
Gannon, Jeffrey C.; Payne, John J.
1996-01-01
A portable, high-speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high-energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time.
Leak checker data logging system
Payne, J.J.; Gannon, J.C.
1994-12-31
A portable, high-speed, computer-based data logging system for field testing systems or components located some distance apart employs a plurality of spaced mass spectrometers and is particularly adapted for monitoring the vacuum integrity of a long string of superconducting magnets such as used in high-energy particle accelerators. The system provides precise tracking of a gas such as helium through the magnet string when the helium is released into the vacuum by monitoring the spaced mass spectrometers, allowing for control, display and storage of various parameters involved with leak detection and localization. A system user can observe the flow of helium through the magnet string on a real-time basis from the exact moment of opening of the helium input valve. Graph readings can be normalized to compensate for magnet sections that deplete vacuum faster than other sections between tests, permitting repetitive testing of vacuum integrity in reduced time.
NASA Astrophysics Data System (ADS)
2012-07-01
WE RECOMMEND
Data logger Fourier NOVA LINK: data logging and analysis
To Engineer is Human: engineering essays and insights
Soap, Science, & Flat-Screen TVs: people, politics, business and science overlap
uLog sensors and sensor adapter: a new addition to the LogIT range offers simplicity and ease of use
WORTH A LOOK
Imagined Worlds: socio-scientific predictions for the future
Mini light data logger and mini temperature data logger: small-scale equipment for schools
SensorLab Plus: LogIT's supporting software, with extra features
HANDLE WITH CARE
CAXE110P PICAXE-18M2 data logger: data logger 'on view' but disappoints
Engineering: A Very Short Introduction: a broad-brush treatment fails to satisfy
WEB WATCH
Two very different websites for students: advanced physics questions answered and a more general BBC science resource
Interactive, Collaborative, Electronic Learning Logs in the Physics Classroom
NASA Astrophysics Data System (ADS)
Gosling, Chris
2006-12-01
I describe my experiences using Hickman's Interactive Collaborative Electronic (ICE) Learning Logs in teaching high school physics. ICE Learning Logs are written in student groups to answer questions posed by the instructor, who in turn responds to each group's entry before the next class. These logs were used with non-physics majors in both algebra- and calculus-based introductory physics courses, and also at the high school level. I found ICE Learning Logs to be a clear improvement over traditional student journals. Excerpts from group entries will be presented to demonstrate the group identities that formed as well as the utility of the journals to probe for conceptual understanding. In addition, the ICE Learning Logs served as an excellent resource for students to review before exams and to examine for critical moments to reflect on in formal essays. Hickman, P. (2000). Assessing student understanding with interactive-electronic-collaborative learning logs. ENC Focus, 7(2), 24-27. Sponsored by the National Science Foundation DUE0302097 and SUNY-Buffalo State Physics
Four Decades of Forest Persistence, Clearance and Logging on Borneo
Gaveau, David L. A.; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K.; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik
2014-01-01
The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km2) of Borneo's area (737,188 km2) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km2 (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan with 39.5% and 30.7% of their total forest area in 1973 becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4%, and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km2, representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km−2, and the lowest density in Brunei, at 0.18 km km−2. Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km2 of 1973 forest cover has been logged. Some 389,566 km2 (52.8%) of the island remains forested, of which 209,649 km2 remains intact; there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo. PMID:25029192
Four decades of forest persistence, clearance and logging on Borneo.
Gaveau, David L A; Sloan, Sean; Molidena, Elis; Yaen, Husna; Sheil, Doug; Abram, Nicola K; Ancrenaz, Marc; Nasi, Robert; Quinones, Marcela; Wielaard, Niels; Meijaard, Erik
2014-01-01
The native forests of Borneo have been impacted by selective logging, fire, and conversion to plantations at unprecedented scales since industrial-scale extractive industries began in the early 1970s. There is no island-wide documentation of forest clearance or logging since the 1970s. This creates an information gap for conservation planning, especially with regard to selectively logged forests that maintain high conservation potential. Analysing LANDSAT images, we estimate that 75.7% (558,060 km2) of Borneo's area (737,188 km2) was forested around 1973. Based upon a forest cover map for 2010 derived using ALOS-PALSAR and visually reviewing LANDSAT images, we estimate that the 1973 forest area had declined by 168,493 km2 (30.2%) in 2010. The highest losses were recorded in Sabah and Kalimantan with 39.5% and 30.7% of their total forest area in 1973 becoming non-forest in 2010, and the lowest in Brunei and Sarawak (8.4%, and 23.1%). We estimate that the combined area planted in industrial oil palm and timber plantations in 2010 was 75,480 km2, representing 10% of Borneo. We mapped 271,819 km of primary logging roads that were created between 1973 and 2010. The greatest density of logging roads was found in Sarawak, at 0.89 km km-2, and the lowest density in Brunei, at 0.18 km km-2. Analyzing MODIS-based tree cover maps, we estimate that logging operated within 700 m of primary logging roads. Using this distance, we estimate that 266,257 km2 of 1973 forest cover has been logged. Some 389,566 km2 (52.8%) of the island remains forested, of which 209,649 km2 remains intact; there is still hope for biodiversity conservation in Borneo. Protecting logged forests from fire and conversion to plantations is an urgent priority for reducing rates of deforestation in Borneo. PMID:25029192
Swift, T.E.; Kumar, R.M.; Marlow, R.E.; Wilhelm, M.H.
1982-08-01
Field operations, which were conducted as a cooperative effort between Conoco and Gruy Federal, began on January 16, 1980 when the well was spudded. The well was drilled to 3692 feet, and 18 cores were recovered in 18 core-barrel runs (144 feet). Upon completion of the coring phase, the hole was drilled to a total depth of 4150 feet and a complete suite of geophysical logs was run. Logging was then followed by completion and testing by Conoco. Core porosities agreed well with computed log porosities. Core water saturations and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques did not improve the agreement of log- versus core-derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations. Residual oil saturations were 259 STB/acre-ft for the 4035 - 4055 feet interval, and 150 STB/acre-ft for the 3692 - 3718 feet interval. Nine BOPD was produced from the 4035 - 4055 feet interval and no oil was produced from the 3692 - 3718 feet interval, qualitatively confirming the relative oil saturations. The low oil production in the zone from 4022 to 4055 feet and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. 68 figures, 11 tables.
Likelihood-based error correction for holographic storage systems
NASA Astrophysics Data System (ADS)
Neifeld, Mark A.; Chou, Wu-Chun
1999-11-01
We consider a volume holographic memory (VHM) system that is corrupted by interpixel interference (IPI) and detector noise. We compare hard-decision Reed-Solomon (RS) decoding with both hard- and soft-decision algorithms for 2D array decoding. RS codes are shown to provide larger VHM storage capacity and density as compared with array codes when hard-decision methods are employed. A new likelihood-based soft-decision algorithm for 2D array decoding is described. The new decoding algorithm is motivated by iterative turbo-decoding methods and is capable of incorporating a priori knowledge of the corrupting IPI channel during decoding. The new algorithm is shown to offer VHM capacity and density performance superior to hard-decision RS methods.
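The contrast between hard and soft decisions can be illustrated with the simplest possible detector model: a binary pixel read through additive Gaussian noise, where the soft value kept for the decoder is the per-pixel log-likelihood ratio. This toy model assumes unit-separated Gaussian hypotheses and ignores the interpixel-interference channel the paper actually handles.

```python
import numpy as np

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=1000)           # stored binary page (flattened)
sigma = 0.4
y = bits + sigma * rng.normal(size=bits.size)  # detector output: ON -> 1, OFF -> 0

# LLR = log p(y|1) - log p(y|0); for Gaussian hypotheses with means 1 and 0
# and common variance sigma^2 this reduces to a linear function of y.
llr = (2 * y - 1) / (2 * sigma ** 2)

# A hard decision is just the sign of the LLR; a soft decoder instead keeps
# the magnitude as a per-pixel reliability and weighs pixels accordingly.
hard = (llr > 0).astype(int)
errors = np.count_nonzero(hard != bits)
```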
Chapman, J.B.; Mihevc, T.M.; Lyles, B.F.
1994-08-01
The Central Nevada Test Area was the site of the Faultless underground nuclear test that could be a source of radionuclide contamination to aquifers in Hot Creek Valley, Nevada. Field studies in 1992 and 1993 have used hydrologic logging and water sampling to determine the adequacy of the current groundwater monitoring network and the status of water-level recovery to pre-shot levels in the nuclear chimney. The field studies have determined that there is a possibility for contaminant migration away from the Faultless event even though the pre-event water level has not been attained, while new data raise questions about the ability of the current monitoring network to detect migration. Hydrologic logs from the postshot hole (drilled into the chimney created by the nuclear detonation) reveal inflow around 485 m below land surface. The physical and chemical characteristics of the inflow water indicate that its source is much deeper in the chimney, perhaps driven upward in a convection cell generated by heat near the nuclear cavity. Logging and sampling at monitoring wells HTH-1 and HTH-2 revealed that the completion of HTH-1 may be responsible for its elevated water level (as compared to pre-nuclear test levels) and may have also created a local mound in the water table in the alluvium that accounts for higher post-test water levels at HTH-2. This mound would serve to divert flow around the monitoring wells, so that only migration of contaminants through the underlying, higher pressure, volcanic units is currently monitored. A hydraulic high found in an abandoned hole located between the nuclear chimney and the monitoring wells further reduces the likelihood of HTH-1 or HTH-2 intercepting contaminant migration.
Correlating Log Messages for System Diagnostics
Gunasekaran, Raghul; Dillow, David A; Shipman, Galen M; Maxwell, Don E; Hill, Jason J; Park, Byung H; Geist, Al
2010-01-01
In large-scale computing systems, the sheer volume of log data generated presents daunting challenges for debugging and monitoring of these systems. The Oak Ridge Leadership Computing Facility's premier simulation platform, the Cray XT5 known as Jaguar, can generate a few hundred thousand log entries in less than a minute for many system-level events. Determining the root cause of such system events requires analysis and interpretation of a large number of log messages. Most often, the log messages are best understood when they are interpreted collectively rather than individually. In this paper, we present our approach to interpreting log messages by identifying their commonalities and grouping them into clusters. Given a set of log messages within a time interval, we group the messages based on source, target, and/or error type, and correlate the messages with hardware and application information. We monitor the Lustre log messages in the XT5 console log and show that such grouping of log messages assists in detecting the source of system events. By intelligent grouping and correlation of events in the log, we are able to provide system administrators with meaningful information in a concise format for root cause analysis.
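The grouping idea (cluster messages by source and/or error type before interpreting them) can be sketched in a few lines. The log format, field names and example lines below are invented for illustration; they are not the actual Cray XT5 console-log or Lustre message syntax.

```python
import re
from collections import defaultdict

# Hypothetical console-log format: "[source] ERROR_TYPE: free-text message".
LINE = re.compile(r"\[(?P<src>[\w.-]+)\]\s+(?P<err>[A-Z_]+):\s+(?P<msg>.*)")

logs = [
    "[nid00123] LUSTRE_TIMEOUT: request to OST0004 timed out",
    "[nid00123] LUSTRE_TIMEOUT: request to OST0007 timed out",
    "[nid00456] ECC_ERROR: corrected single-bit error at 0x7f3a",
    "[nid00789] LUSTRE_TIMEOUT: request to OST0004 timed out",
]

# Key each message on (source, error type); a real system would also bucket
# by time interval and join against hardware/application inventories.
groups = defaultdict(list)
for line in logs:
    m = LINE.match(line)
    if m:
        groups[(m["src"], m["err"])].append(m["msg"])
# The four lines collapse into three (source, error-type) clusters that can
# be read collectively rather than one message at a time.
```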
Dual excitation acoustic paramagnetic logging tool
Vail, III, William B.
1989-01-01
New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described.
Log-Euclidean free-form deformation
NASA Astrophysics Data System (ADS)
Modat, Marc; Ridgway, Gerard R.; Daga, Pankaj; Cardoso, M. J.; Hawkes, David J.; Ashburner, John; Ourselin, Sébastien
2011-03-01
The Free-Form Deformation (FFD) algorithm is a widely used method for non-rigid registration. Modifications have previously been proposed to ensure topology preservation and invertibility within this framework. However, in practice, none of these yield the inverse transformation itself, and one loses the parsimonious B-spline parametrisation. We present a novel log-Euclidean FFD approach in which a spline model of a stationary velocity field is exponentiated to yield a diffeomorphism, using an efficient scaling-and-squaring algorithm. The log-Euclidean framework allows easy computation of a consistent inverse transformation, and offers advantages in group-wise atlas building and statistical analysis. We optimise the Normalised Mutual Information plus a regularisation term based on the Jacobian determinant of the transformation, and we present a novel analytical gradient of the latter. The proposed method has been assessed against a fast FFD implementation (F3D) using simulated T1- and T2-weighted magnetic resonance brain images. The overlap measures between propagated grey matter tissue probability maps used in the simulations show similar results for both approaches; however, our new method obtains more reasonable Jacobian values, and yields inverse transformations.
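The exponentiation step can be illustrated in one dimension: scale the stationary velocity field by 2^-N so the update is small, convert it to a displacement, then square (self-compose) the transform N times to obtain the diffeomorphism. This is a simplified sketch of scaling-and-squaring under a 1D, linear-interpolation assumption, not the registration implementation described in the paper.

```python
import numpy as np

def compose(disp_a, disp_b, x):
    """Displacement of (id+a) o (id+b): b(x) + a(x + b(x)), via interpolation."""
    return disp_b + np.interp(x + disp_b, x, disp_a)

n = 256
x = np.linspace(0.0, 1.0, n)
v = 0.05 * np.sin(2 * np.pi * x)   # smooth stationary velocity field

N = 6
disp = v / 2 ** N                  # scaling: make the initial displacement small
for _ in range(N):                 # squaring: compose the transform with itself
    disp = compose(disp, disp, x)

phi = x + disp                     # approximates exp(v) applied to the grid
# A diffeomorphism must be monotone in 1D (positive Jacobian everywhere):
assert np.all(np.diff(phi) > 0)
```

The inverse transform comes almost for free in this framework: exponentiating -v by the same procedure yields (approximately) the inverse map, which is the consistency advantage the log-Euclidean parametrisation buys.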
Dual excitation acoustic paramagnetic logging tool
Vail, W.B. III.
1989-02-14
New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil bearing formation. Repetitively frequency sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores. The Dual Excitation Acoustic Paramagnetic Logging Tool employing two acoustic sources is also described. 6 figs.
Taylor, D.A.; Morganti, J.K.; White, H.J. ); Noblett, B.R. )
1996-01-01
Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.
Taylor, D.A.; Morganti, J.K.; White, H.J.; Noblett, B.R.
1996-12-31
Nuclear magnetic resonance (NMR) logging using the new C Series Magnetic Resonance Imaging Log (MRIL) system is rapidly enhancing formation evaluation throughout the industry. By measuring irreducible water saturations, permeabilities, and effective porosities, MRIL data can help petrophysicists evaluate low-resistivity pays. In these environments, conventional openhole logs may not define all of the pay intervals. The MRIL system can also reduce the number of unnecessary completions in zones of potentially high water cut. MRIL tool theory and log presentations used with conventional logs and sidewall cores are presented along with field examples. Scanning electron microscope (SEM) analysis shows good correlation of varying grain size in sandstones with the T2 distribution and bulk volume irreducible water determined from the MRIL measurements. Analysis of each new well drilled in the study area shows how water-free production zones were defined. Because the MRIL data were not recorded on one of the wells, predictions from the conventional logs and the MRIL data collected on the other two wells were used to estimate productive zones in the first well. Discussion of additional formation characteristics, completion procedures, actual production, and predicted producibility of the shaly sands is presented. Integrated methodologies resulted in the perforation of 3 new wells for a gross initial potential of 690 BOPD and 0 BWPD.
Selective logging and its relation to deforestation
NASA Astrophysics Data System (ADS)
Asner, Gregory P.; Keller, Michael; Lentini, Marco; Merry, Frank; Souza, Carlos, Jr.
Selective logging is a major contributor to the social, economic, and ecological dynamics of Brazilian Amazonia. Logging activities have expanded from low-volume floodplain harvests in past centuries to high-volume operations today that take about 25 million m3 of wood from the forest each year. The most common high-impact, conventional, and often illegal logging practices result in major collateral forest damage, with cascading effects on ecosystem processes. Initial carbon losses and forest recovery rates following timber harvest are tightly linked to initial logging intensity, which drives changes in forest gap fraction, fragmentation, and the light environment. Other ecological processes affected by selective logging include nutrient cycling, hydrological function, and postharvest disturbance such as fire. This chapter synthesizes the ecological impacts of selective logging, in the context of the recent socioeconomic conditions throughout Brazilian Amazonia, as determined from field-based and remote sensing studies carried out during the Large-Scale Biosphere-Atmosphere Experiment in Amazonia program.
A Maximum Likelihood Approach to Correlational Outlier Identification.
ERIC Educational Resources Information Center
Bacon, Donald R.
1995-01-01
A maximum likelihood approach to correlational outlier identification is introduced and compared to the Mahalanobis D squared and Comrey D statistics through Monte Carlo simulation. Identification performance depends on the nature of the correlational outliers and the measure used, but the maximum likelihood approach gives the most robust performance…
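For context, the Mahalanobis D squared baseline that the maximum likelihood approach is compared against is easy to compute directly: it measures each observation's distance from the centroid in the metric of the sample covariance, so it flags points whose combination of values is implausible even when each coordinate is individually ordinary. The data and the chi-square cutoff convention below are illustrative assumptions, not the study's simulation design.

```python
import numpy as np

rng = np.random.default_rng(2)
# 100 bivariate observations with correlation 0.8, plus one correlational
# outlier: each coordinate is plausible on its own, the pairing is not.
X = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=100)
X = np.vstack([X, [[4.0, -4.0]]])

mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
Vinv = np.linalg.inv(cov)
diff = X - mu
d2 = np.einsum("ij,jk,ik->i", diff, Vinv, diff)   # Mahalanobis D^2 per row

# chi-square(df=2) 97.5% quantile has the closed form -2 ln(0.025),
# since chi-square with 2 degrees of freedom is exponential with mean 2.
cutoff = -2 * np.log(0.025)
outliers = np.where(d2 > cutoff)[0]
```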
A Survey of the Likelihood Approach to Bioequivalence Trials
Choi, Leena; Caffo, Brian; Rohde, Charles
2009-01-01
Bioequivalence trials are abbreviated clinical trials whereby a generic drug or new formulation is evaluated to determine if it is "equivalent" to a corresponding previously approved brand-name drug or formulation. In this manuscript, we survey the process of testing bioequivalence and advocate the likelihood paradigm for representing the resulting data as evidence. We emphasize the unique conflicts between hypothesis testing and confidence intervals in this area, which we believe are indicative of systemic defects in the frequentist approach, and which the likelihood paradigm avoids. We suggest the direct use of profile likelihoods for evaluating bioequivalence. We discuss how the likelihood approach is useful to present the evidence for both average and population bioequivalence within a unified framework. We also examine the main properties of profile likelihoods and estimated likelihoods under simulation. This simulation study shows that profile likelihoods offer a viable alternative to the (unknown) true likelihood for a range of parameters commensurate with bioequivalence research. PMID:18618422
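A minimal sketch of the profile-likelihood idea the survey advocates: for normally distributed log-scale data with unknown variance, the variance is profiled out analytically, leaving a likelihood in the mean alone whose shape can be presented directly as evidence. The data, model, and the 1/8 likelihood cutoff are illustrative conventions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data, e.g. per-subject log(AUC ratio) in a crossover design.
x = rng.normal(loc=0.08, scale=0.2, size=24)

def profile_loglik(mu, x):
    """Normal log-likelihood at mu with sigma^2 replaced by its MLE given mu."""
    n = len(x)
    s2_hat = np.mean((x - mu) ** 2)        # MLE of sigma^2 for fixed mu
    return -0.5 * n * (np.log(2 * np.pi * s2_hat) + 1.0)

grid = np.linspace(-0.2, 0.4, 601)
pl = np.array([profile_loglik(m, x) for m in grid])
mu_hat = grid[np.argmax(pl)]               # maximiser coincides with the sample mean

# A 1/8 likelihood interval: values of mu whose likelihood is within a
# factor of 8 of the maximum, a conventional cutoff in the likelihood paradigm.
inside = grid[pl >= pl.max() - np.log(8.0)]
```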
Optimal message log reclamation for independent checkpointing
NASA Technical Reports Server (NTRS)
Wang, Yi-Min; Fuchs, W. Kent
1993-01-01
Independent (uncoordinated) checkpointing for parallel and distributed systems allows maximum process autonomy but suffers from possible domino effects and the associated storage space overhead for maintaining multiple checkpoints and message logs. In most research on checkpointing and recovery, it was assumed that only the checkpoints and message logs older than the global recovery line can be discarded. It is shown how recovery line transformation and decomposition can be applied to the problem of efficiently identifying all discardable message logs, thereby achieving optimal garbage collection. Communication trace-driven simulation for several parallel programs is used to show the benefits of the proposed algorithm for message log reclamation.
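The conservative baseline the abstract contrasts with (discard only what is older than the global recovery line) is easy to state in code. The data structures below are invented for illustration, and the recovery line is simplified to the earliest of the processes' latest checkpoints; the paper's optimal algorithm, which discards strictly more, is not reproduced here.

```python
# Each process has checkpoints at known timestamps, and each logged message
# is tagged with its receiver and receive timestamp (illustrative model).
checkpoints = {            # process -> timestamps of its checkpoints
    "p0": [0, 10, 20],
    "p1": [0, 12],
    "p2": [0, 9, 18],
}
message_log = [            # (receiver, receive timestamp)
    ("p0", 4), ("p1", 11), ("p2", 15), ("p0", 19),
]

# Conservative global recovery line: no rollback passes the earliest of the
# processes' latest checkpoints (a simplification of the real consistency
# computation over the checkpoint dependency graph).
recovery_line = min(max(ts) for ts in checkpoints.values())

# Messages received before the line can never need replay; reclaim them.
discardable = [m for m in message_log if m[1] < recovery_line]
```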
Flow rate logging seepage meter
NASA Technical Reports Server (NTRS)
Reay, William G. (Inventor); Walthall, Harry G. (Inventor)
1996-01-01
An apparatus for remotely measuring and logging the flow rate of groundwater seepage into surface water bodies. As groundwater seeps into a cavity created by a bottomless housing, it displaces water through an inlet and into a waterproof sealed upper compartment, at which point, the water is collected by a collection bag, which is contained in a bag chamber. A magnet on the collection bag approaches a proximity switch as the collection bag fills, and eventually enables the proximity switch to activate a control circuit. The control circuit then rotates a three-way valve from the collection path to a discharge path, enables a data logger to record the time, and enables a pump, which discharges the water from the collection bag, through the three-way valve and pump, and into the sea. As the collection bag empties, the magnet leaves the proximity of the proximity switch, and the control circuit turns off the pump, resets the valve to provide a collection path, and restarts the collection cycle.
VizieR Online Data Catalog: CoRoT observation log (N2-4.4) (CoRoT 2016)
NASA Astrophysics Data System (ADS)
COROT Team
2014-03-01
CoRoT, a space astronomy mission, has measured photometric micro-variability of stars from minutes to months (up to 150 days) with a high duty cycle (more than 90%). The mission was led by CNES in association with four French laboratories and 7 participating countries and agencies (Austria, Belgium, Brazil, Germany, Spain, and the ESA Science Programme). The satellite was composed of a PROTEUS platform (the 3rd in the series) and a unique instrument: a stellar rapid photometer. It was launched on December 27th 2006 by a Soyuz rocket from Baikonur. The mission lasted almost 6 years (the nominal 3-year duration and a 3-year extension) and observed more than 160 000 stars. It stopped sending data on November 2nd 2012. Two regions of the sky were accessible for long periods of time: circles of 10 degrees centered on the equator around alpha=06:50 and alpha=18:50. They were called the CoRoT eyes: the "anticenter" and the "center eye" (as they are approximately in these directions). Each pointing covers 1.4x2.8 square degrees within one of those CoRoT eyes. The original scientific objectives were focused on the study of stellar pulsations (asteroseismology) to probe the internal structure of stars, and the detection of small exoplanets through their "transit" in front of their host star, together with the measurement of their size. This led to the introduction of two modes of observation, working simultaneously: - The "bright star" mode, dedicated to very precise seismology of a small sample of bright and nearby stars - The "faint star" mode, observing a very large number of stars at the same time to detect transits, which are rare events, as they imply the alignment of the star, the planet, and the observer. The large amount of data gathered in this mode turned out to be extremely fruitful for many topics of stellar physics. Beyond these two initial objectives, CoRoT data revealed stellar variability associated with various other phenomena: granulation, rotational modulation by
Sloto, Ronald A.; Goode, Daniel J.; Frasch, Steven M.
2002-01-01
Ground water pumped from supply wells 1 and 2 on the Willow Grove Naval Air Station/Joint Reserve Base (NAS/JRB) provides water for use at the base, including potable water for drinking. The supply wells have been contaminated by volatile organic compounds (VOCs), particularly trichloroethylene (TCE) and tetrachloroethylene (PCE), and the water is treated to remove the VOCs. The Willow Grove NAS/JRB and surrounding area are underlain by sedimentary rocks of the Triassic-age Stockton Formation, which form a complex, heterogeneous aquifer. The ground-water-flow system for the supply wells was characterized by use of borehole geophysical logs and heatpulse-flowmeter measurements. The heatpulse-flowmeter measurements showed upward and downward borehole flow under nonpumping conditions in both wells. The hydraulic and chemical properties of discrete water-bearing fractures in the supply wells were characterized by isolating each water-bearing fracture with straddle packers. Eight fractures in supply well 1 and five fractures in supply well 2 were selected for testing on the basis of the borehole geophysical logs and borehole television surveys. Water samples were collected from each isolated fracture and analyzed for VOCs and inorganic constituents. Fractures at 50-59, 79-80, 196, 124-152, 182, 241, 256, and 350-354 ft btoc (feet below top of casing) were isolated in supply well 1. Specific capacities ranged from 0.26 to 5.7 (gal/min)/ft (gallons per minute per foot) of drawdown. The highest specific capacity was for the fracture isolated at 179.8-188 ft btoc. Specific capacity and depth of fracture were not related in either supply well. The highest concentrations of PCE were in water samples collected from fractures isolated at 236.8-245 and 249.8-258 ft btoc, which are hydraulically connected. The concentration of PCE generally increased with depth to a maximum of 39 µg/L (micrograms per liter) at a depth of 249.8-258 ft btoc and then decreased to 21 µg/L at a
Tolerance bounds for log gamma regression models
NASA Technical Reports Server (NTRS)
Jones, R. A.; Scholz, F. W.; Ossiander, M.; Shorack, G. R.
1985-01-01
The present procedure for finding lower confidence bounds for the quantiles of Weibull populations, on the basis of the solution of a quadratic equation, is more accurate than current Monte Carlo tables and extends to any location-scale family. It is shown that this method is accurate for all members of the log gamma(K) family, where K = 1/2 to infinity, and works well for censored data, while also extending to regression data. An even more accurate procedure involving an approximation to the Lawless (1982) conditional procedure, with numerical integrations whose tables are independent of the data, is also presented. These methods are applied to the case of failure strengths of ceramic specimens from each of three billets of Si3N4, which have undergone flexural strength testing.
Multi-rate flowing Wellbore electric conductivity logging method
Tsang, Chin-Fu; Doughty, Christine
2003-04-22
The flowing wellbore electric conductivity logging method involves the replacement of wellbore water by de-ionized or constant-salinity water, followed by constant pumping with rate Q, during which a series of fluid electric conductivity logs are taken. The logs can be analyzed to identify depth locations of inflow, and evaluate the transmissivity and electric conductivity (salinity) of the fluid at each inflow point. The present paper proposes the use of the method with two or more pumping rates. In particular it is recommended that the method be applied three times with pumping rates Q, Q /2, and 2Q. Then a combined analysis of the multi-rate data allows an efficient means of determining transmissivity and salinity values of all inflow points along a well with a confidence measure, as well as their inherent or far-field pressure heads. The method is illustrated by a practical example.
Allison, M.L.
1995-12-08
Strip logs for 491 wells were produced from a digital subsurface database of lithologic descriptions of the Ferron Sandstone Member of the Mancos Shale. This subsurface database covers wells from the parts of Emery and Sevier Counties in central Utah that occur between Ferron Creek on the north and Last Chance Creek on the south. The lithologic descriptions were imported into a logging software application designed for the display of stratigraphic data. Strip logs were produced at a scale of one inch equals 20 feet. The strip logs were created as part of a study by the Utah Geological Survey to develop a comprehensive, interdisciplinary, and qualitative characterization of a fluvial-deltaic reservoir using the Ferron Sandstone as a surface analogue. The study was funded by the U.S. Department of Energy (DOE) under the Geoscience/Engineering Reservoir Characterization Program.
Calibration of two complex ecosystem models with different likelihood functions
NASA Astrophysics Data System (ADS)
Hidy, Dóra; Haszpra, László; Pintér, Krisztina; Nagy, Zoltán; Barcza, Zoltán
2014-05-01
The biosphere is a sensitive carbon reservoir. Terrestrial ecosystems were approximately carbon neutral during the past centuries, but they became net carbon sinks due to climate-change-induced environmental change and the associated CO2 fertilization effect of the atmosphere. Model studies and measurements indicate that the biospheric carbon sink can saturate in the future due to ongoing climate change, which can act as a positive feedback. Robustness of carbon cycle models is a key issue when trying to choose the appropriate model for decision support. The input parameters of process-based models are decisive regarding the model output. At the same time there are several input parameters for which accurate values are hard to obtain directly from experiments or for which no local measurements are available. Due to the uncertainty associated with the unknown model parameters, significant bias can be experienced if the model is used to simulate the carbon and nitrogen cycle components of different ecosystems. In order to improve model performance the unknown model parameters have to be estimated. We developed a multi-objective, two-step calibration method based on a Bayesian approach in order to estimate the unknown parameters of the PaSim and Biome-BGC models. Biome-BGC and PaSim are widely used biogeochemical models that simulate the storage and flux of water, carbon, and nitrogen between the ecosystem and the atmosphere, and within the components of the terrestrial ecosystems (in this research the developed version of Biome-BGC is used, which is referred to as BBGC MuSo). Both models were calibrated regardless of the simulated processes and the type of model parameters. The calibration procedure is based on the comparison of measured data with simulated results via calculating a likelihood function (degree of goodness-of-fit between simulated and measured data). In our research different likelihood function formulations were used in order to examine the effect of the different model
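The likelihood-based comparison of simulated and measured data described above can be sketched as a Gaussian log-likelihood of the residuals. This is a generic goodness-of-fit formulation of the kind used in Bayesian calibration; the paper's exact likelihood functions may differ, and the function name and fixed error scale `sigma` here are illustrative assumptions.

```python
import numpy as np

def gaussian_log_likelihood(simulated, measured, sigma):
    """Gaussian log-likelihood of measured data given model output.

    A generic goodness-of-fit measure used in Bayesian model
    calibration: higher values mean a better match between the
    simulated and measured series for a given error scale sigma.
    """
    simulated = np.asarray(simulated, dtype=float)
    measured = np.asarray(measured, dtype=float)
    residuals = measured - simulated
    n = residuals.size
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - 0.5 * np.sum(residuals**2) / sigma**2)

# A closer fit scores higher for the same sigma:
perfect = gaussian_log_likelihood([1.0, 2.0], [1.0, 2.0], sigma=1.0)
noisy = gaussian_log_likelihood([1.0, 2.0], [1.5, 2.5], sigma=1.0)
```

In a calibration loop, this value would be evaluated for each candidate parameter set and used to weight samples of the posterior distribution.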
NASA Astrophysics Data System (ADS)
Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran
2016-09-01
In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of 24 parameters used in HMS, three flood events were used to calibrate and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions, L1-L4 (Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM)), are considered informal, whereas the remaining three (L5-L7) fall in the formal category. L5 focuses on the relationship between traditional least squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters strongly depend on the likelihood function and vary between likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions depict an almost similar effect on the sensitivity of parameters. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, results showed that the DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7
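One of the informal likelihood functions named above, the Nash-Sutcliffe (NS) efficiency, has a simple closed form and can be sketched directly. This is the standard textbook definition, not taken from the paper's implementation:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations.

    NS = 1 indicates a perfect fit; NS <= 0 means the model predicts no
    better than the mean of the observations.
    """
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    sst = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / sst

obs = np.array([1.0, 3.0, 2.0, 4.0])
```

Used as an informal likelihood, higher NS values are treated as stronger support for a candidate parameter set.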
Code of Federal Regulations, 2011 CFR
2011-10-01
... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...
Code of Federal Regulations, 2012 CFR
2012-10-01
... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...
Code of Federal Regulations, 2010 CFR
2010-10-01
... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...
Code of Federal Regulations, 2013 CFR
2013-10-01
... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...
Code of Federal Regulations, 2014 CFR
2014-10-01
... part. (iii) An entry of each test and activation of the Emergency Alert System (EAS) pursuant to the requirement of part 11 of this chapter and the EAS Operating Handbook. Stations may keep EAS data in a special EAS log which shall be maintained at a convenient location; however, this log is considered a part...
29 CFR 1917.18 - Log handling.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 7 2010-07-01 2010-07-01 false Log handling. 1917.18 Section 1917.18 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) MARINE TERMINALS Marine Terminal Operations § 1917.18 Log handling. (a) The employer shall ensure...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Data logging. 90.412 Section 90.412....412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 89.409 Section 89.409... Data logging. (a) A computer or any other automatic data processing device(s) may be used as long as the system meets the requirements of this subpart. (b) Determine from the data collection records...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Data logging. 91.412 Section 91.412... EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test Procedures § 91.412 Data logging. (a) A computer or any other automatic data collection (ADC) device(s) may be used as long as the system meets...
Learning Logs in Introductory Literature Courses
ERIC Educational Resources Information Center
Babcock, Matthew James
2007-01-01
This article explores the beneficial and sometimes unpredictable implications of a daily reflective writing exercise for introductory literature courses: the learning log. Pseudonymous samples of student writing, coupled with instructor commentary, gesture toward ways in which the learning log's continual implementation and modification foster a…
Improve reliability with operator log sheets
Devender, A.V.; Ganesan, S.T.
1996-01-01
Recently, there have been significant improvements in computerized vibration and online performance monitoring systems. However, despite all these developments, the importance of monitoring rotating equipment through operator log sheets must not be overlooked or neglected. Operator log sheets filled out during shifts can be very useful tools in detecting problems early, provided they are diligently completed and evaluated during the operating shift. In most cases, performance deviations can be corrected by measures within the control of the operator. If the operator understands the purpose of log sheets, and knows the cause and effect of deviations in operating parameters, he or she will be motivated to complete the log sheets to increase equipment reliability. Logged data should include any operating data from equipment that reveals its mechanical condition or performance. The most common data logged are pressure, temperature, flow, power and vibration. The purposes of log sheets are to: establish and recognize the normal operating parameters and identify deviations in performance data; perform timely corrective actions on deviations to avoid unplanned shutdowns and catastrophic failures; avoid repetitive failures and increase mean time between failures; and provide baseline data for troubleshooting. Two case histories are presented to illustrate the usefulness of logs: a compressor thrust bearing problem and steam turbine blade washing.
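The first purpose listed, comparing logged readings against normal operating parameters to identify deviations, can be sketched as a small check. The parameter names, baseline values, and tolerances below are hypothetical examples, not values from the article:

```python
def flag_deviations(readings, baseline, tolerance):
    """Compare logged readings against baseline operating values and
    return the names of parameters that deviate beyond tolerance."""
    alerts = []
    for name, value in readings.items():
        if abs(value - baseline[name]) > tolerance[name]:
            alerts.append(name)
    return alerts

# Illustrative baseline and shift readings (hypothetical values):
baseline = {"discharge_temp_C": 85.0, "vibration_mm_s": 2.0}
tolerance = {"discharge_temp_C": 5.0, "vibration_mm_s": 1.0}
readings = {"discharge_temp_C": 92.0, "vibration_mm_s": 2.5}
alerts = flag_deviations(readings, baseline, tolerance)
```

A deviation flagged here (the discharge temperature, 7 degrees above baseline against a 5-degree tolerance) would prompt the timely corrective action the article recommends.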
10 CFR 34.71 - Utilization logs.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Utilization logs. 34.71 Section 34.71 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall...
Discover Presidential Log Cabins. Teacher's Discussion Guide.
ERIC Educational Resources Information Center
National Park Service (Dept. of Interior), Washington, DC.
Discover Presidential Log Cabins is a set of materials designed to help educate 6-8 grade students about the significance of three log cabin sites occupied by George Washington, Ulysses Grant, Abraham Lincoln, and Theodore Roosevelt. This teacher's discussion guide is intended for use as part of a larger, comprehensive social studies program, and…
Selective Logging in the Brazilian Amazon
NASA Astrophysics Data System (ADS)
Asner, Gregory P.; Knapp, David E.; Broadbent, Eben N.; Oliveira, Paulo J. C.; Keller, Michael; Silva, Jose N.
2005-10-01
Amazon deforestation has been measured by remote sensing for three decades. In comparison, selective logging has been mostly invisible to satellites. We developed a large-scale, high-resolution, automated remote-sensing analysis of selective logging in the top five timber-producing states of the Brazilian Amazon. Logged areas ranged from 12,075 to 19,823 square kilometers per year (+/-14%) between 1999 and 2002, equivalent to 60 to 123% of previously reported deforestation area. Up to 1200 square kilometers per year of logging were observed on conservation lands. Each year, 27 million to 50 million cubic meters of wood were extracted, and a gross flux of ~0.1 billion metric tons of carbon was destined for release to the atmosphere by logging.
ERIC Educational Resources Information Center
Spillane, James P.; Zuberi, Anita
2009-01-01
Purpose: This article aims to validate the Leadership Daily Practice (LDP) log, an instrument for conducting research on leadership in schools. Research Design: Using a combination of data sources--namely, a daily practice log, observations, and open-ended cognitive interviews--the authors evaluate the validity of the LDP log. Participants: Formal…
NASA Astrophysics Data System (ADS)
Bousse, Alexandre; Bertolli, Ottavia; Atkinson, David; Arridge, Simon; Ourselin, Sébastien; Hutton, Brian F.; Thielemans, Kris
2016-02-01
This work is an extension of our recent work on joint activity reconstruction/motion estimation (JRM) from positron emission tomography (PET) data. We performed JRM by maximization of the penalized log-likelihood in which the probabilistic model assumes that the same motion field affects both the activity distribution and the attenuation map. Our previous results showed that JRM can successfully reconstruct the activity distribution when the attenuation map is misaligned with the PET data, but converges slowly due to the significant cross-talk in the likelihood. In this paper, we utilize time-of-flight PET for JRM and demonstrate that the convergence speed is significantly improved compared to JRM with conventional PET data.
Log-amplitude statistics for Beck-Cohen superstatistics
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Konno, Hidetoshi
2013-05-01
As a possible generalization of Beck-Cohen superstatistical processes, we study non-Gaussian processes with temporal heterogeneity of local variance. To characterize the variance heterogeneity, we define log-amplitude cumulants and log-amplitude autocovariance and derive closed-form expressions of the log-amplitude cumulants for χ2, inverse χ2, and log-normal superstatistical distributions. Furthermore, we show that χ2 and inverse χ2 superstatistics with degree 2 are closely related to an extreme value distribution, called the Gumbel distribution. In these cases, the corresponding superstatistical distributions result in the q-Gaussian distribution with q=5/3 and the bilateral exponential distribution, respectively. Thus, our finding provides a hypothesis that the asymptotic appearance of these two special distributions may be explained by a link with the asymptotic limit distributions involving extreme values. In addition, as an application of our approach, we demonstrate that non-Gaussian fluctuations observed in a stock index futures market can be well approximated by the χ2 superstatistical distribution with degree 2.
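A simple empirical analogue of the log-amplitude cumulants defined above is the set of sample cumulants of ln|x|. The paper derives these analytically for specific superstatistical distributions; the sketch below, with an assumed function name, only estimates the first three from data:

```python
import numpy as np

def log_amplitude_cumulants(x):
    """Sample estimates of the first three cumulants of ln|x|.

    For low orders the cumulants coincide with the mean and the second
    and third central moments of the log-amplitude.
    """
    x = np.asarray(x, dtype=float)
    logamp = np.log(np.abs(x[x != 0]))
    m = logamp.mean()
    c = logamp - m
    return m, np.mean(c**2), np.mean(c**3)

# For a standard Gaussian signal, ln|x| has known cumulants:
# mean -(gamma + ln 2)/2 ~ -0.635 and variance pi^2/8 ~ 1.234.
rng = np.random.default_rng(0)
k1, k2, k3 = log_amplitude_cumulants(rng.standard_normal(100_000))
```

Comparing such sample cumulants with the closed-form values for candidate superstatistical distributions is one way to carry out the kind of model identification the paper applies to futures-market data.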
Log-Gabor Weber descriptor for face recognition
NASA Astrophysics Data System (ADS)
Li, Jing; Sang, Nong; Gao, Changxin
2015-09-01
The Log-Gabor transform, which is suitable for analyzing gradually changing data such as iris and face images, has been widely used in image processing, pattern recognition, and computer vision. In most cases, only the magnitude or phase information of the Log-Gabor transform is considered. However, the complementary effect obtained by combining magnitude and phase information simultaneously for an image-feature extraction problem has not been systematically explored in existing works. We propose a local image descriptor for face recognition, called the Log-Gabor Weber descriptor (LGWD). The novelty of our LGWD is twofold: (1) To fully utilize the information from the magnitude or phase feature of the multiscale and multi-orientation Log-Gabor transform, we apply the Weber local binary pattern operator to each transform response. (2) The encoded Log-Gabor magnitude and phase information are fused at the feature level by utilizing a kernel canonical correlation analysis strategy, considering that feature-level information fusion is effective when the modalities are correlated. Experimental results on the AR, Extended Yale B, and UMIST face databases, compared with those available from recent experiments reported in the literature, show that our descriptor yields a better performance than state-of-the-art methods.
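The radial part of a Log-Gabor filter has a standard closed form in the frequency domain, a Gaussian on a logarithmic frequency axis with no DC component. The sketch below shows this textbook 1-D frequency response; the function name and the particular center frequency and bandwidth ratio are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def log_gabor_radial(n, f0, sigma_ratio):
    """Radial Log-Gabor frequency response on an n-point 1-D grid:
    G(f) = exp(-(ln(|f|/f0))^2 / (2 (ln sigma_ratio)^2)), with G(0) = 0
    because the Log-Gabor filter has no DC component.
    """
    f = np.fft.fftfreq(n)
    g = np.zeros(n)
    nz = f != 0
    g[nz] = np.exp(-np.log(np.abs(f[nz]) / f0) ** 2
                   / (2 * np.log(sigma_ratio) ** 2))
    return g

g = log_gabor_radial(256, f0=0.1, sigma_ratio=0.55)
```

A 2-D filter bank of the kind used for face features multiplies such radial responses by angular Gaussians over several scales and orientations; the descriptor in the paper then encodes the filter responses with a Weber local binary pattern operator.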
Likelihood ratio meta-analysis: New motivation and approach for an old method.
Dormuth, Colin R; Filion, Kristian B; Platt, Robert W
2016-03-01
A 95% confidence interval (CI) in an updated meta-analysis may not have the expected 95% coverage. If a meta-analysis is simply updated with additional data, then the resulting 95% CI will be wrong because it will not have accounted for the fact that the earlier meta-analysis either succeeded or failed in excluding the null. This situation can be avoided by using the likelihood ratio (LR) as a measure of evidence that does not depend on type-1 error. We show how an LR-based approach, first advanced by Goodman, can be used in a meta-analysis to pool data from separate studies to quantitatively assess where the total evidence points. The method works by estimating the log-likelihood ratio (LogLR) function from each study. Those functions are then summed to obtain a combined function, which is then used to retrieve the total effect estimate and a corresponding 'intrinsic' confidence interval. Using as illustrations the CAPRIE trial of clopidogrel versus aspirin in the prevention of ischemic events, and our own meta-analysis of higher-potency statins and the risk of acute kidney injury, we show that the LR-based method yields the same point estimate as the traditional analysis, but with an intrinsic confidence interval that is appropriately wider than the traditional 95% CI. The LR-based method can be used to conduct both fixed effect and random effects meta-analyses, it can be applied to old and new meta-analyses alike, and results can be presented in a format that is familiar to a meta-analytic audience. PMID:26837056
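The pooling step, summing per-study LogLR functions and reading the estimate and an intrinsic interval off the combined curve, can be sketched on a parameter grid. This is a generic illustration of the approach with approximately Gaussian study likelihoods and hypothetical study data; the authors' exact computations and support threshold may differ:

```python
import numpy as np

def gaussian_loglr(theta_grid, estimate, se):
    """LogLR curve log L(theta) - log L(theta_hat) for a study whose
    effect estimate is approximately Gaussian with standard error se."""
    return -0.5 * ((theta_grid - estimate) / se) ** 2

theta = np.linspace(-1.0, 1.0, 2001)

# Two hypothetical studies: (effect estimate, standard error).
studies = [(0.10, 0.08), (0.22, 0.12)]

# Pool the evidence by summing each study's LogLR curve.
combined = sum(gaussian_loglr(theta, est, se) for est, se in studies)
pooled_estimate = theta[np.argmax(combined)]

# An intrinsic interval: all theta within a 1/8 likelihood-ratio
# support threshold of the peak (an illustrative choice).
support = combined - combined.max()
interval = theta[support >= -np.log(8)]
```

With Gaussian study likelihoods this reproduces the inverse-variance-weighted point estimate of a fixed effect meta-analysis, while the interval width is governed by the chosen support threshold rather than a type-1 error rate.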
Accuracy of Maximum Likelihood Parameter Estimators for Heston Stochastic Volatility SDE
NASA Astrophysics Data System (ADS)
Azencott, Robert; Gadhyan, Yutheeka
2015-04-01
We study approximate maximum likelihood estimators (MLEs) for the parameters of the widely used Heston stock price and volatility stochastic differential equations (SDEs). We compute explicit closed-form estimators maximizing the discretized log-likelihood of observations recorded at discrete times. We compute the asymptotic biases of these parameter estimators for a fixed observation step and sample size, as well as the rate at which these biases vanish as the sample size grows. We determine asymptotically consistent explicit modifications of these MLEs. For the Heston volatility SDE, we identify a canonical form determined by two canonical parameters, which are explicit functions of the original SDE parameters. We analyze theoretically the asymptotic distribution of the MLEs and of their consistent modifications, and we outline their concrete speeds of convergence by numerical simulations. We clarify the precise dichotomy between asymptotic normality and attraction by stable-like distributions with heavy tails. We illustrate numerical model fitting for Heston SDEs with two concrete examples, one for daily data and one for intraday data.
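The discretized log-likelihood being maximized can be sketched for the CIR-type volatility equation of the Heston model, where an Euler discretization makes each transition conditionally Gaussian. This is a generic brute-force version under that assumption; the paper's estimators are closed-form and its exact discretization may differ:

```python
import numpy as np

def euler_loglik_cir(v, dt, kappa, theta, sigma):
    """Euler-discretized log-likelihood of an observed variance path v
    under dV = kappa*(theta - V) dt + sigma*sqrt(V) dW: each step is
    Gaussian with the conditional mean and variance below."""
    v = np.asarray(v, dtype=float)
    mean = v[:-1] + kappa * (theta - v[:-1]) * dt   # conditional mean
    var = sigma**2 * v[:-1] * dt                    # conditional variance
    resid = v[1:] - mean
    return np.sum(-0.5 * np.log(2 * np.pi * var) - 0.5 * resid**2 / var)

# Simulate a path (full-truncation Euler) and compare parameter guesses.
rng = np.random.default_rng(1)
dt, n = 1.0 / 252, 20_000
kappa, theta, sigma = 2.0, 0.04, 0.3
v = np.empty(n); v[0] = theta
for t in range(n - 1):
    dw = np.sqrt(dt) * rng.standard_normal()
    v[t + 1] = max(v[t] + kappa * (theta - v[t]) * dt
                   + sigma * np.sqrt(max(v[t], 0.0)) * dw, 1e-8)

ll_true = euler_loglik_cir(v, dt, kappa, theta, sigma)
ll_wrong = euler_loglik_cir(v, dt, kappa, 2 * theta, sigma)
```

Maximizing this sum over the parameters, analytically rather than by search, is what yields the closed-form estimators whose biases the paper then characterizes.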
A dual formulation of a penalized maximum likelihood x-ray CT reconstruction problem
NASA Astrophysics Data System (ADS)
Xu, Jingyan; Taguchi, Katsuyuki; Gullberg, Grant T.; Tsui, Benjamin M. W.
2009-02-01
This work studies the dual formulation of a penalized maximum likelihood reconstruction problem in x-ray CT. The primal objective function is a Poisson log-likelihood combined with a weighted cross-entropy penalty term. The dual formulation of the primal optimization problem is derived and the optimization procedure outlined. The dual formulation better exploits the structure of the problem, which translates into faster convergence of iterative reconstruction algorithms. A gradient descent algorithm is implemented for solving the dual problem, and its performance is compared with the filtered back-projection algorithm and with the primal formulation optimized using surrogate functions. The 3D XCAT phantom and an analytical x-ray CT simulator are used to generate noise-free and noisy CT projection data sets with monochromatic and polychromatic x-ray spectra. The reconstructed images from the dual formulation delineate the internal structures at early iterations better than the primal formulation using surrogate functions. However, the body contour converges more slowly in the dual than in the primal formulation. The dual formulation demonstrates a better noise-resolution tradeoff near the internal organs than the primal formulation. Since surrogate functions can in general provide a diagonal approximation of the Hessian matrix of the objective function, further convergence speed-up may be achieved by deriving a surrogate function for the dual objective function.
Low-complexity approximations to maximum likelihood MPSK modulation classification
NASA Technical Reports Server (NTRS)
Hamkins, Jon
2004-01-01
We present a new approximation to the maximum likelihood classifier to discriminate between M-ary and M'-ary phase-shift keying transmitted on an additive white Gaussian noise (AWGN) channel and received noncoherently, partially coherently, or coherently.
Maximum likelihood estimation of finite mixture model for economic data
NASA Astrophysics Data System (ADS)
Phoong, Seuk-Yen; Ismail, Mohd Tahir
2014-06-01
A finite mixture model is a mixture model with a finite number of components. These models provide a natural representation of heterogeneity across a finite number of latent classes, and are also known as latent class models or unsupervised learning models. Recently, fitting finite mixture models by maximum likelihood estimation has drawn considerable attention from statisticians, chiefly because maximum likelihood estimation is a powerful statistical method that yields consistent estimates as the sample size increases to infinity. In the present paper, maximum likelihood estimation is therefore used to fit a finite mixture model in order to explore the relationship between nonlinear economic data. A two-component normal mixture model is fitted by maximum likelihood estimation to investigate the relationship between stock market prices and rubber prices for the sampled countries. The results show a negative relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines, and Indonesia.
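A minimal sketch of fitting a two-component normal mixture by maximum likelihood via the EM algorithm (illustrative data, not the stock/rubber price series analyzed in the paper, and not the authors' code):

```python
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_normal(data, iters=300):
    """EM algorithm for a two-component normal mixture."""
    xs = sorted(data)
    n = len(xs)
    # Crude but deterministic initialization from the sorted sample.
    pi1 = 0.5
    mu = [xs[n // 4], xs[3 * n // 4]]
    sd = [max(1e-3, (xs[-1] - xs[0]) / 4.0)] * 2
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point.
        r = []
        for x in xs:
            a = pi1 * normal_pdf(x, mu[0], sd[0])
            b = (1 - pi1) * normal_pdf(x, mu[1], sd[1])
            r.append(a / (a + b))
        # M-step: weighted mixing proportion, means, standard deviations.
        n1 = sum(r)
        pi1 = n1 / n
        mu[0] = sum(ri * x for ri, x in zip(r, xs)) / n1
        mu[1] = sum((1 - ri) * x for ri, x in zip(r, xs)) / (n - n1)
        sd[0] = max(1e-3, math.sqrt(
            sum(ri * (x - mu[0]) ** 2 for ri, x in zip(r, xs)) / n1))
        sd[1] = max(1e-3, math.sqrt(
            sum((1 - ri) * (x - mu[1]) ** 2 for ri, x in zip(r, xs)) / (n - n1)))
    return pi1, mu, sd

# Two well-separated toy clusters around -2 and +2.
data = [-2.2, -2.1, -2.0, -1.9, -1.8, 1.8, 1.9, 2.0, 2.1, 2.2]
pi1, mu, sd = em_two_normal(data)
```

Each EM iteration increases the mixture log-likelihood, which is how the maximum likelihood fit described in the abstract is obtained in practice.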
New Techniques: Muon Glaciology and Ultrasonic Logging
NASA Astrophysics Data System (ADS)
Chirkin, D.; Allen, J.; Bay, R. C.; Bramall, N.; Price, P. B.
2003-12-01
The strain rate of cold glacial ice depends mainly on the stress tensor, temperature, grain size, and crystal habit. Lab measurements cannot be made at both the low stresses and low temperatures relevant to flow of cold glacial ice. Field studies with inclinometers measure only the horizontal components of flow. We have developed a new method for measuring the 3D strain-rate field at -40°C to -15°C, using the AMANDA neutrino-detecting array frozen into deep ice at South Pole. Each strain detector consists of a photomultiplier tube (PMT) in its pressure vessel. AMANDA has ~600 PMTs at depths of 1500 to 2300 m in a ~0.02 km^3 volume. The coordinates of each PMT relative to a coordinate system moving down slope at 9 m yr^-1 can be measured with s.d. <1 m in 1 day by mapping trajectories of down-going cosmic-ray muons that pass through the array. The PMTs record the arrival times of the Cherenkov light emitted along the muon trajectory. Use of maximum likelihood for 10^5 muon tracks allows PMT positions to be determined; their positions are then updated at six-month intervals. We will report results of strain-rate measurements in three dimensions, made in 2000, 2001, and 2002 at T ≈ -30°C. Applying the same technique to the future 1 km^3 IceCube array, by averaging over subsets of the 5000 detectors, values of the strain-rate tensor as small as 3×10^-5 yr^-1 can be measured as a function of temperature and lateral position. The vertical strain rate due to snow accumulation, estimated to be ~3×10^-5 yr^-1, can be measured and will serve as a check on the method. The second new method is designed to measure mean grain size in the ice surrounding a borehole. We will adapt an all-digital logging tool originally developed by Advanced Logic Technology (Luxembourg) for geophysics prospecting in rock boreholes. A 1.3 MHz transducer emits acoustic pulses horizontally into the ice in increments of 5° in azimuth and records the wave train back-scattered from grain boundaries. For
Coal log pipeline pilot plant study
Liu, H.; Lenau, C.W.; Burkett, W.
2000-07-01
After 8 years of extensive R and D on the new technology of coal log pipeline (CLP), a pilot plant is being built to demonstrate and test a complete CLP system for coal transportation. The system consists of a coal log fabrication plant; a 3,000-ft-long, 6-inch-diameter underground pipeline loop to transport 5.4-inch-diameter coal logs; a log injection/ejection system; a pump bypass; a reservoir that serves as both the intake and the outlet of the CLP system; an instrumentation system that includes pressure transducers, coal log sensors, and flowmeters; and an automatic control system that includes PLCs and a central computer. The pilot plant is to be completed in May 2000. Upon completion of construction, the pilot plant will be used for running various types of coal, testing the degradation rate of drag reduction in CLP using Polyox (polyethylene oxide), testing the reliability of a special coal log sensor invented at the University of Missouri, testing the reliability and efficiency of the pump-bypass system for pumping coal log trains through the pipe, and testing various hardware components and software for operating the pilot plant. Data collected from the tests will be used for designing future commercial CLP systems. The pilot plant experiments are to be completed in two years. Then the technology of CLP will be ready for commercial use.
Improved maximum likelihood reconstruction of complex multi-generational pedigrees.
Sheehan, Nuala A; Bartlett, Mark; Cussens, James
2014-11-01
The reconstruction of pedigrees from genetic marker data is relevant to a wide range of applications. Likelihood-based approaches aim to find the pedigree structure that gives the highest probability to the observed data. Existing methods either entail an exhaustive search and are hence restricted to small numbers of individuals, or they take a more heuristic approach and deliver a solution that will probably have high likelihood but is not guaranteed to be optimal. By encoding the pedigree learning problem as an integer linear program we can exploit efficient optimisation algorithms to construct pedigrees guaranteed to have maximal likelihood for the standard situation where we have complete marker data at unlinked loci and segregation of genes from parents to offspring is Mendelian. Previous work demonstrated efficient reconstruction of pedigrees of up to about 100 individuals. The modified method that we present here is not so restricted: we demonstrate its applicability with simulated data on a real human pedigree structure of over 1600 individuals. It also compares well with a very competitive approximate approach in terms of solving time and accuracy. In addition to identifying a maximum likelihood pedigree, we can obtain any number of pedigrees in decreasing order of likelihood. This is useful for assessing the uncertainty of a maximum likelihood solution and permits model averaging over high likelihood pedigrees when this would be appropriate. More importantly, when the solution is not unique, as will often be the case for large pedigrees, it enables investigation into the properties of maximum likelihood pedigree estimates which has not been possible up to now. Crucially, we also have a means of assessing the behaviour of other approximate approaches which all aim to find a maximum likelihood solution. Our approach hence allows us to properly address the question of whether a reasonably high likelihood solution that is easy to obtain is practically as
A notion of graph likelihood and an infinite monkey theorem
NASA Astrophysics Data System (ADS)
Banerji, Christopher R. S.; Mansour, Toufik; Severini, Simone
2014-01-01
We play with a graph-theoretic analogue of the folklore infinite monkey theorem. We define a notion of graph likelihood as the probability that a given graph is constructed by a monkey in a number of time steps equal to the number of vertices. We present an algorithm to compute this graph invariant and closed formulas for some infinite classes. We leave the computational complexity of the likelihood as an open problem.
COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS
Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.; Næss, S. K.; Seljebotn, D. S.; Górski, K. M.; Huey, G.; Jewell, J. B.; Rocha, G.; Wehus, I. K.
2013-11-10
We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations in the transition region on cosmological parameters is negligible for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
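The core identity, that a joint distribution with only nearest-neighbor conditional dependence factorizes into uni- and bivariate marginals, can be checked numerically on a toy two-state Markov chain (probabilities below are arbitrary illustrative choices):

```python
from itertools import product

# Two-state toy Markov chain.
p0 = {0: 0.6, 1: 0.4}                            # initial distribution
T = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # transition probabilities

def joint(seq):
    """Exact joint probability from the chain rule."""
    p = p0[seq[0]]
    for a, b in zip(seq, seq[1:]):
        p *= T[a][b]
    return p

def marg1(i, v, n):
    """Univariate marginal P(X_i = v) for a length-n chain."""
    return sum(joint(s) for s in product((0, 1), repeat=n) if s[i] == v)

def marg2(i, v, w, n):
    """Bivariate marginal P(X_i = v, X_{i+1} = w)."""
    return sum(joint(s) for s in product((0, 1), repeat=n)
               if s[i] == v and s[i + 1] == w)

def joint_from_marginals(seq):
    """Banded identity: the full joint equals the product of consecutive
    bivariate marginals divided by the interior univariate marginals."""
    n = len(seq)
    num = 1.0
    for i in range(n - 1):
        num *= marg2(i, seq[i], seq[i + 1], n)
    den = 1.0
    for i in range(1, n - 1):
        den *= marg1(i, seq[i], n)
    return num / den
```

The identity P(x_1,...,x_n) = Π P(x_i, x_{i+1}) / Π P(x_i) (interior i) follows by rewriting each conditional P(x_{i+1}|x_i) as P(x_i, x_{i+1})/P(x_i) in the chain rule.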
Recognizing Patterns In Log-Polar Coordinates
NASA Technical Reports Server (NTRS)
Weiman, Carl F. R.
1992-01-01
Log-Hough transform is basis of improved method for recognition of patterns - particularly, straight lines - in noisy images. Takes advantage of rotational and scale invariance of mapping from Cartesian to log-polar coordinates, and offers economy of representation and computation. Unification of iconic and Hough domains simplifies computations in recognition and eliminates erroneous quantization of slopes attributable to finite spacing of Cartesian coordinate grid of classical Hough transform. Equally efficient recognizing curves. Log-Hough transform more amenable to massively parallel computing architectures than traditional Cartesian Hough transform. "In-place" nature makes it possible to apply local pixel-neighborhood processing.
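The rotational and scale invariance mentioned above comes directly from the coordinate map: scaling a point multiplies r and so shifts ln r, while rotation shifts θ. A minimal sketch:

```python
import math

def to_log_polar(x, y):
    """Map Cartesian (x, y) to log-polar coordinates (u, v) = (ln r, theta)."""
    return math.log(math.hypot(x, y)), math.atan2(y, x)

# Scaling by s becomes a pure shift of u by ln(s); rotating by phi becomes
# a pure shift of v by phi -- so both operations are translations in the
# (u, v) plane, which is what the log-Hough transform exploits.
u1, v1 = to_log_polar(3.0, 4.0)
u2, v2 = to_log_polar(6.0, 8.0)                   # the same point, scaled by 2
phi = 0.5
xr = 3.0 * math.cos(phi) - 4.0 * math.sin(phi)    # the same point, rotated
yr = 3.0 * math.sin(phi) + 4.0 * math.cos(phi)
u3, v3 = to_log_polar(xr, yr)
```

(The sketch ignores angle wrap-around at ±π, which a real implementation must handle.)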
Nonblocking and orphan free message logging protocols
NASA Technical Reports Server (NTRS)
Alvisi, Lorenzo; Hoppe, Bruce; Marzullo, Keith
1992-01-01
Existing message logging protocols demonstrate a classic pessimistic vs. optimistic tradeoff. We show that the optimistic-pessimistic tradeoff is not inherent to the problem of message logging. We construct a message-logging protocol that has the positive features of both optimistic and pessimistic protocols: our protocol prevents orphans and allows simple failure recovery, yet it requires no blocking in failure-free runs. Furthermore, this protocol does not introduce any additional message overhead as compared to one implemented for a system in which messages may be lost but processes do not crash.
Logged On for Learning. An "Education Week" Special Report.
ERIC Educational Resources Information Center
West, Peter
1995-01-01
This document and accompanying disk contain all of the articles from an "Education Week" special report examining the complex phenomenon commonly known as the "information highway" and how it affects the nation's schools. The articles, all by Peter West, include: (1) "Logged On for Learning," an overview of the report; (2) "Wired for the Future,"…
14 CFR 60.20 - Logging FSTD discrepancies.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 2 2011-01-01 2011-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...
14 CFR 60.20 - Logging FSTD discrepancies.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...
63. View looking north. A detail which shows the log ...
63. View looking north. A detail which shows the log crib support mortised into the rear (north) wall of the 'outrigger'. - Wabash & Erie Canal, Lock No. 2, 8 miles east of Fort Wayne, adjacent to U.S. Route 24, New Haven, Allen County, IN
14 CFR 60.20 - Logging FSTD discrepancies.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 2 2014-01-01 2014-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...
14 CFR 60.20 - Logging FSTD discrepancies.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 2 2012-01-01 2012-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...
14 CFR 60.20 - Logging FSTD discrepancies.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 2 2013-01-01 2013-01-01 false Logging FSTD discrepancies. 60.20 Section 60.20 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) AIRMEN FLIGHT SIMULATION TRAINING DEVICE INITIAL AND CONTINUING QUALIFICATION AND USE § 60.20...
NASA Technical Reports Server (NTRS)
Scholz, D.; Fuhs, N.; Hixson, M.
1979-01-01
The overall objective of this study was to apply and evaluate several of the currently available classification schemes for crop identification. The approaches examined were: (1) a per point Gaussian maximum likelihood classifier, (2) a per point sum of normal densities classifier, (3) a per point linear classifier, (4) a per point Gaussian maximum likelihood decision tree classifier, and (5) a texture sensitive per field Gaussian maximum likelihood classifier. Three agricultural data sets were used in the study: areas from Fayette County, Illinois, and Pottawattamie and Shelby Counties in Iowa. The segments were located in two distinct regions of the Corn Belt to sample variability in soils, climate, and agricultural practices.
A NEW LOG EVALUATION METHOD TO APPRAISE MESAVERDE RE-COMPLETION OPPORTUNITIES
Albert Greer
2002-09-11
This report covers the information presented at the technical progress review. The following topics were discussed: (1) Structure of the study area dataset. (2) A visual illustration of the typical logs. (3) Crossplots developed and evaluated for patterns that might bear a relationship to initial producing rate. (4) Development of fuzzy curves to rank the goodness of the statistical properties of the logs used to construct the crossplots. (5) Neural network correlations developed with statistical properties of the logs.
MAIL LOG, program theory, volume 1. [Scout project automatic data system
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The program theory used to obtain the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, is described. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG data base consists of three main subfiles: (1) incoming and outgoing mail correspondence; (2) design information releases and reports; and (3) drawings and engineering orders. All subroutine descriptions, flowcharts, and MAIL LOG outputs are given and the data base design is described.
Finding Quantitative Trait Loci Genes with Collaborative Targeted Maximum Likelihood Learning.
Wang, Hui; Rose, Sherri; van der Laan, Mark J
2011-07-01
Quantitative trait loci mapping is focused on identifying the positions and effects of genes underlying an observed trait. We present a collaborative targeted maximum likelihood estimator in a semi-parametric model, using a newly proposed 2-part super learning algorithm, to find quantitative trait loci genes in listeria data. Results are compared to the parametric composite interval mapping approach. PMID:21572586
Finding Quantitative Trait Loci Genes with Collaborative Targeted Maximum Likelihood Learning
Wang, Hui; Rose, Sherri; van der Laan, Mark J.
2010-01-01
Quantitative trait loci mapping is focused on identifying the positions and effects of genes underlying an observed trait. We present a collaborative targeted maximum likelihood estimator in a semi-parametric model, using a newly proposed 2-part super learning algorithm, to find quantitative trait loci genes in listeria data. Results are compared to the parametric composite interval mapping approach. PMID:21572586
Comparison of IRT Likelihood Ratio Test and Logistic Regression DIF Detection Procedures
ERIC Educational Resources Information Center
Atar, Burcu; Kamata, Akihito
2011-01-01
The Type I error rates and the power of IRT likelihood ratio test and cumulative logit ordinal logistic regression procedures in detecting differential item functioning (DIF) for polytomously scored items were investigated in this Monte Carlo simulation study. For this purpose, 54 simulation conditions (combinations of 3 sample sizes, 2 sample…
Out-of-atlas likelihood estimation using multi-atlas segmentation
Asman, Andrew J.; Chambless, Lola B.; Thompson, Reid C.; Landman, Bennett A.
2013-01-01
Purpose: Multi-atlas segmentation has been shown to be highly robust and accurate across an extraordinary range of potential applications. However, it is limited to the segmentation of structures that are anatomically consistent across a large population of potential target subjects (i.e., multi-atlas segmentation is limited to “in-atlas” applications). Herein, the authors propose a technique to determine the likelihood that a multi-atlas segmentation estimate is representative of the problem at hand, and, therefore, identify anomalous regions that are not well represented within the atlases. Methods: The authors derive a technique to estimate the out-of-atlas (OOA) likelihood for every voxel in the target image. These estimated likelihoods can be used to determine and localize the probability of an abnormality being present on the target image. Results: Using a collection of manually labeled whole-brain datasets, the authors demonstrate the efficacy of the proposed framework on two distinct applications. First, the authors demonstrate the ability to accurately and robustly detect malignant gliomas in the human brain—an aggressive class of central nervous system neoplasms. Second, the authors demonstrate how this OOA likelihood estimation process can be used within a quality control context for diffusion tensor imaging datasets to detect large-scale imaging artifacts (e.g., aliasing and image shading). Conclusions: The proposed OOA likelihood estimation framework shows great promise for robust and rapid identification of brain abnormalities and imaging artifacts using only weak dependencies on anomaly morphometry and appearance. The authors envision that this approach would allow for application-specific algorithms to focus directly on regions of high OOA likelihood, which would (1) reduce the need for human intervention, and (2) reduce the propensity for false positives. Using the dual perspective, this technique would allow for algorithms to focus on
Analysis of neighborhood dynamics of forest ecosystems using likelihood methods and modeling.
Canham, Charles D; Uriarte, María
2006-02-01
Advances in computing power in the past 20 years have led to a proliferation of spatially explicit, individual-based models of population and ecosystem dynamics. In forest ecosystems, the individual-based models encapsulate an emerging theory of "neighborhood" dynamics, in which fine-scale spatial interactions regulate the demography of component tree species. The spatial distribution of component species, in turn, regulates spatial variation in a whole host of community and ecosystem properties, with subsequent feedbacks on component species. The development of these models has been facilitated by development of new methods of analysis of field data, in which critical demographic rates and ecosystem processes are analyzed in terms of the spatial distributions of neighboring trees and physical environmental factors. The analyses are based on likelihood methods and information theory, and they allow a tight linkage between the models and explicit parameterization of the models from field data. Maximum likelihood methods have a long history of use for point and interval estimation in statistics. In contrast, likelihood principles have only more gradually emerged in ecology as the foundation for an alternative to traditional hypothesis testing. The alternative framework stresses the process of identifying and selecting among competing models, or in the simplest case, among competing point estimates of a parameter of a model. There are four general steps involved in a likelihood analysis: (1) model specification, (2) parameter estimation using maximum likelihood methods, (3) model comparison, and (4) model evaluation. Our goal in this paper is to review recent developments in the use of likelihood methods and modeling for the analysis of neighborhood processes in forest ecosystems. We will focus on a single class of processes, seed dispersal and seedling dispersion, because recent papers provide compelling evidence of the potential power of the approach, and illustrate
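The four-step likelihood workflow described above (model specification, maximum likelihood estimation, model comparison, model evaluation) can be sketched end to end on toy seed-count data. The numbers below are hypothetical, and a crude grid search stands in for a real optimizer:

```python
import math

# Step 1 -- model specification: seed counts at increasing distances from a
# parent tree (hypothetical data), modeled as Poisson with mean either
# constant or decaying exponentially with distance, a * exp(-b * d).
dist = [1, 2, 4, 8, 16]
seeds = [40, 22, 12, 4, 1]

def poisson_loglik(counts, means):
    return sum(k * math.log(m) - m - math.lgamma(k + 1)
               for k, m in zip(counts, means))

# Step 2 -- parameter estimation by maximum likelihood (grid search).
best = None
for a in range(5, 80):
    for b100 in range(1, 100):
        b = b100 / 100.0
        ll = poisson_loglik(seeds, [a * math.exp(-b * d) for d in dist])
        if best is None or ll > best[0]:
            best = (ll, a, b)
ll_decay, a_hat, b_hat = best
ll_const = poisson_loglik(seeds, [sum(seeds) / len(seeds)] * len(seeds))

# Step 3 -- model comparison: AIC = 2k - 2*loglik, lower is better.
aic_decay = 2 * 2 - 2 * ll_decay
aic_const = 2 * 1 - 2 * ll_const

# Step 4 -- model evaluation would examine residuals and goodness of fit.
```

For these strongly distance-decaying counts the exponential kernel should be preferred despite its extra parameter, illustrating the model-selection framing the abstract describes.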
Coal-log pipeline system development. [Contains bibliography
Not Available
1991-01-01
One of the four tasks of the contract awarded by the Energy-Related Inventions Program, US Department of Energy, is to test and demonstrate a small model of the injection system for coal logs. Because the injection system works in a coordinated manner with the pumping and ejection systems, it is difficult to demonstrate them separately. Consequently, it was decided to broaden the scope of this task to build and test an entire system of CLP (Coal Log Pipeline), including the injection system (intake), the pumping system, and the ejection system. Such a demo unit will also allow us to test a complete system of CLP, a task that must be done sooner or later. Additional costs for constructing this demonstration CLP system are provided by the CLP Consortium which, at present, consists of nine companies. One purpose of this test program is to determine the optimal conditions necessary for the fabrication of binderless coal logs. An earlier literature review indicated that the moisture content, temperature, compaction pressure, and maximum coal particle size have the greatest influence on the quality of compacted coal logs. Consequently, the test program described in last quarter's report was designed to investigate these variables. In the coal log extrusion experiment, Powder River Basin coal, dried and with 5 wt % asphalt, was extruded from 1.6 in., 1.7 in., and 2 in. diameter dies continuously and at various speeds. The extrusion temperature was 300 °F. Coal log formation was still not satisfactory, but extrusion improved as die diameter was decreased. This achievement now allows conditions of temperature, asphalt concentration, speed, and die conditions to be explored in future experiments. The literature review in this progress report update focuses on reports published by the International Briquetting Association during 1953 to 1961. 22 refs., 13 figs., 1 tab.
Maximum likelihood estimation for cytogenetic dose-response curves
Frome, E.L; DuFrain, R.J.
1983-10-01
In vitro dose-response curves are used to describe the relation between the yield of dicentric chromosome aberrations and radiation dose for human lymphocytes. The dicentric yields follow the Poisson distribution, and the expected yield depends on both the magnitude and the temporal distribution of the dose for low-LET radiation. A general dose-response model that describes this relation has been obtained by Kellerer and Rossi using the theory of dual radiation action. The yield of elementary lesions is κ(γd + g(t, τ)d²), where t is time and d is dose. The coefficient of the d² term is determined by the recovery function and the temporal mode of irradiation. Two special cases of practical interest are split-dose and continuous-exposure experiments, and the resulting models are intrinsically nonlinear in the parameters. A general-purpose maximum likelihood estimation procedure is described and illustrated with numerical examples from both experimental designs. Poisson regression analysis is used for estimation, hypothesis testing, and regression diagnostics. Results are discussed in the context of exposure assessment procedures for both acute and chronic human radiation exposure.
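A minimal sketch of the Poisson maximum likelihood fit for the acute-exposure special case, writing the expected yield per cell generically as λ(d) = αd + βd². The counts below are hypothetical, and a grid search stands in for the Newton-type iterations a real Poisson regression would use:

```python
import math

# Hypothetical acute-exposure data: dose (Gy), cells scored, dicentrics seen.
dose = [0.5, 1.0, 2.0, 3.0, 4.0]
cells = [1000, 1000, 500, 500, 250]
dics = [20, 60, 90, 165, 130]

def loglik(alpha, beta):
    """Poisson log-likelihood for the linear-quadratic yield
    lambda(d) = alpha*d + beta*d^2 dicentrics per cell."""
    ll = 0.0
    for d, n, y in zip(dose, cells, dics):
        mu = n * (alpha * d + beta * d * d)  # expected total dicentrics
        ll += y * math.log(mu) - mu - math.lgamma(y + 1)
    return ll

# Crude grid-search MLE over alpha, beta in (0, 0.1].
ll_hat, alpha_hat, beta_hat = max(
    (loglik(a / 1000.0, b / 1000.0), a / 1000.0, b / 1000.0)
    for a in range(1, 101) for b in range(1, 101))
```

The same log-likelihood, with the yield rewritten for split-dose or continuous exposure via the recovery function g(t, τ), is what the general estimation procedure in the abstract maximizes.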
Fast WMAP Likelihood Code and GSR PC Functions
NASA Astrophysics Data System (ADS)
Dvorkin, Cora; Hu, Wayne
2010-10-01
We place functional constraints on the shape of the inflaton potential from the cosmic microwave background through a variant of the generalized slow roll approximation that allows large-amplitude, rapidly changing deviations from scale-free conditions. Employing a principal component decomposition of the source function G' ~ 3(V'/V)^2 - 2V''/V and keeping only those components measured to better than 10% results in 5 nearly independent Gaussian constraints that may be used to test any single-field inflationary model where such deviations are expected. The first component implies < 3% variations at the 100 Mpc scale. One component shows a 95% CL preference for deviations around the 300 Mpc scale at the ~10% level, but the global significance is reduced considering the 5 components examined. This deviation also requires a change in the cold dark matter density, which in a flat ΛCDM model is disfavored by current supernova and Hubble constant data and can be tested with future polarization or high-multipole temperature data. Its impact resembles a local running of the tilt from multipoles 30-800 but is only marginally consistent with a constant running beyond this range. For this analysis, we have implemented a ~40x faster WMAP7 likelihood method which we have made publicly available.
Evaluation of historical dry well surveillance logs
Price, R.K.
1996-09-09
Several dry well surveillance logs from 1975 through 1995 for the SX Tank Farm have been examined to identify potential subsurface zones of radioactive contaminant migration. Several dynamic conditions of the gamma-ray-emitting radioactive contaminants have been identified.
10 CFR 34.71 - Utilization logs.
Code of Federal Regulations, 2014 CFR
2014-01-01
... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain... make, model, and serial number of the radiographic exposure device or transport or storage container...
10 CFR 34.71 - Utilization logs.
Code of Federal Regulations, 2012 CFR
2012-01-01
... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain... make, model, and serial number of the radiographic exposure device or transport or storage container...
10 CFR 34.71 - Utilization logs.
Code of Federal Regulations, 2011 CFR
2011-01-01
... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain... make, model, and serial number of the radiographic exposure device or transport or storage container...
10 CFR 34.71 - Utilization logs.
Code of Federal Regulations, 2013 CFR
2013-01-01
... RADIOGRAPHIC OPERATIONS Recordkeeping Requirements § 34.71 Utilization logs. (a) Each licensee shall maintain... make, model, and serial number of the radiographic exposure device or transport or storage container...
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-11-13
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
Logging-while-coring method and apparatus
Goldberg, David S.; Myers, Gregory J.
2007-01-30
A method and apparatus for downhole coring while receiving logging-while-drilling tool data. The apparatus includes a core collar and a retrievable core barrel. The retrievable core barrel receives core from a borehole, which is sent to the surface for analysis via wireline and latching tool. The core collar includes logging-while-drilling tools for the simultaneous measurement of formation properties during the core excavation process. Examples of logging-while-drilling tools include nuclear sensors, resistivity sensors, gamma ray sensors, and bit resistivity sensors. The disclosed method allows for precise core-log depth calibration and core orientation within a single borehole, and without a pipe trip, providing both time savings and unique scientific advantages.
CMLOG: A common message logging system
Chen, J.; Akers, W.; Bickley, M.; Wu, D.; Watson, W. III
1997-12-01
The Common Message Logging (CMLOG) system is an object-oriented and distributed system that not only allows applications and systems to log data (messages) of any type into a centralized database but also lets applications view incoming messages in real time or retrieve stored data from the database according to selection rules. It consists of a concurrent Unix server that handles incoming logging or search messages, a Motif browser that can view incoming messages in real time or display stored data from the database, a client daemon that buffers and sends logging messages to the server, and libraries that can be used by applications to send data to or retrieve data from the database via the server. This paper presents the design and implementation of the CMLOG system and addresses the integration of CMLOG into existing control systems.
Sisyphus - An Event Log Analysis Toolset
Jon Stearley, Glenn Laguna
2004-09-01
Event logs are a ubiquitous source of system feedback from computer systems, but they vary widely in format and can be extremely numerous, particularly from systems with many logging components. Inspection of these logs is fundamental to system debugging; increased capability to quickly extract meaningful information will impact MTTR (mean time to repair) and may impact MTBF (mean time between failures). Sisyphus is a machine-learning analysis system whose goal is to enable content-novice analysts to efficiently understand evolving trends, identify anomalies, and investigate cause-effect hypotheses in large multiple-source log sets. The toolkit comprises a framework for utilizing the third-party frequent-itemset data mining tools Teiresias and SLCT, software to cluster messages according to time statistics, and an interactive results viewer.
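The frequent-itemset message clustering the abstract describes can be sketched roughly as follows. This is an illustrative simplification of the SLCT-style idea (tokens frequent at a given position form a template; rare tokens become wildcards), not the actual Sisyphus or SLCT implementation; the function and parameter names are invented here.

```python
from collections import Counter

def cluster_log_messages(messages, min_support=2):
    """SLCT-style sketch: a token kept at a position must occur in at least
    `min_support` messages at that position; rarer tokens are replaced by
    the wildcard '*', so messages sharing a template fall into one cluster."""
    counts = Counter()
    split = [m.split() for m in messages]
    for tokens in split:
        for pos, tok in enumerate(tokens):
            counts[(pos, tok)] += 1
    clusters = Counter()
    for tokens in split:
        template = tuple(
            tok if counts[(pos, tok)] >= min_support else "*"
            for pos, tok in enumerate(tokens)
        )
        clusters[template] += 1
    return clusters
```

Messages such as "disk sda failed" and "disk sdb failed" collapse into the single template `('disk', '*', 'failed')`, which is the kind of content reduction that lets a novice analyst scan a large log set.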
Expansion of industrial logging in Central Africa.
Laporte, Nadine T; Stabach, Jared A; Grosch, Robert; Lin, Tiffany S; Goetz, Scott J
2007-06-01
Industrial logging has become the most extensive land use in Central Africa, with more than 600,000 square kilometers (30%) of forest currently under concession. With use of a time series of satellite imagery for the period from 1976 to 2003, we measured 51,916 kilometers of new logging roads. The density of roads across the forested region was 0.03 kilometer per square kilometer, but areas of Gabon and Equatorial Guinea had values over 0.09 kilometer per square kilometer. A new frontier of logging expansion was identified within the Democratic Republic of Congo, which contains 63% of the remaining forest of the region. Tree felling and skid trails increased disturbance in selectively logged areas. PMID:17556578
Optimal message log reclamation for uncoordinated checkpointing
NASA Technical Reports Server (NTRS)
Wang, Yi-Min; Fuchs, W. K.
1994-01-01
Uncoordinated checkpointing for message-passing systems allows maximum process autonomy and general nondeterministic execution, but suffers from the potential domino effect and the large space overhead of maintaining checkpoints and message logs. Traditionally, it has been assumed that only obsolete checkpoints and message logs before the global recovery line can be garbage-collected. Recently, an approach to identifying all garbage checkpoints based on recovery line transformation and decomposition has been developed. We show in this paper that the same approach can be applied to the problem of identifying all garbage message logs for systems requiring message logging to record in-transit messages. Communication trace-driven simulation for several parallel programs is used to evaluate the proposed algorithm.
Conversation Threads Hidden within Email Server Logs
NASA Astrophysics Data System (ADS)
Palus, Sebastian; Kazienko, Przemysław
Email server logs contain records of all email exchanged through the server. Often we would like to analyze those emails not separately but in conversation threads, especially when we need to analyze a social network extracted from email logs. Unfortunately, each mail occupies a different record, and those records are not tied to each other in any obvious way. In this paper a method for extracting discussion threads is proposed, together with experiments on two different data sets - Enron and WrUT.
32 CFR 700.845 - Maintenance of logs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... bell book shall be maintained as an adjunct to the engineering log. (c) The Chief of Naval Operations shall prescribe regulations governing the contents and preparation of the deck and engineering logs...
29 CFR 42.7 - Complaint/directed action logs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 1 2010-07-01 2010-07-01 true Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...
29 CFR 42.7 - Complaint/directed action logs.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 29 Labor 1 2011-07-01 2011-07-01 false Complaint/directed action logs. 42.7 Section 42.7 Labor Office of the Secretary of Labor COORDINATED ENFORCEMENT § 42.7 Complaint/directed action logs. (a) To... operation of a system of coordinated Complaint/Directed Action Logs (logs). The logs shall be maintained...
Bayesian inference with adaptive fuzzy priors and likelihoods.
Osoba, Osonde; Mitaim, Sanya; Kosko, Bart
2011-10-01
Fuzzy rule-based systems can approximate prior and likelihood probabilities in Bayesian inference and thereby approximate posterior probabilities. This fuzzy approximation technique allows users to apply a much wider and more flexible range of prior and likelihood probability density functions than found in most Bayesian inference schemes. The technique does not restrict the user to the few known closed-form conjugacy relations between the prior and likelihood. It allows the user in many cases to describe the densities with words, and just two rules can absorb any bounded closed-form probability density directly into the rulebase. Learning algorithms can tune the expert rules as well as grow them from sample data. The learning laws and fuzzy approximators have a tractable form because of the convex-sum structure of additive fuzzy systems. This convex-sum structure carries over to the fuzzy posterior approximator. We prove a uniform approximation theorem for Bayesian posteriors: An additive fuzzy posterior uniformly approximates the posterior probability density if the prior or likelihood densities are continuous and bounded and if separate additive fuzzy systems approximate the prior and likelihood densities. Simulations demonstrate this fuzzy approximation of priors and posteriors for the three most common conjugate priors (as when a beta prior combines with a binomial likelihood to give a beta posterior). Adaptive fuzzy systems can also approximate non-conjugate priors and likelihoods as well as approximate hyperpriors in hierarchical Bayesian inference. The number of fuzzy rules can grow exponentially in iterative Bayesian inference if the previous posterior approximator becomes the new prior approximator. PMID:21478078
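The beta-binomial conjugacy the abstract cites as its benchmark case can be written down directly; the adaptive fuzzy approximators themselves are not reproduced in this minimal sketch.

```python
def beta_binomial_posterior(a, b, k, n):
    """Conjugate Bayesian update: a Beta(a, b) prior combined with a
    binomial likelihood (k successes in n trials) yields a
    Beta(a + k, b + n - k) posterior."""
    return a + k, b + n - k

# Beta(2, 2) prior, 7 successes in 10 trials -> Beta(9, 5) posterior
a_post, b_post = beta_binomial_posterior(2.0, 2.0, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # 9/14
```

This closed-form update is what the fuzzy posterior approximator is tested against in the simulations: an additive fuzzy system that separately approximates the beta prior and binomial likelihood should recover a close approximation of this beta posterior.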
Carbon emissions performance of commercial logging in East Kalimantan, Indonesia.
Griscom, Bronson; Ellis, Peter; Putz, Francis E
2014-03-01
Adoption of reduced-impact logging (RIL) methods could reduce CO2 emissions by 30-50% across at least 20% of remaining tropical forests. We developed two cost effective and robust indices for comparing the climate benefits (reduced CO2 emissions) due to RIL. The indices correct for variability in the volume of commercial timber among concessions. We determined that a correction for variability in terrain slope was not needed. We found that concessions certified by the Forest Stewardship Council (FSC, N = 3), when compared with noncertified concessions (N = 6), did not have lower overall CO2 emissions from logging activity (felling, skidding, and hauling). On the other hand, FSC certified concessions did have lower emissions from one type of logging impact (skidding), and we found evidence of a range of improved practices using other field metrics. One explanation of these results may be that FSC criteria and indicators, and associated RIL practices, were not designed to achieve overall emissions reductions. Also, commonly used field metrics are not reliable proxies for overall logging emissions performance. Furthermore, the simple distinction between certified and noncertified concessions does not fully represent the complex history of investments in improved logging practices. To clarify the relationship between RIL and emissions reductions, we propose the more explicit term 'RIL-C' to refer to the subset of RIL practices that can be defined by quantified thresholds and that result in measurable emissions reductions. If tropical forest certification is to be linked with CO2 emissions reductions, certification standards need to explicitly require RIL-C practices. PMID:24022913
Race of source effects in the elaboration likelihood model.
White, P H; Harkins, S G
1994-11-01
In a series of experiments, we investigated the effect of race of source on persuasive communications in the Elaboration Likelihood Model (R.E. Petty & J.T. Cacioppo, 1981, 1986). In Experiment 1, we found no evidence that White participants responded to a Black source as a simple negative cue. Experiment 2 suggested the possibility that exposure to a Black source led to low-involvement message processing. In Experiments 3 and 4, a distraction paradigm was used to test this possibility, and it was found that participants under low involvement were highly motivated to process a message presented by a Black source. In Experiment 5, we found that attitudes toward the source's ethnic group, rather than violations of expectancies, accounted for this processing effect. Taken together, the results of these experiments are consistent with S.L. Gaertner and J.F. Dovidio's (1986) theory of aversive racism, which suggests that Whites, because of a combination of egalitarian values and underlying negative racial attitudes, are very concerned about not appearing unfavorable toward Blacks, leading them to be highly motivated to process messages presented by a source from this group. PMID:7983579
Likelihood of nitrogen condensation in Titan's present-day atmosphere
NASA Astrophysics Data System (ADS)
Tokano, Tetsuya
2016-04-01
The temperature in Titan's upper troposphere measured by the Huygens Probe is relatively close to the nitrogen condensation point. This poses the question as to whether seasonal nitrogen condensation might occur on Titan analogously to seasonal carbon dioxide condensation on Mars. The likelihood of nitrogen condensation in Titan's atmosphere is investigated using tropospheric temperature data obtained by Cassini radio occultations and simulation with a general circulation model (GCM). The observed tropospheric temperature generally decreases towards both poles but does not reach the nitrogen condensation point anywhere. However, Cassini may not have sounded the coldest season and area in Titan's troposphere. The GCM simulation shows that in the upper troposphere the variable solar distance associated with Saturn's orbital eccentricity has a larger impact on the seasonal polar temperature variation than the variable solar declination associated with Saturn's obliquity. In the upper troposphere relevant for nitrogen condensation the annual minimum polar temperature is predicted to occur around the northern autumnal equinox, approximately one season after aphelion. This temperature is then 1-2 K lower than in the season of the Cassini/Huygens mission. It is possible if not certain that some nitrogen condensation with cloud formation occurs in the northern and southern polar region in the upper troposphere around the northern autumnal equinox. Under the present orbital parameters of Saturn and Titan nitrogen condensation may occur more frequently near the south pole than near the north pole.
Maximum likelihood estimation of shear wave speed in transient elastography.
Audière, Stéphane; Angelini, Elsa D; Sandrin, Laurent; Charbit, Maurice
2014-06-01
Ultrasonic transient elastography (TE) enables assessment, under active mechanical constraints, of the elasticity of the liver, which correlates with hepatic fibrosis stages. This technique is routinely used in clinical practice to noninvasively assess liver stiffness. The Fibroscan system used in this work generates a shear wave via an impulse stress applied on the surface of the skin and records a temporal series of radio-frequency (RF) lines using a single-element ultrasound probe. A shear wave propagation map (SWPM) is generated as a 2-D map of the displacements along depth and time, derived from the correlations of the sequential 1-D RF lines, assuming that the direction of propagation (DOP) of the shear wave coincides with the ultrasound beam axis (UBA). Under the assumption of purely elastic tissue, elasticity is proportional to the shear wave speed. This paper introduces a novel approach to the processing of the SWPM, deriving the maximum likelihood estimate of the shear wave speed by comparing the observed displacements with the estimates provided by the Green's functions. A simple parametric model is used to fit Green's theoretical values to the noisy measures provided by the SWPM, taking into account depth-varying attenuation and time delay. The proposed method was evaluated on numerical simulations using a finite-element-method simulator and on physical phantoms. Evaluation on this test database reported very high agreement of shear wave speed measures when the DOP and UBA coincide. PMID:24835213
Constraint likelihood analysis for a network of gravitational wave detectors
Klimenko, S.; Rakhmanov, M.; Mitselmakher, G.; Mohanty, S.
2005-12-15
We propose a coherent method for detection and reconstruction of gravitational wave signals with a network of interferometric detectors. The method is derived by using the likelihood ratio functional for unknown signal waveforms. In the likelihood analysis, the global maximum of the likelihood ratio over the space of waveforms is used as the detection statistic. We identify a problem with this approach. In the case of an aligned pair of detectors, the detection statistic depends on the cross correlation between the detectors as expected, but this dependence disappears even for infinitesimally small misalignments. We solve the problem by applying constraints on the likelihood functional and obtain a new class of statistics. The resulting method can be applied to data from a network consisting of any number of detectors with arbitrary detector orientations. The method allows us to reconstruct the source coordinates and the waveforms of the two polarization components of a gravitational wave. We study the performance of the method with numerical simulations and find the reconstruction of the source coordinates to be more accurate than in the standard likelihood method.
Efficient Strategies for Calculating Blockwise Likelihoods Under the Coalescent
Lohse, Konrad; Chmelik, Martin; Martin, Simon H.; Barton, Nicholas H.
2016-01-01
The inference of demographic history from genome data is hindered by a lack of efficient computational approaches. In particular, it has proved difficult to exploit the information contained in the distribution of genealogies across the genome. We have previously shown that the generating function (GF) of genealogies can be used to analytically compute likelihoods of demographic models from configurations of mutations in short sequence blocks (Lohse et al. 2011). Although the GF has a simple, recursive form, the size of such likelihood calculations explodes quickly with the number of individuals and applications of this framework have so far been mainly limited to small samples (pairs and triplets) for which the GF can be written by hand. Here we investigate several strategies for exploiting the inherent symmetries of the coalescent. In particular, we show that the GF of genealogies can be decomposed into a set of equivalence classes that allows likelihood calculations from nontrivial samples. Using this strategy, we automated blockwise likelihood calculations for a general set of demographic scenarios in Mathematica. These histories may involve population size changes, continuous migration, discrete divergence, and admixture between multiple populations. To give a concrete example, we calculate the likelihood for a model of isolation with migration (IM), assuming two diploid samples without phase and outgroup information. We demonstrate the new inference scheme with an analysis of two individual butterfly genomes from the sister species Heliconius melpomene rosina and H. cydno. PMID:26715666
Efficient Strategies for Calculating Blockwise Likelihoods Under the Coalescent.
Lohse, Konrad; Chmelik, Martin; Martin, Simon H; Barton, Nicholas H
2016-02-01
The inference of demographic history from genome data is hindered by a lack of efficient computational approaches. In particular, it has proved difficult to exploit the information contained in the distribution of genealogies across the genome. We have previously shown that the generating function (GF) of genealogies can be used to analytically compute likelihoods of demographic models from configurations of mutations in short sequence blocks (Lohse et al. 2011). Although the GF has a simple, recursive form, the size of such likelihood calculations explodes quickly with the number of individuals and applications of this framework have so far been mainly limited to small samples (pairs and triplets) for which the GF can be written by hand. Here we investigate several strategies for exploiting the inherent symmetries of the coalescent. In particular, we show that the GF of genealogies can be decomposed into a set of equivalence classes that allows likelihood calculations from nontrivial samples. Using this strategy, we automated blockwise likelihood calculations for a general set of demographic scenarios in Mathematica. These histories may involve population size changes, continuous migration, discrete divergence, and admixture between multiple populations. To give a concrete example, we calculate the likelihood for a model of isolation with migration (IM), assuming two diploid samples without phase and outgroup information. We demonstrate the new inference scheme with an analysis of two individual butterfly genomes from the sister species Heliconius melpomene rosina and H. cydno. PMID:26715666
MARGINAL EMPIRICAL LIKELIHOOD AND SURE INDEPENDENCE FEATURE SCREENING
Chang, Jinyuan; Tang, Cheng Yong; Wu, Yichao
2013-01-01
We study a marginal empirical likelihood approach in scenarios when the number of variables grows exponentially with the sample size. The marginal empirical likelihood ratios as functions of the parameters of interest are systematically examined, and we find that the marginal empirical likelihood ratio evaluated at zero can be used to differentiate whether an explanatory variable is contributing to a response variable or not. Based on this finding, we propose a unified feature screening procedure for linear models and the generalized linear models. Different from most existing feature screening approaches that rely on the magnitudes of some marginal estimators to identify true signals, the proposed screening approach is capable of further incorporating the level of uncertainties of such estimators. Such a merit inherits the self-studentization property of the empirical likelihood approach, and extends the insights of existing feature screening methods. Moreover, we show that our screening approach is less restrictive to distributional assumptions, and can be conveniently adapted to be applied in a broad range of scenarios such as models specified using general moment conditions. Our theoretical results and extensive numerical examples by simulations and data analysis demonstrate the merits of the marginal empirical likelihood approach. PMID:24415808
On-site assessment of rock discontinuities from resistivity logs. T-L log: A new logging technique
NASA Astrophysics Data System (ADS)
Tselentis, Gerasimos-Akis
1986-03-01
The direct on-site assessment of the vertical distribution of discontinuities in rock masses is very important since it can give a first estimate of the hydraulic properties of the strata and has many practical applications, such as groundwater resources investigations, radioactive and toxic waste disposal, dam foundation site investigations, etc. In the present work, the effect that fractures have upon some geophysical parameters which can easily be determined from the analysis of conventional normal resistivity logs is examined, and a new technique for the on-site processing of resistivity logging data is introduced. Using a microcomputer in series with the logging unit, a zonation process was applied to the logs, which were interpreted in terms of a series of beds, each having a specific thickness and resistivity, and a new parameter defined by the difference between transverse and longitudinal resistivities was computed (T-L log). In almost all the cases in which the method was applied, the obtained results were satisfactory, and the microcomputer-based software and hardware package that was developed for the automatic processing of the data proved to be very efficient.
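The T-L parameter defined above can be sketched from the standard layered-medium formulas: transverse resistivity is the thickness-weighted arithmetic mean of bed resistivities, longitudinal resistivity the thickness-weighted harmonic mean. This is an illustrative reading of the quantity, assuming the zonation step has already produced (thickness, resistivity) pairs; the paper's exact processing chain may differ.

```python
def tl_parameter(beds):
    """T-L parameter (transverse minus longitudinal resistivity) for a
    stack of beds given as (thickness_m, resistivity_ohm_m) pairs.

    Transverse: current across the bedding -> thickness-weighted mean.
    Longitudinal: current along the bedding -> harmonic weighted mean.
    A homogeneous interval gives T - L = 0; resistivity contrasts
    (e.g., fractured zones interbedded with intact rock) give T - L > 0."""
    total = sum(h for h, _ in beds)
    rho_t = sum(h * r for h, r in beds) / total
    rho_l = total / sum(h / r for h, r in beds)
    return rho_t - rho_l
```

Because the arithmetic mean always dominates the harmonic mean, the parameter is nonnegative and grows with heterogeneity, which is why it can flag discontinuity-rich intervals on a log.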
The Likelihood of Recent Record Warmth
NASA Astrophysics Data System (ADS)
Mann, Michael E.; Rahmstorf, Stefan; Steinman, Byron A.; Tingley, Martin; Miller, Sonya K.
2016-01-01
2014 was nominally the warmest year on record for both the globe and the northern hemisphere based on historical records spanning the past one and a half centuries. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one in 650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been both with and without human influence is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing.
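The null-hypothesis side of this question (how often would a run of records occur in a stationary climate?) can be sketched with a small Monte Carlo experiment. This is a toy illustration of the idea, not the paper's semi-empirical method; the series length, noise model, and function name are all assumptions made here.

```python
import random

def prob_record_run(n_years=16, n_records=1, trials=5000, trend=0.0, seed=1):
    """Monte Carlo estimate of the chance that at least `n_records` of the
    last `n_years` years of a 150-year Gaussian series exceed the running
    maximum of all earlier years. `trend` adds a linear warming signal per
    year (0 = stationary null hypothesis)."""
    random.seed(seed)
    total, hits = 150, 0
    for _ in range(trials):
        series = [random.gauss(i * trend, 1.0) for i in range(total)]
        running_max = max(series[: total - n_years])
        records = 0
        for x in series[total - n_years:]:
            if x > running_max:
                records += 1
                running_max = x
        if records >= n_records:
            hits += 1
    return hits / trials
```

Under the stationary null, the chance that at least one of the last 16 years sets a record is simply 16/150 (the overall maximum is equally likely to fall anywhere), while a long run of consecutive-era records is vanishingly rare, which is the intuition behind the very long odds quoted in press accounts.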
The Likelihood of Recent Record Warmth
Mann, Michael E.; Rahmstorf, Stefan; Steinman, Byron A.; Tingley, Martin; Miller, Sonya K.
2016-01-01
2014 was nominally the warmest year on record for both the globe and the northern hemisphere based on historical records spanning the past one and a half centuries. It was the latest in a recent run of record temperatures spanning the past decade and a half. Press accounts reported odds as low as one in 650 million that the observed run of global temperature records would be expected to occur in the absence of human-caused global warming. Press reports notwithstanding, the question of how likely observed temperature records may have been both with and without human influence is interesting in its own right. Here we attempt to address that question using a semi-empirical approach that combines the latest (CMIP5) climate model simulations with observations of global and hemispheric mean temperature. We find that individual record years and the observed runs of record-setting temperatures were extremely unlikely to have occurred in the absence of human-caused climate change, though not nearly as unlikely as press reports have suggested. These same record temperatures were, by contrast, quite likely to have occurred in the presence of anthropogenic climate forcing. PMID:26806092
55. VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING ...
55. VIEW OF STEAM-OPERATED LOG HOIST TO PUT INCOMING LOGS INTO RALPH HULL LUMBER CO. LOG POND. PHOTOGRAPHER: UNKNOWN. DATE: 1942. COURTESY OF RALPH HULL. - Hull-Oakes Lumber Company, 23837 Dawson Road, Monroe, Benton County, OR
5. Log calving barn. Detail of wall corner showing half ...
5. Log calving barn. Detail of wall corner showing half dovetail notching on hand-hewn logs. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT
32 CFR 700.846 - Status of logs.
Code of Federal Regulations, 2010 CFR
2010-07-01
... OFFICIAL RECORDS UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS The Commanding Officer Commanding Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...
32 CFR 700.846 - Status of logs.
Code of Federal Regulations, 2011 CFR
2011-07-01
... OFFICIAL RECORDS UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS The Commanding Officer Commanding Officers Afloat § 700.846 Status of logs. The deck log, the engineering log, the compass record,...
Forecasting populations of undiscovered oil fields with the log-Pareto distribution
Bettini, C.
1987-01-01
The search for a statistical representation of parent populations of oil field sizes in sedimentary basins has yielded a new probability distribution, termed the log-Pareto distribution. The log-Pareto law, related to the log-Pareto distribution, describes the medians of oil field sizes (either volumes or areas) associated with size-ranks in a parent population of fields. The name proposed for the new distribution stems from the assumption that an algebraic transformation involving logarithms of the volumes (or areas) of a parent population of oil fields yields a frequency distribution of transformed sizes that accords with the Pareto distribution. The derivation of the log-Pareto law and the log-Pareto distribution involved two steps, namely, (1) simulating a parent population of oil fields such that the population is conditioned to both the proportion of a basin's area occupied by fields and their geographic variability, and (2) finding a mathematical function that fits the log-log plot of field ranks versus simulated field areas. Both the log-Pareto law and log-Pareto distribution are useful for estimating the size distribution of the oil fields that remain to be discovered in a basin. A log-Pareto law fitted to the range of the largest fields discovered in maturely explored basins tends to underestimate the subpopulation of undiscovered fields, while Pareto's law tends to overestimate the undiscovered subpopulation. Therefore the log-Pareto law is more appropriate for risk-averse decision makers, and the Pareto law for risk-seeking decision makers. The log-Pareto law has been validated with both simulated data and actual data from Nigeria, the Denver-Julesburg Basin, and the Campos Basin in Brazil.
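On one plausible reading of the construction above, a log-Pareto field-size population can be simulated by drawing log-sizes from a Pareto distribution and exponentiating. This sketch uses standard inverse-CDF sampling; the parameter values and function name are illustrative assumptions, not the paper's calibration.

```python
import math
import random

def sample_log_pareto(n, alpha=3.0, x_min=1.0, seed=0):
    """Draw n field sizes whose logarithms follow a Pareto(alpha, x_min)
    distribution, per the log-Pareto construction: if log(size) is
    Pareto-distributed, size is log-Pareto-distributed.
    Returns sizes sorted largest first (rank 1 = largest field)."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(n):
        u = rng.random()
        log_size = x_min * (1.0 - u) ** (-1.0 / alpha)  # inverse Pareto CDF
        sizes.append(math.exp(log_size))
    return sorted(sizes, reverse=True)
```

Plotting rank against size from such a sample on log-log axes gives the curved rank-size relation that distinguishes the log-Pareto law from the straight-line Pareto (Zipf-type) law, and hence its less extreme extrapolation to undiscovered fields.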
Log-periodic self-similarity: an emerging financial law?
NASA Astrophysics Data System (ADS)
Drożdż, S.; Grümmer, F.; Ruf, F.; Speth, J.
2003-06-01
A hypothesis that the financial log-periodicity, cascading self-similarly through various time scales, carries signatures of a law is pursued. It is shown that the most significant historical financial events can be classified amazingly well using a single and unique value of the preferred scaling factor λ=2, which indicates that its real value should be close to this number. This applies even to a declining decelerating log-periodic phase. Crucial in this connection is identification of a “super-bubble” (bubble on bubble) phenomenon. Identifying a potential “universal” preferred scaling factor, as undertaken here, may significantly improve the predictive power of the corresponding methodology. Several more specific related results include evidence that: the real end of the high technology bubble on the stock market started (with a decelerating log-periodic drawdown) in the beginning of September 2000; a parallel 2000-2002 decline seen in the Standard & Poor's 500 from the log-periodic perspective is already of the same significance as the one of the early 1930s and of the late 1970s; all this points to a much more serious global crash in around 2025, of course from a level much higher (at least one order of magnitude) than in 2000.
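The log-periodic signature with preferred scaling factor λ=2 can be written as the standard log-periodic power law, in which oscillations are periodic in log(tc - t) with angular frequency ω = 2π/ln λ. The amplitude and phase parameters below are purely illustrative, not fitted values from the paper.

```python
import math

def log_periodic(t, tc=100.0, m=0.5, lam=2.0, A=1.0, B=-0.5, C=0.1, phi=0.0):
    """Log-periodic power law: A + B*(tc-t)^m * [1 + C*cos(w*ln(tc-t) + phi)],
    with w = 2*pi / ln(lam). lam=2 is the preferred scaling factor reported
    in the abstract; successive oscillations compress toward the critical
    time tc by that factor, giving discrete scale invariance."""
    omega = 2.0 * math.pi / math.log(lam)
    dt = tc - t
    return A + B * dt ** m * (1.0 + C * math.cos(omega * math.log(dt) + phi))
```

The self-similarity is exact in the oscillatory factor: halving the distance to tc shifts the cosine argument by exactly 2π when λ=2, so consecutive oscillation peaks line up under rescaling by the preferred factor.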
Fluid-Rock Characterization and Interactions in NMR Well Logging
George J. Hirasaki; Kishore K. Mohanty
2005-09-05
The objective of this report is to characterize the fluid properties and fluid-rock interactions that are needed for formation evaluation by NMR well logging. The advances made in the understanding of NMR fluid properties are summarized in a chapter written for an AAPG book on NMR well logging. This includes live oils, viscous oils, natural gas mixtures, and the relation between relaxation time and diffusivity. Oil based drilling fluids can have an adverse effect on NMR well logging if it alters the wettability of the formation. The effect of various surfactants on wettability and surface relaxivity are evaluated for silica sand. The relation between the relaxation time and diffusivity distinguishes the response of brine, oil, and gas in a NMR well log. A new NMR pulse sequence in the presence of a field gradient and a new inversion technique enables the T2 and diffusivity distributions to be displayed as a two-dimensional map. The objectives of pore morphology and rock characterization are to identify vug connectivity by using X-ray CT scan, and to improve NMR permeability correlation. Improved estimation of permeability from NMR response is possible by using estimated tortuosity as a parameter to interpolate between two existing permeability models.
LogCauchy, log-sech and lognormal distributions of species abundances in forest communities
Yin, Z.-Y.; Peng, S.-L.; Ren, H.; Guo, Q.; Chen, Z.-H.
2005-01-01
Species-abundance (SA) pattern is one of the most fundamental aspects of biological community structure, providing important information regarding species richness, species-area relation and succession. To better describe the SA distribution (SAD) in a community, based on the widely used lognormal (LN) distribution model with exp(-x²) roll-off on Preston's octave scale, this study proposed two additional models, logCauchy (LC) and log-sech (LS), respectively with roll-offs of simple x⁻² and e⁻ˣ. The estimation of the theoretical total number of species in the whole community, S*, including very rare species not yet collected in sample, was derived from the left-truncation of each distribution. We fitted these three models by Levenberg-Marquardt nonlinear regression and measured the model fit to the data using coefficient of determination of regression, parameters' t-test and distribution's Kolmogorov-Smirnov (KS) test. Examining the SA data from six forest communities (five in lower subtropics and one in tropics), we found that: (1) on a log scale, all three models that are bell-shaped and left-truncated statistically adequately fitted the observed SADs, and the LC and LS did better than the LN; (2) from each model and for each community the S* values estimated by the integral and summation methods were almost equal, allowing us to estimate S* using a simple integral formula and to estimate its asymptotic confidence intervals by regression of a transformed model containing it; (3) following the order of LC, LS, and LN, the fitted distributions became lower in the peak, less concave in the side, and shorter in the tail, and overall the LC tended to overestimate, the LN tended to underestimate, while the LS was intermediate but slightly tended to underestimate, the observed SADs (particularly the number of common species in the right tail); (4) the six communities had some similar structural properties such as following similar distribution models, having a common
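The three roll-off shapes compared above can be sketched directly; these are the unnormalized tail shapes on the octave scale, purely to illustrate the tail ordering the abstract reports, not the fitted left-truncated models.

```python
import math

def sad_rolloffs(x):
    """Unnormalized roll-off shapes on Preston's octave scale x for the
    three candidate species-abundance models: lognormal exp(-x^2),
    logCauchy ~ 1/(1+x^2), and log-sech ~ sech(x) (decaying like e^-x)."""
    return {
        "lognormal": math.exp(-x * x),
        "logCauchy": 1.0 / (1.0 + x * x),
        "log-sech": 2.0 / (math.exp(x) + math.exp(-x)),
    }
```

For octaves far from the mode the ordering is logCauchy > log-sech > lognormal, matching finding (3): LC has the heaviest tail (overestimating common species), LN the lightest (underestimating them), with LS intermediate.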
Function Based Risk Assessment: Mapping Function to Likelihood
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Grantham, Katie; Stone, Robert
2005-01-01
The concept of function offers high potential for thinking and reasoning about designs, as well as providing a common thread for relating together other design information. This paper focuses specifically on the relation between function and risk by presenting a mathematical mapping from product function to risk likelihood. This risk information is composed of design parameters, failure modes, and likelihood values. A spacecraft orientation subsystem (a subsystem used to guide science instruments) and a Bell 206 rotorcraft are used to test the mapping, continuing research relating function to failure for these products. Finally, a case study is presented in which the risk element likelihood is calculated for a fuel cell in the conceptual design phase at NASA JPL.
Nonparametric identification and maximum likelihood estimation for hidden Markov models
Alexandrovich, G.; Holzmann, H.; Leister, A.
2016-01-01
Nonparametric identification and maximum likelihood estimation for finite-state hidden Markov models are investigated. We obtain identification of the parameters as well as the order of the Markov chain if the transition probability matrices have full rank and are ergodic, and if the state-dependent distributions are all distinct, but not necessarily linearly independent. Based on this identification result, we develop a nonparametric maximum likelihood estimation theory. First, we show that the asymptotic contrast, the Kullback–Leibler divergence of the hidden Markov model, also identifies the true parameter vector nonparametrically. Second, for classes of state-dependent densities which are arbitrary mixtures of a parametric family, we establish the consistency of the nonparametric maximum likelihood estimator. Here, identification of the mixing distributions need not be assumed. Numerical properties of the estimates and of nonparametric goodness-of-fit tests are investigated in a simulation study.
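As a concrete instance of the likelihood object such estimators maximize, here is a minimal scaled forward-algorithm evaluation of a discrete finite-state HMM's log-likelihood. The function and the toy parameters are illustrative, not taken from the paper:

```python
import numpy as np

def hmm_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under a
    finite-state HMM, via the scaled forward algorithm.
    pi[i]: initial state probs; A[i, j]: transition probs;
    B[i, k]: probability of emitting symbol k from state i."""
    alpha = pi * B[:, obs[0]]          # forward probabilities at t = 0
    log_like = 0.0
    for t in range(1, len(obs)):
        c = alpha.sum()
        log_like += np.log(c)          # accumulate scaling factors
        alpha = (alpha / c) @ A * B[:, obs[t]]
    return log_like + np.log(alpha.sum())

# Toy 2-state chain whose transition rows are equal, so the hidden state
# (and hence each observation) is i.i.d. -- easy to check by hand.
pi = np.array([0.5, 0.5])
A = np.array([[0.5, 0.5], [0.5, 0.5]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
ll = hmm_log_likelihood(pi, A, B, [0, 0, 1])
```

Because the rows of A are equal here, each symbol has marginal probability 0.55 (symbol 0) or 0.45 (symbol 1), so the value can be checked against the i.i.d. factorization.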
The Atacama Cosmology Telescope: likelihood for small-scale CMB data
Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G.E.; Halpern, M.; Battaglia, N.; Battistelli, E.S.; Bond, J.R.; Hajian, A.; Hincks, A.D.; Das, S.; Devlin, M.J.; Dünner, R.; Fowler, J.W.; Irwin, K.D.; Gralla, M.; Hasselfield, M.; Hlozek, R.; Hughes, J.P.; Kosowsky, A.; and others
2013-07-01
The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ²/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.
The Atacama Cosmology Telescope: Likelihood for Small-Scale CMB Data
NASA Technical Reports Server (NTRS)
Dunkley, J.; Calabrese, E.; Sievers, J.; Addison, G. E.; Battaglia, N.; Battistelli, E. S.; Bond, J. R.; Das, S.; Devlin, M. J.; Dunner, R.; Fowler, J. W.; Gralla, M.; Hajian, A.; Halpern, M.; Hasselfield, M.; Hincks, A. D.; Hlozek, R.; Hughes, J. P.; Irwin, K. D.; Kosowsky, A.; Louis, T.; Marriage, T. A.; Marsden, D.; Menanteau, F.; Niemack, M.
2013-01-01
The Atacama Cosmology Telescope has measured the angular power spectra of microwave fluctuations to arcminute scales at frequencies of 148 and 218 GHz, from three seasons of data. At small scales the fluctuations in the primordial Cosmic Microwave Background (CMB) become increasingly obscured by extragalactic foregrounds and secondary CMB signals. We present results from a nine-parameter model describing these secondary effects, including the thermal and kinematic Sunyaev-Zel'dovich (tSZ and kSZ) power; the clustered and Poisson-like power from Cosmic Infrared Background (CIB) sources, and their frequency scaling; the tSZ-CIB correlation coefficient; the extragalactic radio source power; and thermal dust emission from Galactic cirrus in two different regions of the sky. In order to extract cosmological parameters, we describe a likelihood function for the ACT data, fitting this model to the multi-frequency spectra in the multipole range 500 < l < 10000. We extend the likelihood to include spectra from the South Pole Telescope at frequencies of 95, 150, and 220 GHz. Accounting for different radio source levels and Galactic cirrus emission, the same model provides an excellent fit to both datasets simultaneously, with χ²/dof = 675/697 for ACT, and 96/107 for SPT. We then use the multi-frequency likelihood to estimate the CMB power spectrum from ACT in bandpowers, marginalizing over the secondary parameters. This provides a simplified 'CMB-only' likelihood in the range 500 < l < 3500 for use in cosmological parameter estimation.
Deep penetration well logging system and method
Cox, P.T.; Warren, W.F.; Johnson, D.L.
1990-01-30
This patent describes a well logging system for determining the dielectric constant and/or resistivity of earth formations traversed by a borehole, some of which have been invaded by drilling fluid. It comprises: a well logging sonde adapted to be passed through the borehole, including means for transmitting electromagnetic energy into the earth formation at a frequency which enables the electromagnetic energy to propagate throughout the surrounding earth formation; first, second and third receiver means; means connected to the three receiver means for processing the three receiver signals to provide a combined signal for application to well logging cable means; well logging cable means for conducting the combined signal from the signal processing means out of the borehole; and surface electronics. The surface electronics includes indication means connected to the well logging cable means for providing an indication of the dielectric constant and/or the resistivity of the earth formation in accordance with portions of the combined signal conducted by the cable means representative of secondary electromagnetic fields at two of the three receiver means locations.
Logjam: A scalable unified log file archiver
2001-08-01
Log files are a necessary record of events on any system. However, as systems scale, so does the volume of data captured. To complicate matters, this data can be distributed across all nodes within the system. This creates challenges in obtaining these files as well as archiving them in a consistent manner. It has become commonplace to develop a custom-written utility for each system, tailored specifically to that system. For computer centers that contain multiple systems, each system would have its own respective utility for gathering and archiving log files. Each time a new log file is produced, a modification to the utility is necessary. With each modification comes the risk of introducing errors, as well as the time spent making the change. This is precisely the purpose of logjam. Once installed, the code only requires modification when new features are required. A configuration file is used to identify each log file as well as where to harvest it and how to archive it. Adding a new log file is as simple as defining it in the configuration file, and testing can be performed in the production environment.
Logjam: A scalable unified log file archiver
Cardo, Nicholas P.
2001-08-01
Log files are a necessary record of events on any system. However, as systems scale, so does the volume of data captured. To complicate matters, this data can be distributed across all nodes within the system. This creates challenges in obtaining these files as well as archiving them in a consistent manner. It has become commonplace to develop a custom-written utility for each system, tailored specifically to that system. For computer centers that contain multiple systems, each system would have its own respective utility for gathering and archiving log files. Each time a new log file is produced, a modification to the utility is necessary. With each modification comes the risk of introducing errors, as well as the time spent making the change. This is precisely the purpose of logjam. Once installed, the code only requires modification when new features are required. A configuration file is used to identify each log file as well as where to harvest it and how to archive it. Adding a new log file is as simple as defining it in the configuration file, and testing can be performed in the production environment.
Modified maximum likelihood registration based on information fusion
NASA Astrophysics Data System (ADS)
Qi, Yongqing; Jing, Zhongliang; Hu, Shiqiang
2007-11-01
The bias estimation of passive sensors is considered based on information fusion in a multi-platform multi-sensor tracking system. The unobservability problem of bearing-only tracking in the blind spot is analyzed. A modified maximum likelihood method, which uses the redundant information of the multi-sensor system to calculate the target position, is investigated to estimate the biases. Monte Carlo simulation results show that the modified method eliminates the effect of the unobservability problem in the blind spot and can estimate the biases more rapidly and accurately than the maximum likelihood method. It is statistically efficient since the standard deviation of the bias estimation errors meets the theoretical lower bounds.
Maximum-likelihood block detection of noncoherent continuous phase modulation
NASA Technical Reports Server (NTRS)
Simon, Marvin K.; Divsalar, Dariush
1993-01-01
This paper examines maximum-likelihood block detection of uncoded full response CPM over an additive white Gaussian noise (AWGN) channel. Both the maximum-likelihood metrics and the bit error probability performances of the associated detection algorithms are considered. The special and popular case of minimum-shift-keying (MSK) corresponding to h = 0.5 and constant amplitude frequency pulse is treated separately. The many new receiver structures that result from this investigation can be compared to the traditional ones that have been used in the past both from the standpoint of simplicity of implementation and optimality of performance.
Nonparametric maximum likelihood estimation for the multisample Wicksell corpuscle problem
Chan, Kwun Chuen Gary; Qin, Jing
2016-01-01
We study nonparametric maximum likelihood estimation for the distribution of spherical radii using samples containing a mixture of one-dimensional, two-dimensional biased and three-dimensional unbiased observations. Since direct maximization of the likelihood function is intractable, we propose an expectation-maximization algorithm for implementing the estimator, which handles an indirect measurement problem and a sampling bias problem separately in the E- and M-steps, and circumvents the need to solve an Abel-type integral equation, which creates numerical instability in the one-sample problem. Extensions to ellipsoids are studied and connections to multiplicative censoring are discussed. PMID:27279657
Carbon emissions from tropical forest degradation caused by logging
NASA Astrophysics Data System (ADS)
Pearson, Timothy R. H.; Brown, Sandra; Casarim, Felipe M.
2014-03-01
The focus of land-use related efforts in developing countries to reduce carbon emissions has been on slowing deforestation, yet international agreements are to reduce emissions from both deforestation and forest degradation (REDD). The second 'D' is poorly understood and poorly accounted for, for a number of technical and policy reasons. Here we introduce a complete accounting method for estimating emission factors from selective timber harvesting, a substantial form of forest degradation in many tropical developing countries. The method accounts separately for emissions from the extracted log, from incidental damage to the surrounding forest, and from logging infrastructure, and emissions are expressed as units of carbon per cubic meter of timber extracted to allow for simple application to timber harvesting statistics. We applied the method in six tropical countries (Belize, Bolivia, Brazil, Guyana, Indonesia, and Republic of Congo), resulting in total emission factors of 0.99-2.33 Mg C m^-3. In all cases, emissions were dominated by damage to surrounding vegetation and the infrastructure rather than the logs themselves, and total emissions represented about 3-15% of the biomass carbon stocks of the associated unlogged forests. We then combined the emission factors with country-level logging statistics for nine key timber producing countries represented by our study areas to gain an understanding of the order of magnitude of emissions from degradation compared to those recently reported for deforestation in the same countries. For the nine countries included, emissions from logging were on average equivalent to about 12% of those from deforestation. For those countries with relatively low emissions from deforestation, emissions from logging were equivalent to half or more of those from deforestation, whereas for those countries with the highest emissions from deforestation, emissions from logging were equivalent to <10% of those from deforestation. Understanding how
Nuclear Well Log Properties of Natural Gas Hydrate Reservoirs
NASA Astrophysics Data System (ADS)
Burchwell, A.; Cook, A.
2015-12-01
Characterizing gas hydrate in a reservoir typically involves a full suite of geophysical well logs. The most common method involves using resistivity measurements to quantify the decrease in electrically conductive water when replaced with gas hydrate. Compressional velocity measurements are also used because the gas hydrate significantly strengthens the moduli of the sediment. At many gas hydrate sites, nuclear well logs, which include the photoelectric effect, formation sigma, carbon/oxygen ratio and neutron porosity, are also collected but often not used. In fact, the nuclear response of a gas hydrate reservoir is not known. In this research we will focus on the nuclear log response in gas hydrate reservoirs at the Mallik Field at the Mackenzie Delta, Northwest Territories, Canada, and the Gas Hydrate Joint Industry Project Leg 2 sites in the northern Gulf of Mexico. Nuclear logs may add increased robustness to the investigation into the properties of gas hydrates, and some types of logs may offer an opportunity to distinguish between gas hydrate and permafrost. For example, a true formation sigma log measures the thermal neutron capture cross section of a formation and pore constituents; it is especially sensitive to hydrogen and chlorine in the pore space. Chlorine has a high absorption potential, and is used to determine the amount of saline water within pore spaces. Gas hydrate offers a difference in elemental composition compared to water-saturated intervals. Thus, in permafrost areas, the carbon/oxygen ratio may vary between gas hydrate and permafrost, due to the increase of carbon in gas hydrate accumulations. At the Mallik site, we observe a hydrate-bearing sand (1085-1107 m) above a water-bearing sand (1107-1140 m), which was confirmed through core samples and mud gas analysis. We observe a decrease in the photoelectric absorption of ~0.5 barns/e-, as well as an increase in the formation sigma readings of ~5 capture units in the water-bearing sand as
Spreadsheet log analysis in subsurface geology
Doveton, J.H.
2000-01-01
Most of the direct knowledge of the geology of the subsurface is gained from the examination of core and drill-cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes generally have been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed to estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
Log-rolling block copolymer cylinders
NASA Astrophysics Data System (ADS)
Kim, So Youn; Kim, Ye Chan; Kim, Dong Hyup; Kwon, Na Kyung; Register, Richard A.
Shear has been the most effective method to create long-range order of micro- or nano-structures in soft materials. When shear is applied, soft particles or polymers tend to align along the shear direction to minimize viscous dissipation, so transverse (so-called "log-rolling") alignment is unfavored. In this study, for the first time, we report the transverse alignment of cylinder-forming block copolymers. Poly(styrene-b-methyl methacrylate), PS-PMMA, can form a metastable hemicylinder structure when confined in a thin film, and this hemicylinder structure can align either along the shear direction or transverse to it ("log-rolling"), depending on the shearing temperature. This unusual "log-rolling" behavior is explained by the different chain mobility of the two blocks in PS-PMMA; the rigidity of the core cylinder is the critical parameter determining the direction of shear alignment.
Unconventional neutron sources for oil well logging
NASA Astrophysics Data System (ADS)
Frankle, C. M.; Dale, G. E.
2013-09-01
Americium-Beryllium (AmBe) radiological neutron sources have been widely used in the petroleum industry for well logging purposes. There is strong desire on the part of various governmental and regulatory bodies to find alternate sources due to the high activity and small size of AmBe sources. Other neutron sources are available, both radiological (252Cf) and electronic accelerator driven (D-D and D-T). All of these, however, have substantially different neutron energy spectra from AmBe and thus cause significantly different responses in well logging tools. We report on simulations performed using unconventional sources and techniques to attempt to better replicate the porosity and carbon/oxygen ratio responses a well logging tool would see from AmBe neutrons. The AmBe response of these two types of tools is compared to the response from 252Cf, D-D, D-T, filtered D-T, and T-T sources.
LogSafe and Smart: Minnesota OSHA's LogSafe Program Takes Root.
ERIC Educational Resources Information Center
Honerman, James
1999-01-01
Logging is now the most dangerous U.S. occupation. The Occupational Safety and Health Administration (OSHA) developed specialized safety training for the logging industry but has been challenged to reach small operators. An OSHA-approved state program in Minnesota provides annual safety seminars to about two-thirds of the state's full-time…
Lithologic logs and geophysical logs from test drilling in Palm Beach County, Florida, since 1974
Swayze, Leo J.; McGovern, Michael C.; Fischer, John N.
1980-01-01
Test-hole data that may be used to determine the hydrogeology of the zone of high permeability in Palm Beach County, Fla., are presented. Lithologic logs from 46 test wells and geophysical logs from 40 test wells are contained in this report. (USGS)
Maximum Likelihood Shift Estimation Using High Resolution Polarimetric SAR Clutter Model
NASA Astrophysics Data System (ADS)
Harant, Olivier; Bombrun, Lionel; Vasile, Gabriel; Ferro-Famil, Laurent; Gay, Michel
2011-03-01
This paper deals with a Maximum Likelihood (ML) shift estimation method in the context of High Resolution (HR) Polarimetric SAR (PolSAR) clutter. Texture modeling is presented, and the generalized ML texture tracking method is extended to the merging of various sensors. Some results on displacement estimation on the Argentiere glacier in the Mont Blanc massif using dual-pol TerraSAR-X (TSX) and quad-pol RADARSAT-2 (RS2) sensors are finally discussed.
Development of pulsed neutron uranium logging instrument
Wang, Xin-guang; Liu, Dan; Zhang, Feng
2015-03-15
This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of ²³⁵U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and the porosity and thermal neutron lifetime of the formation can be acquired simultaneously.
Development of pulsed neutron uranium logging instrument.
Wang, Xin-guang; Liu, Dan; Zhang, Feng
2015-03-01
This article introduces the development of a pulsed neutron uranium logging instrument. By analyzing the temporal distribution of epithermal neutrons generated from the thermal fission of ²³⁵U, we propose a new method with a uranium-bearing index to calculate the uranium content in the formation. An instrument employing a D-T neutron generator and two epithermal neutron detectors has been developed. The logging response is studied using Monte Carlo simulation and experiments in calibration wells. The simulation and experimental results show that the uranium-bearing index is linearly correlated with the uranium content, and the porosity and thermal neutron lifetime of the formation can be acquired simultaneously. PMID:25832251
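The reported linear correlation between the uranium-bearing index and uranium content lends itself to a simple calibration-and-inversion step: fit a line through calibration-well measurements, then invert it for logged values. A minimal sketch; all numbers below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical calibration-well data: uranium-bearing index measured at
# known uranium contents (values invented for illustration).
content = np.array([0.00, 0.01, 0.02, 0.05, 0.10])  # uranium mass fraction
index = 2.0 + 150.0 * content                        # synthetic linear response

slope, intercept = np.polyfit(content, index, 1)     # least-squares line

def content_from_index(i):
    """Invert the fitted calibration to estimate content from a logged index."""
    return (i - intercept) / slope
```

In practice the calibration points would come from the instrument's response in wells of known uranium content, with the fit residuals indicating how well the linearity holds.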
Permeability extraction: A sonic log inversion
Akbar, N.; Kim, J.J.
1994-12-31
In this paper the authors provide the missing important link between permeability and acoustic velocities by generating a permeability-dependent synthetic sonic log in a carbonate reservoir. The computations are based on Akbar's theory that relates wave velocity to frequency, rock properties (e.g., lithology, permeability, and porosity), and fluid saturation and properties (viscosity, density, and compressibility). An inverted analytical expression of the theory is used to extract permeability from sonic velocity. The synthetic sonic and the computed permeability are compared with the observed sonic log and with plug permeability, respectively. The results demonstrate, as predicted by theory, that permeability can be related directly to acoustic velocities.
Compacting a Kentucky coal for quality logs
Lin, Y.; Li, Z.; Mao, S.
1999-07-01
A Kentucky coal was found to be more difficult to compact into large, strong logs. The study showed that the compaction parameters affecting the strength of compacted coal logs can be categorized into three groups. The first group is inherent coal properties such as elasticity and coefficient of friction; the second is machine properties such as mold geometry; and the third is coal mixture preparation parameters such as particle size distribution. Theoretical analysis showed that an appropriate backpressure can reduce the surface cracks that occur during ejection. This has been confirmed by the experiments conducted.
Likelihood-Free Inference in High-Dimensional Models.
Kousathanas, Athanasios; Leuenberger, Christoph; Helfer, Jonas; Quinodoz, Mathieu; Foll, Matthieu; Wegmann, Daniel
2016-06-01
Methods that bypass analytical evaluations of the likelihood function have become an indispensable tool for statistical inference in many fields of science. These so-called likelihood-free methods rely on accepting and rejecting simulations based on summary statistics, which limits them to low-dimensional models for which the value of the likelihood is large enough to result in manageable acceptance rates. To get around these issues, we introduce a novel, likelihood-free Markov chain Monte Carlo (MCMC) method combining two key innovations: updating only one parameter per iteration and accepting or rejecting this update based on subsets of statistics approximately sufficient for this parameter. This increases acceptance rates dramatically, rendering this approach suitable even for models of very high dimensionality. We further derive that for linear models, a one-dimensional combination of statistics per parameter is sufficient and can be found empirically with simulations. Finally, we demonstrate that our method readily scales to models of very high dimensionality, using toy models as well as by jointly inferring the effective population size, the distribution of fitness effects (DFE) of segregating mutations, and selection coefficients for each locus from data of a recent experiment on the evolution of drug resistance in influenza. PMID:27052569
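A toy version of the one-parameter-per-iteration idea: likelihood-free MCMC where each coordinate is updated against its own (approximately) sufficient summary statistic. The model, settings, and names below are illustrative stand-ins (independent Gaussian means, per-coordinate sample means as summaries, a flat prior with a symmetric proposal and a uniform ABC acceptance kernel), not the paper's algorithm in full:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: d independent Gaussian means with unit variance. The summary
# that is sufficient for theta[j] is the j-th per-coordinate sample mean.
d, n, eps = 3, 200, 0.05
theta_true = np.array([1.0, -2.0, 0.5])
data = theta_true + rng.normal(size=(n, d))
s_obs = data.mean(axis=0)                 # observed summaries

def sim_summary(theta_j):
    # Simulate n draws for one coordinate and return its summary statistic.
    return rng.normal(theta_j, 1.0, size=n).mean()

# Likelihood-free MCMC: with a flat prior and symmetric proposal, a
# single-coordinate update is accepted iff its simulated summary lands
# within eps of the observed one (uniform ABC kernel).
theta = s_obs.copy()                      # start at the observed summaries
chain = []
for it in range(3000):
    j = it % d                            # update one parameter per iteration
    prop = theta[j] + rng.normal(0.0, 0.2)
    if abs(sim_summary(prop) - s_obs[j]) < eps:
        theta[j] = prop
    chain.append(theta.copy())

post_mean = np.array(chain[1000:]).mean(axis=0)
```

Because each coordinate is matched only against its own low-dimensional summary, acceptance rates stay usable as d grows, which is the point the abstract makes about scaling to high-dimensional models.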
Relevance Data for Language Models Using Maximum Likelihood.
ERIC Educational Resources Information Center
Bodoff, David; Wu, Bin; Wong, K. Y. Michael
2003-01-01
Presents a preliminary empirical test of a maximum likelihood approach to using relevance data for training information retrieval parameters. Discusses similarities to language models; the unification of document-oriented and query-oriented views; tests on data sets; algorithms and scalability; and the effectiveness of maximum likelihood…
Likelihood of volcanic eruption at Long Valley, California, is reduced
Kelly, D.
1984-01-01
A relatively low level of earthquake activity, as well as reduced rates of ground deformation, over the past year have led U.S. Geological Survey scientists to conclude that the likelihood of imminent volcanic activity at Long Valley, California, is reduced from that of mid-1982 through 1983.
Likelihood Methods for Adaptive Filtering and Smoothing. Technical Report #455.
ERIC Educational Resources Information Center
Butler, Ronald W.
The dynamic linear model or Kalman filtering model provides a useful methodology for predicting the past, present, and future states of a dynamic system, such as an object in motion or an economic or social indicator that is changing systematically with time. Recursive likelihood methods for adaptive Kalman filtering and smoothing are developed.…
Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.
ERIC Educational Resources Information Center
Heesacker, Martin
The importance of high levels of involvement in counseling has been related to theories of interpersonal influence. To examine differing effects of counselor credibility as a function of how personally involved counselors are, the Elaboration Likelihood Model (ELM) of attitude change was applied to counseling pretreatment. Students (N=256) were…
Counseling Pretreatment and the Elaboration Likelihood Model of Attitude Change.
ERIC Educational Resources Information Center
Heesacker, Martin
1986-01-01
Results of the application of the Elaboration Likelihood Model (ELM) to a counseling context revealed that more favorable attitudes toward counseling occurred as subjects' ego involvement increased and as intervention quality improved. Counselor credibility affected the degree to which subjects' attitudes reflected argument quality differences.…
Maximum Likelihood Estimation of Nonlinear Structural Equation Models.
ERIC Educational Resources Information Center
Lee, Sik-Yum; Zhu, Hong-Tu
2002-01-01
Developed an EM type algorithm for maximum likelihood estimation of a general nonlinear structural equation model in which the E-step is completed by a Metropolis-Hastings algorithm. Illustrated the methodology with results from a simulation study and two real examples using data from previous studies. (SLD)
Troubled Conception: Negotiating the Likelihood of Having Children
ERIC Educational Resources Information Center
May, Marian
2007-01-01
In the context of low fertility and Australia's ageing population, a national longitudinal telephone survey, "Negotiating the Life Course" (NLC), asks women about their childbearing intentions. This paper uses conversation analysis (CA) to examine interaction between an interviewer and respondents on one NLC question about the likelihood of having…
Mixture Rasch Models with Joint Maximum Likelihood Estimation
ERIC Educational Resources Information Center
Willse, John T.
2011-01-01
This research provides a demonstration of the utility of mixture Rasch models. Specifically, a model capable of estimating a mixture partial credit model using joint maximum likelihood is presented. Like the partial credit model, the mixture partial credit model has the beneficial feature of being appropriate for analysis of assessment data…
Likelihood-Based Confidence Intervals in Exploratory Factor Analysis
ERIC Educational Resources Information Center
Oort, Frans J.
2011-01-01
In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…
Effective Family Position and Likelihood of Becoming an Alcoholic.
ERIC Educational Resources Information Center
Majumdar, Mahbubon N.; Bhatia, Pritam S.
1980-01-01
Discusses effective family position as a new variable developed to measure the effect of birth order and childhood home environment on the likelihood of becoming an alcoholic. Constructs of fixation and regression may also be helpful in differentiating two types of alcoholism. (JAC)
Targeted maximum likelihood based causal inference: Part I.
van der Laan, Mark J
2010-01-01
Given causal graph assumptions, intervention-specific counterfactual distributions of the data can be defined by the so called G-computation formula, which is obtained by carrying out these interventions on the likelihood of the data factorized according to the causal graph. The obtained G-computation formula represents the counterfactual distribution the data would have had if this intervention would have been enforced on the system generating the data. A causal effect of interest can now be defined as some difference between these counterfactual distributions indexed by different interventions. For example, the interventions can represent static treatment regimens or individualized treatment rules that assign treatment in response to time-dependent covariates, and the causal effects could be defined in terms of features of the mean of the treatment-regimen specific counterfactual outcome of interest as a function of the corresponding treatment regimens. Such features could be defined nonparametrically in terms of so called (nonparametric) marginal structural models for static or individualized treatment rules, whose parameters can be thought of as (smooth) summary measures of differences between the treatment regimen specific counterfactual distributions. In this article, we develop a particular targeted maximum likelihood estimator of causal effects of multiple time point interventions. This involves the use of loss-based super-learning to obtain an initial estimate of the unknown factors of the G-computation formula, and subsequently, applying a target-parameter specific optimal fluctuation function (least favorable parametric submodel) to each estimated factor, estimating the fluctuation parameter(s) with maximum likelihood estimation, and iterating this updating step of the initial factor till convergence. This iterative targeted maximum likelihood updating step makes the resulting estimator of the causal effect double robust in the sense that it is
Targeted Maximum Likelihood Based Causal Inference: Part I
van der Laan, Mark J.
2010-01-01
Given causal graph assumptions, intervention-specific counterfactual distributions of the data can be defined by the so called G-computation formula, which is obtained by carrying out these interventions on the likelihood of the data factorized according to the causal graph. The obtained G-computation formula represents the counterfactual distribution the data would have had if this intervention would have been enforced on the system generating the data. A causal effect of interest can now be defined as some difference between these counterfactual distributions indexed by different interventions. For example, the interventions can represent static treatment regimens or individualized treatment rules that assign treatment in response to time-dependent covariates, and the causal effects could be defined in terms of features of the mean of the treatment-regimen specific counterfactual outcome of interest as a function of the corresponding treatment regimens. Such features could be defined nonparametrically in terms of so called (nonparametric) marginal structural models for static or individualized treatment rules, whose parameters can be thought of as (smooth) summary measures of differences between the treatment regimen specific counterfactual distributions. In this article, we develop a particular targeted maximum likelihood estimator of causal effects of multiple time point interventions. This involves the use of loss-based super-learning to obtain an initial estimate of the unknown factors of the G-computation formula, and subsequently, applying a target-parameter specific optimal fluctuation function (least favorable parametric submodel) to each estimated factor, estimating the fluctuation parameter(s) with maximum likelihood estimation, and iterating this updating step of the initial factor till convergence. This iterative targeted maximum likelihood updating step makes the resulting estimator of the causal effect double robust in the sense that it is
Crangle, Robert D., Jr.
2007-01-01
Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001, Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to a Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain in positions adjacent to well locations, used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
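The LAS conversion step described above can be sketched minimally. The parser below is a hypothetical illustration of splitting a LAS 2.0 file into its tilde-delimited sections, not the Neuralog or GeoGraphix Prizm workflow; the sample curves and values are invented.

```python
def read_las_sections(text):
    """Split a LAS 2.0 well-log file into its tilde-delimited sections.

    LAS files mark each section with a line beginning "~"; the first
    character after the tilde (V, W, C, P, O, A) identifies the section.
    Lines beginning "#" are comments and are skipped.
    """
    sections = {}
    current = None
    for line in text.splitlines():
        if line.startswith("~"):
            current = line[1].upper()  # section flag character
            sections[current] = []
        elif current is not None and not line.startswith("#"):
            sections[current].append(line.strip())
    return sections

sample = """~Version
 VERS.   2.0 : CWLS LOG ASCII STANDARD
~Curve
 DEPT.FT     : depth
 GR  .GAPI   : gamma ray
~ASCII
 1200.0  45.3
 1200.5  47.1"""

logs = read_las_sections(sample)
# logs["A"] holds the depth/gamma-ray data rows as whitespace-separated strings
```

Once split this way, the ~A rows can be parsed into numeric depth/curve arrays for plotting against a cross section.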
Performance and Improved Design of the Log Spiral of Revolution Monochromator
Pease, D. M.; Shanthakumar, P.; Huang, T.; Budnick, J. I.; Odong, O.; Frenkel, A. I.; Abitbol, N.; Balasubramanian, M.; Brewer, D.
2007-02-02
We show that, whereas the present log spiral of revolution monochromator works well for Cr edges of 2.8% Cr2O3 in a V2O3 matrix, the device transmits noticeable V extended structure in the case of 0.365% Cr2O3. We demonstrate that the transmitted V extended structure is due to the V Kβ line which is unresolved by the monochromator. It is suggested that this limitation may be overcome by designing a log spiral detector for the Cr Kβ line rather than the Cr Kα line. Aspects of the design of this modified log spiral are discussed.
Profile likelihood maps of a 15-dimensional MSSM
NASA Astrophysics Data System (ADS)
Strege, C.; Bertone, G.; Besjes, G. J.; Caron, S.; Ruiz de Austri, R.; Strubig, A.; Trotta, R.
2014-09-01
We present statistically convergent profile likelihood maps obtained via global fits of a phenomenological Minimal Supersymmetric Standard Model with 15 free parameters (the MSSM-15), based on over 250M points. We derive constraints on the model parameters from direct detection limits on dark matter, the Planck relic density measurement and data from accelerator searches. We provide a detailed analysis of the rich phenomenology of this model, and determine the SUSY mass spectrum and dark matter properties that are preferred by current experimental constraints. We evaluate the impact of the measurement of the anomalous magnetic moment of the muon (g - 2) on our results, and provide an analysis of scenarios in which the lightest neutralino is a subdominant component of the dark matter. The MSSM-15 parameters are relatively weakly constrained by current data sets, with the exception of the parameters related to dark matter phenomenology (M_1, M_2, μ), which are restricted to the sub-TeV regime, mainly due to the relic density constraint. The mass of the lightest neutralino is found to be < 1.5 TeV at 99% C.L., but can extend up to 3 TeV when excluding the g - 2 constraint from the analysis. Low-mass bino-like neutralinos are strongly favoured, with spin-independent scattering cross-sections extending to very small values, ~10^-20 pb. ATLAS SUSY null searches strongly impact on this mass range, and thus rule out a region of parameter space that is outside the reach of any current or future direct detection experiment. The best-fit point obtained after inclusion of all data corresponds to a squark mass of 2.3 TeV, a gluino mass of 2.1 TeV and a 130 GeV neutralino with a spin-independent cross-section of 2.4 × 10^-10 pb, which is within the reach of future multi-ton scale direct detection experiments and of the upcoming LHC run at increased centre-of-mass energy.
Selective logging: does the imprint remain on tree structure and composition after 45 years?
Osazuwa-Peters, Oyomoare L; Chapman, Colin A; Zanne, Amy E
2015-01-01
Selective logging of tropical forests is increasing in extent and intensity. The duration over which impacts of selective logging persist, however, remains an unresolved question, particularly for African forests. Here, we investigate the extent to which a past selective logging event continues to leave its imprint on different components of an East African forest 45 years later. We inventoried 2358 stems ≥10 cm in diameter in 26 plots (200 m × 10 m) within a 5.2 ha area in Kibale National Park, Uganda, in logged and unlogged forest. In these surveys, we characterized the forest light environment, taxonomic composition, functional trait composition using three traits (wood density, maximum height and maximum diameter) and forest structure based on three measures (stem density, total basal area and total above-ground biomass). In comparison to unlogged forests, selectively logged forest plots in Kibale National Park on average had higher light levels, different structure characterized by lower stem density, lower total basal area and lower above-ground biomass, and a distinct taxonomic composition driven primarily by changes in the relative abundance of species. Conversely, selectively logged forest plots were like unlogged plots in functional composition, having similar community-weighted mean values for wood density, maximum height and maximum diameter. This similarity in functional composition irrespective of logging history may be due to functional recovery of logged forest or background changes in functional attributes of unlogged forest. Despite the passage of 45 years, the legacy of selective logging on the tree community in Kibale National Park is still evident, as indicated by distinct taxonomic and structural composition and reduced carbon storage in logged forest compared with unlogged forest. The effects of selective logging are exerted via influences on tree demography rather than functional trait composition. PMID:27293697
Exclusion probabilities and likelihood ratios with applications to mixtures.
Slooten, Klaas-Jan; Egeland, Thore
2016-01-01
The statistical evidence obtained from mixed DNA profiles can be summarised in several ways in forensic casework including the likelihood ratio (LR) and the Random Man Not Excluded (RMNE) probability. The literature has seen a discussion of the advantages and disadvantages of likelihood ratios and exclusion probabilities, and part of our aim is to bring some clarification to this debate. In a previous paper, we proved that there is a general mathematical relationship between these statistics: RMNE can be expressed as a certain average of the LR, implying that the expected value of the LR, when applied to an actual contributor to the mixture, is at least equal to the inverse of the RMNE. While the mentioned paper presented applications for kinship problems, the current paper demonstrates the relevance for mixture cases, and for this purpose, we prove some new general properties. We also demonstrate how to use the distribution of the likelihood ratio for donors of a mixture, to obtain estimates for exceedance probabilities of the LR for non-donors, of which the RMNE is a special case corresponding to LR > 0. In order to derive these results, we need to view the likelihood ratio as a random variable. In this paper, we describe how such a randomization can be achieved. The RMNE is usually invoked only for mixtures without dropout. In mixtures, artefacts like dropout and drop-in are commonly encountered and we address this situation too, illustrating our results with a basic but widely implemented model, a so-called binary model. The precise definitions, modelling and interpretation of the required concepts of dropout and drop-in are not entirely obvious, and we attempt to clarify them here in a general likelihood framework for a binary model. PMID:26160753
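The averaging relationships between LR statistics described above rest on elementary identities that hold for any pair of competing hypotheses. A toy discrete sketch (the outcome probabilities are invented, not from the paper): for distributions P under the prosecution hypothesis and D under the defense hypothesis, E_D[LR] = 1 exactly, and E_P[LR] ≥ 1 by Jensen's inequality.

```python
# Toy illustration of two standard likelihood-ratio identities.
# For LR(x) = P(x)/D(x): sum_x D(x)*LR(x) = sum_x P(x) = 1 exactly,
# while sum_x P(x)*LR(x) = E_D[LR^2] >= (E_D[LR])^2 = 1.
P = {"a": 0.5, "b": 0.3, "c": 0.2}  # hypothetical outcome probs under H_p
D = {"a": 0.2, "b": 0.3, "c": 0.5}  # hypothetical outcome probs under H_d
LR = {x: P[x] / D[x] for x in P}

E_d = sum(D[x] * LR[x] for x in P)  # expected LR for a non-contributor: 1
E_p = sum(P[x] * LR[x] for x in P)  # expected LR for a contributor: >= 1
```

Viewing the LR as a random variable in this way is exactly what makes exceedance probabilities such as P(LR > t) well defined for donors and non-donors.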
An n log n Generalized Born Approximation.
Anandakrishnan, Ramu; Daga, Mayank; Onufriev, Alexey V
2011-03-01
Molecular dynamics (MD) simulations based on the generalized Born (GB) model of implicit solvation offer a number of important advantages over the traditional explicit solvent based simulations. Yet, in MD simulations, the GB model has not been able to reach its full potential partly due to its computational cost, which scales as ∼n^2, where n is the number of solute atoms. We present here an ∼n log n approximation for the generalized Born (GB) implicit solvent model. The approximation is based on the hierarchical charge partitioning (HCP) method (Anandakrishnan and Onufriev, J. Comput. Chem. 2010, 31, 691-706) previously developed and tested for electrostatic computations in gas-phase and distance-dependent dielectric models. The HCP uses the natural organization of biomolecular structures to partition the structures into multiple hierarchical levels of components. The charge distribution for each of these components is approximated by a much smaller number of charges. The approximate charges are then used for computing electrostatic interactions with distant components, while the full set of atomic charges are used for nearby components. To apply the HCP concept to the GB model, we define the equivalent of the effective Born radius for components. The component effective Born radius is then used in GB computations for points that are distant from the component. This HCP approximation for GB (HCP-GB) is implemented in the open source MD software, NAB in AmberTools, and tested on a set of representative biomolecular structures ranging in size from 632 atoms to ∼3 million atoms. For this set of test structures, the HCP-GB method is 1.1-390 times faster than the GB computation without additional approximations (the reference GB computation), depending on the size of the structure. Similar to the spherical cutoff method with GB (cutoff-GB), which also scales as ∼n log n, the HCP-GB is relatively simple. However, for the structures considered here, we show
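The core idea of charge partitioning can be sketched with a minimal toy example. This is a hypothetical illustration of the near/far split, not the HCP-GB implementation in NAB: groups of charges farther than a cutoff contribute through a single aggregate charge at their centroid, while nearby groups contribute atom-by-atom.

```python
import math

def exact_potential(p, charges):
    """Exact 1/r potential at point p from a list of (position, charge)."""
    return sum(q / math.dist(p, pos) for pos, q in charges)

def aggregated_potential(p, groups, cutoff):
    """HCP-style sketch: a group farther than `cutoff` is replaced by its
    total charge placed at the group centroid; near groups are exact."""
    total = 0.0
    for group in groups:
        centroid = tuple(sum(pos[i] for pos, _ in group) / len(group)
                         for i in range(3))
        if math.dist(p, centroid) > cutoff:
            total += sum(q for _, q in group) / math.dist(p, centroid)
        else:
            total += exact_potential(p, group)
    return total

near = [((0.0, 0.0, 0.0), 1.0), ((0.5, 0.0, 0.0), -0.5)]
far = [((20.0, 0.0, 0.0), 0.3), ((20.5, 0.0, 0.0), 0.4)]
p = (1.0, 1.0, 0.0)
approx = aggregated_potential(p, [near, far], cutoff=5.0)
exact = exact_potential(p, near + far)
# approx tracks exact closely because the far group is well separated
```

With a hierarchy of such groups (residues, chains, complexes) the number of interactions per evaluation point drops from n to roughly log n, which is the source of the overall ∼n log n scaling.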
Penalized likelihood for sparse contingency tables with an application to full-length cDNA libraries
Dahinden, Corinne; Parmigiani, Giovanni; Emerick, Mark C; Bühlmann, Peter
2007-01-01
Background The joint analysis of several categorical variables is a common task in many areas of biology, and is becoming central to systems biology investigations whose goal is to identify potentially complex interaction among variables belonging to a network. Interactions of arbitrary complexity are traditionally modeled in statistics by log-linear models. It is challenging to extend these to the high dimensional and potentially sparse data arising in computational biology. An important example, which provides the motivation for this article, is the analysis of so-called full-length cDNA libraries of alternatively spliced genes, where we investigate relationships among the presence of various exons in transcript species. Results We develop methods to perform model selection and parameter estimation in log-linear models for the analysis of sparse contingency tables, to study the interaction of two or more factors. Maximum Likelihood estimation of log-linear model coefficients might not be appropriate because of the presence of zeros in the table's cells, and new methods are required. We propose a computationally efficient ℓ1-penalization approach extending the Lasso algorithm to this context, and compare it to other procedures in a simulation study. We then illustrate these algorithms on contingency tables arising from full-length cDNA libraries. Conclusion We propose regularization methods that can be used successfully to detect complex interaction patterns among categorical variables in a broad range of biological problems involving categorical variables. PMID:18072965
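The mechanism by which ℓ1 penalization produces sparse log-linear models can be sketched with its proximal (soft-thresholding) operator. This is a generic illustration of Lasso-style shrinkage, not the authors' algorithm; the coefficient values are invented.

```python
def soft_threshold(z, lam):
    """Prox operator of the l1 penalty: shrink z toward zero by lam and
    set it exactly to zero when |z| <= lam.  Applied inside a gradient or
    coordinate-descent step, this is what drives the coefficients of weak
    interaction terms in a penalized log-linear model to exactly zero."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

# Unpenalized estimates of (hypothetical) interaction coefficients:
coefs = [2.3, -0.4, 0.05, -1.7, 0.2]
sparse = [soft_threshold(c, lam=0.5) for c in coefs]
# Small coefficients are zeroed; large ones survive, shrunk by lam
```

The surviving nonzero coefficients then define the selected interaction structure among the categorical variables, which is exactly the model-selection behavior that plain maximum likelihood lacks when the table has many empty cells.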
Shen, Qing; Liu, Ping; Wang, Xin; Royse, Daniel J
2008-11-01
Production costs for shiitake (Lentinula edodes) are on the rise in the United States due to increasing expenses including materials, labor and energy. Increased yield and improved bioconversion of raw materials may improve grower profit margins and may help reduce the cost of shiitake to the consumer. Two crops (Crop 1 and 2) of shiitake were grown to evaluate effects of three substrate moisture contents (50%, 55% and 60%), two log weights (2.7 and 3.2 kg) and three porosities of bag filter (low, medium and high) on mushroom yield (g/log) and biological efficiency (BE). Yield data were collected under controlled environmental conditions for two breaks. The formulation with 55% substrate moisture gave the highest yield and BE. Higher mushroom yields were produced from heavier logs (3.2 kg), but BE was not significantly affected. Filter porosity significantly affected yield and BE in Crop 1 but not in Crop 2. Significant interactions were observed for log moisture content x filter porosity for both crops. There were no significant two-way interactions observed for filter porosity x log weight or three-way interactions observed for moisture content x filter porosity x log weight. Maximum yields were obtained from 3.2 kg logs with a substrate moisture content of ca. 55% using medium or low porosity-filtered bags. This study may provide growers with additional information to better optimize production practices and become more efficient and competitive. PMID:18485700
1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING ...
1. GENERAL VIEW OF LOG POND AND BOOM FOR UNLOADING CEDAR LOGS FROM TRUCKS AT LOG DUMP, ADJACENT TO MILL; TRUCKS FORMERLY USED TRIP STAKES, THOUGH FOR SAFER HANDLING OF LOGS WELDED STAKES ARE NOW REQUIRED; AS A RESULT LOADING IS NOW DONE WITH A CRANE - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
32 CFR 700.845 - Maintenance of logs.
Code of Federal Regulations, 2011 CFR
2011-07-01
... AND OFFICIAL RECORDS UNITED STATES NAVY REGULATIONS AND OFFICIAL RECORDS The Commanding Officer Commanding Officers Afloat § 700.845 Maintenance of logs. (a) A deck log and an engineering log shall be... Naval Operations. (b) A compass record shall be maintained as an adjunct to the deck log. An...
Discovering the Local Landscape: Pioneer Log Buildings.
ERIC Educational Resources Information Center
Douglas, Bob; And Others
Building structures made from logs appeared in the eastern United States during the late 17th century, and immigrants from Sweden, Finland, and Germany are credited with their construction. There were two types of structures: the horizontal design introduced by the Scandinavians and the German or Pennsylvania Dutch model that was used by the…
Modelling tropical forests response to logging
NASA Astrophysics Data System (ADS)
Cazzolla Gatti, Roberto; Di Paola, Arianna; Valentini, Riccardo; Paparella, Francesco
2013-04-01
Tropical rainforests are among the ecosystems most threatened by large-scale fragmentation from human activity such as heavy logging and agricultural clearance, even though they provide crucial ecosystem goods and services, such as sequestering carbon from the atmosphere, protecting watersheds and conserving biodiversity. In several countries forest resource extraction has shifted from clearcutting to selective logging in order to maintain a significant forest cover and stock of living biomass. However, knowledge of the short- and long-term effects of removing selected species from tropical rainforests is scarce and needs further investigation. One of the main effects of selective logging on forest dynamics appears to be local disturbance, which involves the invasion of open space by weeds, vines and climbers at the expense of the late-successional cenosis. We present a simple deterministic model that describes the dynamics of a tropical rainforest subject to selective logging, in order to understand how and why weeds displace native species. We argue that the selective removal of the tallest tropical trees opens gaps of light that allow weeds, vines and climbers to prevail over native species, inhibiting recovery of the original vegetation. Our results show that different regime shifts may occur depending on the type of forest management adopted. This hypothesis is supported by a dataset of tree heights and weed/vine cover that we collected from 9 plots located in Central and West Africa, in both untouched and managed areas.
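The regime-shift behavior described above can be reproduced qualitatively with a toy deterministic model. The Euler-stepped Lotka-Volterra competition system below is a hypothetical sketch with invented parameters, not the authors' model: weeds suffer more from tree shading than trees suffer from weeds, and a constant harvest rate on trees stands in for selective logging.

```python
def simulate(T0, W0, harvest=0.0, steps=20000, dt=0.01):
    """Toy tree/weed competition: trees T and weeds W compete for light;
    the competitive effect of trees on weeds (1.6) exceeds the effect of
    weeds on trees (0.4).  `harvest` removes trees at a constant rate,
    mimicking selective logging of canopy trees."""
    T, W = T0, W0
    for _ in range(steps):
        dT = T * (1.0 - T - 0.4 * W) - harvest * T
        dW = 0.8 * W * (1.0 - W - 1.6 * T)
        T = max(T + dt * dT, 0.0)
        W = max(W + dt * dW, 0.0)
    return T, W

pristine = simulate(0.8, 0.1, harvest=0.0)  # trees exclude weeds
logged = simulate(0.8, 0.1, harvest=0.7)    # weeds take over the gaps
```

Sweeping `harvest` between these extremes moves the system through coexistence to weed dominance, illustrating how different management intensities can produce distinct regimes.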
There's Life in Those Dead Logs!
ERIC Educational Resources Information Center
Biggs, Devin; Miller, Todd; Hall, Dee
2006-01-01
Although it is unspectacular in appearance, dead wood is one of the most ecologically important resources in forests. Fallen logs, dead standing trees, stumps, and even cavities in live trees fulfill a wide range of roles. Prominent among these is that they provide habitat for many organisms, especially insects. Fourth-grade students at Fox…
The Design Log: A New Informational Tool
ERIC Educational Resources Information Center
Spivak, Mayer
1978-01-01
The design log is a record of observations, diagnoses, prescriptions, and performance specifications for each space in a structure. It is a systematic approach to design that integrates information about user needs with traditional architectural programming and design. (Author/MLF)
OPAC User Logs: Implications for Bibliographic Instruction.
ERIC Educational Resources Information Center
Kern-Simirenko, Cheryl
1983-01-01
Discusses characteristics of typical online public access catalogs (OPACs) and examines patron use via printouts of transaction logs for three separate systems. Desirable features of OPACs (mnemonic search commands, boolean operators, forgiveness or automatic truncation, browsing a subject heading index, suggestive prompts) and need for…
High temperature spectral gamma well logging
Normann, R.A.; Henfling, J.A.
1997-01-01
A high temperature spectral gamma tool has been designed and built for use in small-diameter geothermal exploration wells. Several engineering judgments are discussed regarding operating parameters, well model selection, and signal processing. An actual well log at elevated temperatures is given with spectral gamma reading showing repeatability.
Rolling Deck to Repository III: Shipboard Event Logging
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Arko, R.; Miller, S. P.; Dorsk, A.; McCue, S.; Fisichella, D. C.
2008-12-01
Data gathered during NSF-supported scientific research cruises represent an important component of the overall oceanographic data collection. The Rolling Deck to Repository (R2R) pilot project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. Improved access will be achieved through identification of best practices for shipboard data management, identification of standard metadata and data products from research cruises, development of metadata schemas to describe a research cruise, and development of a prototype data discovery system that could be used by the entire NSF-supported academic research fleet. Shoreside data managers will work collaboratively with ship technicians and vessel operators to develop approaches that scale from smaller coastal vessels to larger open ocean research vessels. One of the coordinated subprojects within the R2R project will focus on development of a shipboard event logging system that would incorporate best practice guidelines, a metadata schema and new and existing applications to generate a scientific sampling event log in the form of a digital text file. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the subsequent integration of data sets from individual investigators. In addition to providing a quick way to determine what types of data might have been collected during a cruise, the sampling event log can be used to visualize the relationship, both temporally and geospatially, between the diverse types of sampling events conducted during a research cruise. Research questions in marine ecology or modeling projects are inherently multi-disciplinary and require access to a variety
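A minimal sketch of the event-logging idea follows. The field names, tab-separated format, and cruise/event ID scheme here are hypothetical illustrations, not the R2R metadata schema:

```python
import itertools

def make_event_logger(cruise_id):
    """Each scientific sampling event gets a unique identifier and is
    recorded as one tab-separated line of a digital text log file."""
    counter = itertools.count(1)
    lines = []
    def log_event(time_utc, instrument, action, lat, lon):
        event_id = f"{cruise_id}-{next(counter):04d}"
        lines.append("\t".join(
            [event_id, time_utc, instrument, action, str(lat), str(lon)]))
        return event_id
    return log_event, lines

# Hypothetical cruise ID and sampling events:
log_event, lines = make_event_logger("RV0812")
log_event("2008-12-01T06:30Z", "CTD", "deploy", 41.52, -70.67)
log_event("2008-12-01T08:10Z", "CTD", "recover", 41.53, -70.66)
```

Because every line carries a unique event identifier plus time and position, individual investigators' data sets can later be joined on the event ID and visualized temporally and geospatially, which is the integration benefit noted above.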
Requirements-Driven Log Analysis Extended Abstract
NASA Technical Reports Server (NTRS)
Havelund, Klaus
2012-01-01
Imagine that you are tasked to help a project improve their testing effort. In a realistic scenario it will quickly become clear that having an impact is difficult. First of all, it will likely be a challenge to suggest an alternative approach which is significantly more automated and/or more effective than current practice. The reality is that an average software system has a complex input/output behavior. An automated testing approach will have to auto-generate test cases, each being a pair (i, o) consisting of a test input i and an oracle o. The test input i has to be somewhat meaningful, and the oracle o can be very complicated to compute. Second, even in the case where some testing technology has been developed that might improve current practice, it is likely difficult to completely change the current behavior of the testing team unless the technique is obviously superior and does everything already done by existing technology. So is there an easier way to incorporate formal methods-based approaches than the full-fledged test revolution? Fortunately the answer is affirmative. A relatively simple approach is to benefit from possibly already existing logging infrastructure, which after all is part of most systems put in production. A log is a sequence of events, generated by special log recording statements, most often manually inserted in the code by the programmers. An event can be considered as a data record: a mapping from field names to values. We can analyze such a log using formal methods, for example checking it against a formal specification. This separates running the system from analyzing its behavior. It is not meant as an alternative to testing since it does not address the important input generation problem. However, it offers a solution which testing teams might accept since it has low impact on the existing process. A single person might be assigned to perform such log analysis, compared to the entire testing team changing behavior.
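The kind of check described, a log as a sequence of field-to-value records verified against a property, can be sketched in a few lines. The events and the "every open is eventually closed" property below are invented for illustration, not drawn from the abstract's project:

```python
def check_open_close(log):
    """Check the property: every file that is opened is eventually closed.
    Each event is a record mapping field names to values.  Returns the set
    of files left open; an empty set means the property holds."""
    open_files = set()
    for event in log:
        if event["op"] == "open":
            open_files.add(event["file"])
        elif event["op"] == "close":
            open_files.discard(event["file"])
    return open_files

log = [
    {"op": "open", "file": "a.txt"},
    {"op": "open", "file": "b.txt"},
    {"op": "close", "file": "a.txt"},
]
violations = check_open_close(log)  # b.txt was opened but never closed
```

A single analyst can run such checks offline over production logs, which is precisely the low-impact adoption path the abstract argues for.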
Coal-log pipeline system development. Fifth quarterly report, August 25, 1991--November 25, 1991
Liu, H.
1991-12-01
Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal-logs, and tests the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)
Prensky, Stephen E.
1987-01-01
This report includes over 1,350 individual citations as well as a first-author index. The purpose of this bibliography is twofold, 1) to provide a basic, first-stop resource on well logging which the non-specialist, i. e. , geoscientist, can consult and, 2) to provide a reference on geologic applications for the non-geoscientist, i. e. , log analyst or petroleum engineer, as well as for the geoscientist.
Improved production log interpretation in horizontal wells using pulsed neutron logs
Brady, J.L.; Kohring, J.J.; North, R.J.
1996-12-31
Production log flow profiles provide a valuable tool to evaluate well and reservoir performance. Horizontal wellbores and their associated completion designs present several challenges to profile interpretation for conventional production logging sensors and techniques. A unique approach combining pulsed neutron capture (PNC) log data with conventional production logging measurements is providing improved flow profile answers in slotted liner, horizontal well completions on the North Slope of Alaska. Identifying and eliminating undesirable gas production is one of the chief goals of production logging on the North Slope. This process becomes difficult in horizontal wellbores as fluid segregation affects the area investigated by the various logging sensors and also the velocities of the individual phases. Typical slotted liner completions further complicate analysis as fluids are able to flow in the liner/openhole annulus. Analysis of PNC log data provides two good qualitative indicators of formation permeability. The first technique is derived from the difference of the formation sigma response before and after injecting a high-capture cross-section borax solution. The second technique uses the difference of the formation sigma response and the formation porosity measured while injecting the formation with crude or seawater. Further analysis of PNC log runs show that the two techniques closely correlate with production flow profiles under solution gas-oil ratio (GOR) conditions. These two techniques in combination with conventional production logging measurements of temperature, capacitance, pressure, and spinner improve flow profile results. PNC results can be combined with temperature and pressure data in the absence of valid spinner data to provide an approximate flow profile. These techniques have been used to successfully determine profiles in both cemented and slotted liner completions with GORs in excess of 15,000 scf/bbl.
Predicting hospital visits from geo-tagged Internet search logs
Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H.
2016-01-01
The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user’s future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources—a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
Neutron and gamma (density) logging in welded tuff
Lin, W
1998-09-12
This Technical Implementation Procedure (TIP) describes the field operation, and the management of data records pertaining to neutron logging and density logging in welded tuff. This procedure applies to all borehole surveys performed in support of Engineered Barrier System Field Tests (EBSFT), including the Large Block Tests (LBT) and Initial Engineered Barrier System Field Tests (IEBSFT) - WBS 1.2.3.12.4. The purpose of this TIP is to provide guidelines so that other equally trained and qualified personnel can understand how the work is performed or how to repeat the work if needed. The work will be documented by the use of Scientific Notebooks (SNs) as discussed in 033-YMP-QP 3.4. The TIP will provide a set of guidelines which the scientists will take into account in conducting the measurements. The use of this TIP does not imply that this is repetitive work that does not require professional judgment.
Predicting hospital visits from geo-tagged Internet search logs.
Agarwal, Vibhu; Han, Lichy; Madan, Isaac; Saluja, Shaurya; Shidham, Aaditya; Shah, Nigam H
2016-01-01
The steady rise in healthcare costs has deprived over 45 million Americans of healthcare services (1, 2) and has encouraged healthcare providers to look for opportunities to improve their operational efficiency. Prior studies have shown that evidence of healthcare seeking intent in Internet searches correlates well with healthcare resource utilization. Given the ubiquitous nature of mobile Internet search, we hypothesized that analyzing geo-tagged mobile search logs could enable us to machine-learn predictors of future patient visits. Using a de-identified dataset of geo-tagged mobile Internet search logs, we mined text and location patterns that are predictors of healthcare resource utilization and built statistical models that predict the probability of a user's future visit to a medical facility. Our efforts will enable the development of innovative methods for modeling and optimizing the use of healthcare resources-a crucial prerequisite for securing healthcare access for everyone in the days to come. PMID:27570641
The Soufriere Hills Volcano, Montserrat, is "log logistic"
NASA Astrophysics Data System (ADS)
Connor, C.; Sparks, S.; Mason, R.; Bonadonna, C.; Young, S.; Lejeune, A.
2002-12-01
The Soufriere Hills Volcano, Montserrat, experienced a remarkable episode of activity in September-October, 1997. During this period, 75 vulcanian explosions generated plumes that commonly rose to between 5 and 15 km a.s.l., accompanied by pyroclastic flows. Repose intervals between vulcanian explosions varied from 2.77 to 33.7 hr, with a median repose interval of 9.0 hr and a mean of 9.6 hr. During the eruption, this narrow range of repose intervals was used in a practical way to provide qualitative forecasts of volcanic hazard. We analyzed repose intervals for these 75 vulcanian explosions and discovered they fit a log logistic distribution with > 99% confidence. This comparatively simple, two-parameter model accounts for departures from a classical material failure model (Weibull distribution) at long repose intervals, and serves as the basis for improved hazard forecasts. The crucial differences between Weibull and log logistic probability models for volcano repose intervals are illustrated by comparing their hazard functions. For Weibull distributions the hazard increases indefinitely: the volcano must erupt explosively eventually, and as time increases the probability of an eruption in the next time interval becomes much greater. In contrast, the log logistic hazard function goes through a simple maximum. We note that the largest magnitude eruptions in the time series, deduced from column height estimates and seismic explosion amplitudes, correlate with the peak in the log logistic hazard function. The excellent model fit is explained in terms of two competing processes operating in the upper conduit on different time scales. Gas bubble pressure increases with time due to exsolution and due to rheological stiffening of magma following an abrupt decompression caused by a previous explosion. Once bubble gas pressure exceeds the tensile strength of the magma, an explosion occurs - a material failure model that should follow a Weibull distribution. However, this
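The contrast between the two hazard functions can be sketched numerically. The shape and scale values below are illustrative placeholders (the scale is set near the 9 hr median repose), not the paper's fitted Montserrat parameters:

```python
def weibull_hazard(t, shape, scale):
    # Weibull hazard h(t) = (k/lam) * (t/lam)**(k - 1):
    # for shape k > 1 it increases without bound.
    return (shape / scale) * (t / scale) ** (shape - 1)

def loglogistic_hazard(t, shape, scale):
    # Log-logistic hazard h(t) = (b/a) * (t/a)**(b - 1) / (1 + (t/a)**b):
    # for shape b > 1 it rises to a single maximum at t = a * (b - 1)**(1/b),
    # then decays -- the qualitative behavior described in the abstract.
    u = (t / scale) ** shape
    return (shape / t) * u / (1.0 + u)

# Illustrative parameters only (median repose ~9 hr suggests scale ~ 9).
shape, scale = 2.5, 9.0
peak = scale * (shape - 1) ** (1 / shape)  # hazard maximum for the log-logistic
```

Evaluating both hazards at short, median, and long repose times shows the Weibull hazard growing monotonically while the log-logistic hazard passes through its maximum near the peak time.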
Skewness for Maximum Likelihood Estimators of the Negative Binomial Distribution
Bowman, Kimiko O.
2007-01-01
The probability generating function of one version of the negative binomial distribution being (p + 1 - pt)^{-k}, we study elements of the Hessian and in particular Fisher's discovery of a series form for the variance of k, the maximum likelihood estimator, and also for the determinant of the Hessian. There is a link with the Psi function and its derivatives. Basic algebra is excessively complicated and a Maple code implementation is an important task in the solution process. Low order maximum likelihood moments are given and also Fisher's examples relating to data associated with ticks on sheep. Efficiency of moment estimators is mentioned, including the concept of joint efficiency. In an Addendum we give an interesting formula for the difference of two Psi functions.
A composite likelihood approach for spatially correlated survival data
Paik, Jane; Ying, Zhiliang
2013-01-01
The aim of this paper is to provide a composite likelihood approach to handle spatially correlated survival data using pairwise joint distributions. With e-commerce data, a recent question of interest in marketing research has been to describe spatially clustered purchasing behavior and to assess whether geographic distance is the appropriate metric to describe purchasing dependence. We present a model for the dependence structure of time-to-event data subject to spatial dependence to characterize purchasing behavior from the motivating example from e-commerce data. We assume the Farlie-Gumbel-Morgenstern (FGM) distribution and then model the dependence parameter as a function of geographic and demographic pairwise distances. For estimation of the dependence parameters, we present pairwise composite likelihood equations. We prove that the resulting estimators exhibit key properties of consistency and asymptotic normality under certain regularity conditions in the increasing-domain framework of spatial asymptotic theory. PMID:24223450
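The pairwise construction described above can be sketched with an FGM copula. Exponential margins and a single scalar dependence parameter are simplifying assumptions here; the paper models the dependence parameter as a function of pairwise geographic and demographic distances:

```python
import math
from itertools import combinations

def fgm_pair_loglik(t1, t2, rate, theta):
    # Log bivariate density under an FGM copula with exponential margins:
    # f(t1, t2) = f1 * f2 * [1 + theta * (1 - 2*F1) * (1 - 2*F2)], |theta| <= 1.
    F1 = 1.0 - math.exp(-rate * t1)
    F2 = 1.0 - math.exp(-rate * t2)
    log_f1 = math.log(rate) - rate * t1
    log_f2 = math.log(rate) - rate * t2
    return log_f1 + log_f2 + math.log(1.0 + theta * (1.0 - 2.0 * F1) * (1.0 - 2.0 * F2))

def composite_loglik(times, rate, theta):
    # Pairwise composite log-likelihood: sum the log pairwise densities
    # over all pairs of observations.
    return sum(fgm_pair_loglik(a, b, rate, theta) for a, b in combinations(times, 2))
```

With theta = 0 the pairs factorize, so the composite log-likelihood reduces to (n - 1) times the sum of the marginal log-densities, a convenient sanity check.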
Maximum-Likelihood Fits to Histograms for Improved Parameter Estimation
NASA Astrophysics Data System (ADS)
Fowler, J. W.
2014-08-01
Straightforward methods for adapting the familiar χ² statistic to histograms of discrete events and other Poisson distributed data generally yield biased estimates of the parameters of a model. The bias can be important even when the total number of events is large. For the case of estimating a microcalorimeter's energy resolution at 6 keV from the observed shape of the Mn K fluorescence spectrum, a poor choice of χ² can lead to biases of at least 10% in the estimated resolution when up to thousands of photons are observed. The best remedy is a Poisson maximum-likelihood fit, through a simple modification of the standard Levenberg-Marquardt algorithm for χ² minimization. Where the modification is not possible, another approach allows iterative approximation of the maximum-likelihood fit.
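The bias is easy to exhibit in the simplest case, a constant model fit to Poisson bin counts. The Neyman weighting below is one common "poor choice" of χ²; the counts are made-up illustrative data:

```python
def poisson_ml_constant(counts):
    # Poisson maximum-likelihood fit of a constant model m to bin counts n_i:
    # maximizing sum(n_i * log(m) - m) gives the unbiased sample mean.
    return sum(counts) / len(counts)

def neyman_chisq_constant(counts):
    # Minimizing Neyman's chi-square sum((n_i - m)**2 / n_i) instead gives the
    # harmonic mean, which is biased low for Poisson data (needs all n_i > 0).
    return len(counts) / sum(1.0 / n for n in counts)
```

For any counts that are not all equal, the harmonic mean falls below the arithmetic mean, so the chi-square fit systematically underestimates the true rate.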
A Targeted Maximum Likelihood Estimator for Two-Stage Designs
Rose, Sherri; van der Laan, Mark J.
2011-01-01
We consider two-stage sampling designs, including so-called nested case control studies, where one takes a random sample from a target population and completes measurements on each subject in the first stage. The second stage involves drawing a subsample from the original sample, collecting additional data on the subsample. This data structure can be viewed as a missing data structure on the full-data structure collected in the second-stage of the study. Methods for analyzing two-stage designs include parametric maximum likelihood estimation and estimating equation methodology. We propose an inverse probability of censoring weighted targeted maximum likelihood estimator (IPCW-TMLE) in two-stage sampling designs and present simulation studies featuring this estimator. PMID:21556285
Gaussian maximum likelihood and contextual classification algorithms for multicrop classification
NASA Technical Reports Server (NTRS)
Di Zenzo, Silvano; Bernstein, Ralph; Kolsky, Harwood G.; Degloria, Stephen D.
1987-01-01
The paper reviews some of the ways in which context has been handled in the remote-sensing literature, and additional possibilities are introduced. The problem of computing exhaustive and normalized class-membership probabilities from the likelihoods provided by the Gaussian maximum likelihood classifier (to be used as initial probability estimates to start relaxation) is discussed. An efficient implementation of probabilistic relaxation is proposed, suiting the needs of actual remote-sensing applications. A modified fuzzy-relaxation algorithm using generalized operations between fuzzy sets is presented. Combined use of the two relaxation algorithms is proposed to exploit context in multispectral classification of remotely sensed data. Results on both one artificially created image and one MSS data set are reported.
Maximum-likelihood registration of range images with missing data.
Sharp, Gregory C; Lee, Sang W; Wehe, David K
2008-01-01
Missing data are common in range images, due to geometric occlusions, limitations in the sensor field of view, poor reflectivity, depth discontinuities, and cast shadows. Using registration to align these data often fails, because points without valid correspondences can be incorrectly matched. This paper presents a maximum likelihood method for registration of scenes with unmatched or missing data. Using ray casting, correspondences are formed between valid and missing points in each view. These correspondences are used to classify points by their visibility properties, including occlusions, field of view, and shadow regions. The likelihood of each point match is then determined using statistical properties of the sensor, such as noise and outlier distributions. Experiments demonstrate high rates of convergence on complex scenes with varying degrees of overlap. PMID:18000329
Liu, H.
1995-12-31
During this quarter (1/1/95-9/30/95), major progress has been made in the following areas of coal log pipeline research, development and technology transfer: (1) Conceptual design of a test machine based on hydraulic presses to mass-produce 5.4-inch-diameter coal logs for testing in a 6-inch-diameter pipeline has been completed. (2) Conceptual design of a rotary-press machine to produce 1.9-inch-diameter coal logs for testing in a 2-inch-diameter pipeline has also been completed. (3) It has been confirmed through experiments that molds with round-edge exit can make logs as good as those made with tapered exit. (4) Conducted a study to determine the effect of surface condition of mold and lubricants on the quality of coal logs. (5) Completed an evaluation of the effect of fiber (wood pulp) on coal log quality. (6) Prepared an apparatus for testing fast compaction of coal logs -- 2 second per log. (7) Compacted coal logs in a 5.3-inch-diameter mold. (8) Completed a preliminary study to assess vacuum and steam heating systems to enhance coal log production and quality. (9) Changed the small-scale-CLP-demo loop from a once-through system to a recirculating system. (10) Completed revision of CLP economic model and revised the 1993 report.
High precision X-ray log N - log S distributions: implications for the obscured AGN population
NASA Astrophysics Data System (ADS)
Mateos, S.; Warwick, R. S.; Carrera, F. J.; Stewart, G. C.; Ebrero, J.; Della Ceca, R.; Caccianiga, A.; Gilli, R.; Page, M. J.; Treister, E.; Tedds, J. A.; Watson, M. G.; Lamer, G.; Saxton, R. D.; Brunner, H.; Page, C. G.
2008-12-01
Context: Our knowledge of the properties of AGN, especially those of optical type-2 objects, is very incomplete. Because extragalactic source count distributions are dependent on the cosmological and statistical properties of AGN, they provide a direct method of investigating the underlying source populations. Aims: We aim to constrain the extragalactic source count distributions over a broad range of X-ray fluxes and in various energy bands to test whether the predictions from X-ray background synthesis models agree with the observational constraints provided by our measurements. Methods: We have used 1129 XMM-Newton observations at |b|>20° covering a total sky area of 132.3 deg^2 to compile the largest complete samples of X-ray selected objects to date both in the 0.5-1 keV, 1-2 keV, 2-4.5 keV, 4.5-10 keV bands employed in standard XMM-Newton data processing and in the 0.5-2 keV and 2-10 keV energy bands more usually considered in source count studies. Our survey includes in excess of 30 000 sources and spans fluxes from 10^{-15} to 10^{-12} erg cm^{-2} s^{-1} below 2 keV and from 10^{-14} to 10^{-12} erg cm^{-2} s^{-1} above 2 keV where the bulk of the cosmic X-ray background energy density is produced. Results: The very large sample size we obtained means our results are not limited by cosmic variance or low counting statistics. A break in the source count distributions was detected in all energy bands except the 4.5-10 keV band. We find that an analytical model comprising 2 power-law components cannot adequately describe the curvature seen in the source count distributions. The shape of the log N(>S) - log S is strongly dependent on the energy band with a general steepening apparent as we move to higher energies. This is due to the fact that non-AGN populations, comprised mainly of stars and clusters of galaxies, contribute up to 30% of the source population at energies <2 keV and at fluxes ≥10^{-13} erg cm^{-2} s^{-1}, and these populations of objects have significantly flatter
Fast computation of genetic likelihoods on human pedigree data.
Goradia, T M; Lange, K; Miller, P L; Nadkarni, P M
1992-01-01
Gene mapping and genetic epidemiology require large-scale computation of likelihoods based on human pedigree data. Although computation of such likelihoods has become increasingly sophisticated, fast calculations are still impeded by complex pedigree structures, by models with many underlying loci and by missing observations on key family members. The current paper introduces a new method of array factorization that substantially accelerates linkage calculations with large numbers of markers. This method is not limited to nuclear families or to families with complete phenotyping. Vectorization and parallelization are two general-purpose hardware techniques for accelerating computations. These techniques can assist in the rapid calculation of genetic likelihoods. We describe our experience using both of these methods with the existing program MENDEL. A vectorized version of MENDEL was run on an IBM 3090 supercomputer. A parallelized version of MENDEL was run on parallel machines of different architectures and on a network of workstations. Applying these revised versions of MENDEL to two challenging linkage problems yields substantial improvements in computational speed. PMID:1555846
A maximum-likelihood estimation of pairwise relatedness for autopolyploids
Huang, K; Guo, S T; Shattuck, M R; Chen, S T; Qi, X G; Zhang, P; Li, B G
2015-01-01
Relatedness between individuals is central to ecological genetics. Multiple methods are available to quantify relatedness from molecular data, including method-of-moment and maximum-likelihood estimators. We describe a maximum-likelihood estimator for autopolyploids, and quantify its statistical performance under a range of biologically relevant conditions. The statistical performances of five additional polyploid estimators of relatedness were also quantified under identical conditions. When comparing truncated estimators, the maximum-likelihood estimator exhibited lower root mean square error under some conditions and was more biased for non-relatives, especially when the number of alleles per locus was low. However, even under these conditions, this bias was reduced to be statistically insignificant with more robust genetic sampling. We also considered ambiguity in polyploid heterozygote genotyping and developed a weighting methodology for candidate genotypes. The statistical performances of three polyploid estimators under both ideal and actual conditions (including inbreeding and double reduction) were compared. The software package POLYRELATEDNESS is available to perform this estimation and supports a maximum ploidy of eight. PMID:25370210
The X-ray log N-log S relation. [background radiation in extragalactic media
NASA Technical Reports Server (NTRS)
Boldt, Elihu
1989-01-01
Results from various surveys are reviewed as regards X-ray source counts at high galactic latitudes and the luminosity functions determined for extragalactic sources. Constraints on the associated log N-log S relation provided by the extragalactic X-ray background are emphasized in terms of its spatial fluctuations and spectrum as well as absolute flux level. The large number of sources required for this background suggests that there is not a sharp boundary in the redshift distribution of visible matter.
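A log N - log S relation of the kind discussed above is a cumulative power law, N(>S) ∝ S^{-α}. As a minimal sketch (not a method from the surveys reviewed here), the slope can be estimated from a flux-limited sample with the standard Pareto-type maximum-likelihood estimator:

```python
import math

def logn_logs_slope(fluxes, s_min):
    # Maximum-likelihood slope alpha for a cumulative power law N(>S) ∝ S**-alpha,
    # i.e. the standard Pareto estimator: alpha = n / sum(ln(S_i / S_min)).
    obs = [s for s in fluxes if s >= s_min]
    return len(obs) / sum(math.log(s / s_min) for s in obs)
```

Applied to fluxes drawn (here deterministically, via the inverse CDF on a uniform grid) from a power law of known slope, the estimator recovers that slope.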
Effects of pitch distance and likelihood on the perceived duration of deviant auditory events.
Kim, Elisa; McAuley, J Devin
2013-10-01
When a deviant (oddball) stimulus is presented within a series of otherwise identical (standard) stimuli, the duration of the oddball tends to be overestimated. Two experiments investigated factors affecting systematic distortions in the perceived duration of oddball stimuli. Both experiments used an auditory oddball paradigm where oddball tones varied in both their pitch distance from the pitch of a standard tone and their likelihood of occurrence. Experiment 1 revealed that (1) far-pitch oddballs were perceived to be longer than near-pitch oddballs, (2) effects of pitch distance were greater in low-likelihood conditions, and (3) oddballs in later serial positions were perceived to be longer than oddballs in earlier serial positions. The above effects held regardless of whether oddballs were higher or lower in pitch than the standard. Experiment 2 revealed a pattern of response times in an oddball detection task that generally paralleled the pattern of data observed in Experiment 1; across conditions, there was a negative correlation between detection times and perceived duration. Taken together, the results suggest that the observed effects of oddball pitch, likelihood, and position on perceived duration are at least partly driven by how quickly individuals are able to initiate timing the oddball following its onset. Implications for different theoretical accounts of the oddball effect are discussed. PMID:23801322
IQ-TREE: A Fast and Effective Stochastic Algorithm for Estimating Maximum-Likelihood Phylogenies
Nguyen, Lam-Tung; Schmidt, Heiko A.; von Haeseler, Arndt; Minh, Bui Quang
2015-01-01
Large phylogenomics data sets require fast tree inference methods, especially for maximum-likelihood (ML) phylogenies. Fast programs exist, but due to inherent heuristics to find optimal trees, it is not clear whether the best tree is found. Thus, there is a need for additional approaches that employ different search strategies to find ML trees and that are at the same time as fast as currently available ML programs. We show that a combination of hill-climbing approaches and a stochastic perturbation method can be time-efficiently implemented. If we allow the same CPU time as RAxML and PhyML, then our software IQ-TREE found higher likelihoods between 62.2% and 87.1% of the studied alignments, thus efficiently exploring the tree-space. If we use the IQ-TREE stopping rule, RAxML and PhyML are faster in 75.7% and 47.1% of the DNA alignments and 42.2% and 100% of the protein alignments, respectively. However, the range of obtaining higher likelihoods with IQ-TREE improves to 73.3-97.1%. IQ-TREE is freely available at http://www.cibiv.at/software/iqtree. PMID:25371430
Heinze, Georg; Ploner, Meinhard; Beyea, Jan
2013-12-20
In the logistic regression analysis of a small-sized, case-control study on Alzheimer's disease, some of the risk factors exhibited missing values, motivating the use of multiple imputation. Usually, Rubin's rules (RR) for combining point estimates and variances would then be used to estimate (symmetric) confidence intervals (CIs), on the assumption that the regression coefficients were distributed normally. Yet, rarely is this assumption tested, with or without transformation. In analyses of small, sparse, or nearly separated data sets, such symmetric CI may not be reliable. Thus, RR alternatives have been considered, for example, Bayesian sampling methods, but not yet those that combine profile likelihoods, particularly penalized profile likelihoods, which can remove first order biases and guarantee convergence of parameter estimation. To fill the gap, we consider the combination of penalized likelihood profiles (CLIP) by expressing them as posterior cumulative distribution functions (CDFs) obtained via a chi-squared approximation to the penalized likelihood ratio statistic. CDFs from multiple imputations can then easily be averaged into a combined CDF_c, allowing confidence limits for a parameter β at level 1 - α to be identified as those β* and β** that satisfy CDF_c(β*) = α/2 and CDF_c(β**) = 1 - α/2. We demonstrate that the CLIP method outperforms RR in analyzing both simulated data and data from our motivating example. CLIP can also be useful as a confirmatory tool, should it show that the simpler RR are adequate for extended analysis. We also compare the performance of CLIP to Bayesian sampling methods using Markov chain Monte Carlo. CLIP is available in the R package logistf. PMID:23873477
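The combination step just described can be sketched directly: average the per-imputation CDFs, then invert the combined CDF at α/2 and 1 - α/2. The Gaussian per-imputation CDFs below are stand-ins for the chi-squared-approximated penalized-profile CDFs of the actual method:

```python
import math

def combine_cdfs(cdfs):
    # Average the per-imputation posterior CDFs into the combined CDF_c.
    def cdf_c(beta):
        return sum(f(beta) for f in cdfs) / len(cdfs)
    return cdf_c

def invert_cdf(cdf, p, lo=-50.0, hi=50.0, tol=1e-8):
    # Bisection solve of cdf(beta) = p for a monotone CDF; the confidence
    # limits are invert_cdf(cdf_c, alpha/2) and invert_cdf(cdf_c, 1 - alpha/2).
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Stand-in per-imputation CDFs: Gaussians with slightly different centers.
cdfs = [lambda b, m=m: 0.5 * (1.0 + math.erf((b - m) / math.sqrt(2.0)))
        for m in (0.8, 1.0, 1.2)]
cdf_c = combine_cdfs(cdfs)
ci = (invert_cdf(cdf_c, 0.025), invert_cdf(cdf_c, 0.975))
```

By symmetry of these stand-in CDFs around 1.0, the combined CDF_c equals 0.5 there, and the resulting interval brackets that center.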
Measuring pad arrangement for a logging sonde
Vannier, D.; Tromelin, J.
1989-08-29
This patent describes a logging sonde for use in a borehole traversing an earth formation. The logging sonde comprising: an elongated sonde body; a plurality of measuring means for measuring a characteristic of the earth formation. Each of the measuring means comprising: a central element; a first measuring flap hingably connected to the central element; a second measuring flap hingably connected to the central element. The measuring flaps being disposed on either side of the central element, the first measuring flap staggered relative to the second measuring flap along the longitudinal direction of the sonde body; means operatively connected between the sonde body and the first and second measuring flaps for applying a resilient force to each of the measuring flaps, thereby tending to move the flaps away from the sonde body; and means connected between the sonde body and each of the measuring means for translocating the measuring means away from and back to the sonde body.
Quantifying logging residue - before the fact
Bones, J.T.
1982-06-01
Tree biomass estimation, which is being integrated into the U.S. Forest Service Renewable Resources Evaluation Program, will give foresters the ability to estimate the amount of logging residues they might expect from harvested treetops and branches and residual rough, rotten, and small trees before the actual harvest. With planning, and increased demand for such timber products as pulpwood and fuelwood, product recovery could be increased by up to 43% in softwood stands and 99% in hardwoods. Recovery levels affect gross product receipts and site preparation costs. An example of product recovery and residue generation is presented for three harvesting options in Pennsylvania hardwood stands. Under the whole-tree harvesting option, 46% more product was recovered than in single product harvesting, and logging residue levels were reduced by 58%.
INSPIRE and SPIRES Log File Analysis
Adams, Cole; /Wheaton Coll. /SLAC
2012-08-31
SPIRES, an aging high-energy physics publication data base, is in the process of being replaced by INSPIRE. In order to ease the transition from SPIRES to INSPIRE it is important to understand user behavior and the drivers for adoption. The goal of this project was to address some questions regarding the presumed two-thirds of the users still using SPIRES. These questions are answered through analysis of the log files from both websites. A series of scripts were developed to collect and interpret the data contained in the log files. Common search patterns are identified and usage comparisons are made between INSPIRE and SPIRES, and a method for detecting user frustration is presented. The analysis reveals a more even split than originally thought as well as the expected trend of user transition to INSPIRE.
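A toy version of such a log-analysis script might look like the following. The `/search?p=` URL pattern is a hypothetical stand-in, not the actual SPIRES or INSPIRE log format:

```python
import re
from collections import Counter

def top_queries(log_lines, n=3):
    # Tally search terms from request lines of the form 'GET /search?p=TERM ...'.
    # The URL pattern here is a hypothetical stand-in for the real log format.
    pat = re.compile(r"GET /search\?p=([^ &]+)")
    counts = Counter(m.group(1) for line in log_lines if (m := pat.search(line)))
    return counts.most_common(n)
```

Aggregating the most common terms per site is one simple way to compare search patterns between the two systems.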
LOTUS template for calculating well logs
Mitchell, R.J. ); Taylor, S.J. )
1993-09-01
Calculating well logs is a time-consuming process. This template uses input parameters consisting of well name, location (county and state), formation name, starting depth, repeat interval, resistivity of shale, and irreducible bulk volume water, which provide heading information for printouts. Required information from basic well logs are porosity, conductivity (optional), formation resistivity, resistivity of the formation water for the zone being calculated, resistivity of the mud filtrate, the porosity cutoff for pay in the zone being calculated, and the saltwater saturation cutoff for the pay zone. These parameters are used to calculate apparent water resistivity, saltwater saturation, bulk volume water, ratio of apparent water resistivity to input water resistivity, irreducible saltwater saturation, resistivity volume of shale, permeability, and a derived porosity value. A printout of the results is available through the LOTUS print function. Using this template allows maximum control of the input parameters and reduces hand calculation time.
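Quantities like apparent water resistivity, water saturation, and bulk volume water are conventionally computed from Archie's relations. The sketch below assumes the standard Archie form with default constants (a = 1, m = n = 2); the template's actual formulas may parameterize these differently:

```python
def apparent_rw(phi, rt, a=1.0, m=2.0):
    # Apparent water resistivity: Rwa = phi**m * Rt / a.
    return phi ** m * rt / a

def archie_sw(phi, rt, rw, a=1.0, m=2.0, n=2.0):
    # Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def bulk_volume_water(phi, sw):
    # Bulk volume water: BVW = phi * Sw.
    return phi * sw
```

For example, a zone with 25% porosity, Rt = 10 ohm-m, and Rw = 0.05 ohm-m gives Sw = sqrt(0.08), about 28% water saturation.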
Li, Xiaochun; Li, Huilin; Jin, Man; D Goldberg, Judith
2016-09-10
We consider the non-inferiority (or equivalence) test of the odds ratio (OR) in a crossover study with binary outcomes to evaluate the treatment effects of two drugs. To solve this problem, Lui and Chang (2011) proposed both an asymptotic method and a conditional method based on a random effects logit model. Kenward and Jones (1987) proposed a likelihood ratio test (LRT_M) based on a log linear model. These existing methods are all subject to model misspecification. In this paper, we propose a likelihood ratio test (LRT) and a score test that are independent of model specification. Monte Carlo simulation studies show that, in scenarios considered in this paper, both the LRT and the score test have higher power than the asymptotic and conditional methods for the non-inferiority test; the LRT, score, and asymptotic methods have similar power, and they all have higher power than the conditional method for the equivalence test. When data can be well described by a log linear model, the LRT_M has the highest power among all five methods (LRT_M, LRT, score, asymptotic, and conditional) for both non-inferiority and equivalence tests. However, in scenarios for which a log linear model does not describe the data well, the LRT_M has the lowest power for the non-inferiority test and has inflated type I error rates for the equivalence test. We provide an example from a clinical trial that illustrates our methods. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27095359
Thermal Properties of Bazhen fm. Sediments from Thermal Core Logging
NASA Astrophysics Data System (ADS)
Spasennykh, Mikhail; Popov, Evgeny; Popov, Yury; Chekhonin, Evgeny; Romushkevich, Raisa; Zagranovskaya, Dzhuliya; Belenkaya, Irina; Zhukov, Vladislav; Karpov, Igor; Saveliev, Egor; Gabova, Anastasia
2016-04-01
The Bazhen formation (B. fm.) is the largest self-contained source-and-reservoir continuous petroleum system, covering more than 1 million km^2 (West Siberia, Russia). High lithological differentiation in the Bazhen deposits, dominated by silicic shales and carbonates and accompanied by extremely high total organic carbon values (up to 35%), pyrite content, and a brittle mineralogical composition, complicates standard thermal property assessment for low-permeability rocks. Reliable information on the thermal characteristics of an unconventional system is a necessary part of work such as modelling of different processes in the reservoir under thermal EOR to assess their efficiency, developing and optimizing the design of oil recovery methods, interpreting well temperature logging data, and basin petroleum modelling. A unique set of data including thermal conductivity, thermal diffusivity, volumetric heat capacity, and thermal anisotropy for the B. fm. rocks was obtained from thermal core logging (high-resolution continuous thermal profiling) on more than 4680 core samples (including 2000 B. fm. samples) along seven wells for four oil fields. Some systematic peculiarities of the relation between the thermal properties of the B. fm. rocks and their mineralogical composition and structural and textural properties were obtained. The high-resolution data were processed jointly with the standard petrophysical logging, which allowed us to provide better separation of the formation. The research work was done with financial support of the Russian Ministry of Education and Science (unique identification number RFMEFI58114X0008).
Rill Erosion in Post Wildfire Forests after Salvage Logging
NASA Astrophysics Data System (ADS)
Robichaud, Peter; Wagenbrenner, Joseph; Brown, Robert
2016-04-01
Despite the dominance of concentrated flow or rill erosion in the erosion processes, especially in steep forest environments that have been affected by wildfire or management activities, few studies have quantified these effects on rill erosion. This study quantified the effects of wildfire and post-fire timber salvage operations on rill runoff quantity, runoff velocity, and rill erosion. Simulated rill experiments were conducted at various sites in the Western US after wildfire and timber salvage operations. The onsite conditions consisted of burned only, salvage logged, skid or snig trail, or skid trails with extra logging debris added. For each rill experiment, concentrated flow was applied at the top of the plot through an energy dissipater at five inflow rates for 12 min each. Runoff was sampled every 2 min, and runoff volume and sediment concentration were determined for each sample. The runoff velocity was measured using a dyed calcium chloride solution and two conductivity probes placed a known distance apart. Runoff volume, runoff velocities, and sediment concentrations increased with increasing levels of disturbance. The burned only plots had lower runoff rates and sediment concentrations than any of the other disturbances. The salvage logged plots had greater responses than the burned only plots, and the mitigation treatment had a marginal effect on runoff ratios, runoff velocities, and sediment concentrations. These results suggest that additional disturbance after a wildfire can increase the erosional response and that proper erosion control mitigation may be an important consideration for post-fire management to reduce onsite erosion.
Development of geothermal logging systems in the United States
Lysne, P.
1994-04-01
Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (1) there is a general lack of vetted, high-temperature instrumentation, and (2) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at a low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make decisions. Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimum training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium, and thorium is in the calibration phase, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A fluid sampling tool is in the design phase. All tools are designed for operation at conditions exceeding 400°C, and for deployment in the slim holes produced by mining-coring operations. Partnerships are being formed between the geothermal industry and scientific drilling programs to define and develop inversion algorithms relating raw tool data to more pertinent information. These cooperative efforts depend upon quality guidelines such as those under development within the international Ocean Drilling Program.
Reconsidering Data Logging in Light of Digital Forensics
NASA Astrophysics Data System (ADS)
Chou, Bin-Hui; Takahashi, Kenichi; Hori, Yoshiaki; Sakurai, Kouichi
Logs record the events that have happened within a system, so they are considered the history of system activities. They are among the objects that digital forensic investigators examine when a security incident happens. However, logs were initially created for troubleshooting and are not purposefully designed for digital forensics. Enormous and redundant log data thus make it complicated and time-consuming to find valuable information during analysis, and make logging-related techniques difficult to utilize in some systems, such as embedded systems. In this paper, we reconsider the data logging mechanism from a forensic standpoint and consequently propose purpose-based forensic logging. In purpose-based forensic logging, we collect only the logs required for a specific purpose, which decreases the space that logs occupy and may ease the analysis tasks during forensic investigations.
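The core idea of purpose-based logging, retaining only events relevant to a declared purpose, can be sketched as a simple filter at log time. This is an illustrative reading of the abstract, not the paper's design; the purpose names and event types below are made up.

```python
from dataclasses import dataclass, field

@dataclass
class PurposeBasedLogger:
    """Retain only the events relevant to one declared forensic purpose.

    `purpose_filters` maps a purpose name to the set of event types
    worth keeping for it (names here are hypothetical examples)."""
    purpose: str
    purpose_filters: dict
    records: list = field(default_factory=list)

    def log(self, event_type, detail):
        # Keep the event only if it matters for the active purpose;
        # everything else is dropped, shrinking the stored log.
        if event_type in self.purpose_filters.get(self.purpose, set()):
            self.records.append((event_type, detail))
```

A logger declared for a hypothetical "login-audit" purpose would keep authentication events and silently drop debug noise.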
Vexler, Albert; Tanajian, Hovig; Hutson, Alan D.
2016-01-01
In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
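The first of the three p-value methods above, classical Monte Carlo evaluation, can be sketched generically: simulate the test statistic under the null and count exceedances. This is not the vxdbel implementation; the "+1" continuity correction is a common convention for exact Monte Carlo tests, assumed here.

```python
import random

def monte_carlo_p_value(observed_stat, simulate_stat, n_sim=10000, rng=None):
    """Estimate a p-value by simulating the test statistic under H0.

    `simulate_stat(rng)` draws one statistic value under the null
    hypothesis; the (exceed + 1) / (n_sim + 1) form keeps the
    estimate strictly positive."""
    rng = rng or random.Random(0)
    exceed = sum(simulate_stat(rng) >= observed_stat for _ in range(n_sim))
    return (exceed + 1) / (n_sim + 1)
```

For example, with the statistic |mean of 10 Uniform(-1,1) draws|, an observed value of 0.9 is far in the tail while 0.0 is not.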
Quantitative Literacy: Working with Log Graphs
NASA Astrophysics Data System (ADS)
Shawl, S.
2013-04-01
The need to work with and understand different types of graphs is a common occurrence in everyday life. Examples include anything having to do with investments, being an educated juror in a case that involves evidence presented graphically, and understanding many aspects of our current political discourse. Within a science class, graphs play a crucial role in presenting and interpreting data. In astronomy, where the range of graphed values spans many orders of magnitude, log axes must be used and understood. Experience shows that students do not understand how to read and interpret log axes or how they differ from linear ones. Alters (1996), in a study of college students in an algebra-based physics class, found little understanding of log plotting. The purpose of this poster is to show the method and progression I have developed for use in my “ASTRO 101” class, with the goal of helping students better understand the H-R diagram, the mass-luminosity relationship, and digital spectra.
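The key property students miss, that equal distances along a log axis correspond to equal ratios rather than equal differences, can be made concrete with a short computation (a teaching sketch, not part of the poster):

```python
import math

def log_axis_position(value, axis_min, axis_max, length=1.0):
    """Fractional position of `value` along a log10 axis spanning
    [axis_min, axis_max], e.g. the x-coordinate on a plot of width
    `length`."""
    lo, hi = math.log10(axis_min), math.log10(axis_max)
    return length * (math.log10(value) - lo) / (hi - lo)
```

On an axis running from 1 to 10^4, the values 10 and 100 sit exactly as far apart as 100 and 1000, because each pair differs by the same factor of ten.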
Chiral gravity, log gravity, and extremal CFT
Maloney, Alexander; Song Wei; Strominger, Andrew
2010-03-15
We show that the linearizations of all exact solutions of classical chiral gravity around the AdS3 vacuum have positive energy. Nonchiral and negative-energy solutions of the linearized equations are infrared divergent at second order, and so are removed from the spectrum. In other words, chirality is confined and the equations of motion have linearization instabilities. We prove that the only stationary, axially symmetric solutions of chiral gravity are BTZ black holes, which have positive energy. It is further shown that classical log gravity--the theory with logarithmically relaxed boundary conditions--has finite asymptotic symmetry generators but is not chiral, and hence may be dual at the quantum level to a logarithmic conformal field theory (CFT). Moreover, we show that log gravity contains chiral gravity within it as a decoupled charge superselection sector. We formally evaluate the Euclidean sum over geometries of chiral gravity and show that it gives precisely the holomorphic extremal CFT partition function. The modular invariance and integrality of the expansion coefficients of this partition function are consistent with the existence of an exact quantum theory of chiral gravity. We argue that the problem of quantizing chiral gravity is the holographic dual of the problem of constructing an extremal CFT, while quantizing log gravity is dual to the problem of constructing a logarithmic extremal CFT.
Log-transforming the matter power spectrum
NASA Astrophysics Data System (ADS)
Greiner, M.; Enßlin, T. A.
2015-02-01
We investigate whether non-linear effects on the large-scale power spectrum of dark matter, namely the increase in small-scale power and the smearing of baryon acoustic oscillations, can be reduced by a log-transformation or emulated by an exponential transformation of the linear spectrum. To that end we present a formalism to convert the power spectrum of a log-normal field to the power spectrum of the underlying logarithmic Gaussian field, and vice versa. All ingredients of our derivation can already be found in various publications in cosmology and other fields; we follow a more pedagogical approach, providing a detailed derivation, application examples, and a discussion of implementation subtleties in one text. We use the formalism to show that the non-linear increase in small-scale power in the matter power spectrum is significantly smaller for the log-transformed spectrum, which fits the linear spectrum (with less than 20% error) for redshifts down to 1 and k ≤ 1.0 h Mpc^-1. For lower redshifts the fit to the linear spectrum is not as good, but the reduction of non-linear effects is still significant. Similarly, we show that applying the linear growth factor to the logarithmic density leads to an automatic increase in small-scale power for low redshifts, fitting third-order perturbation spectra and Cosmic Emulator spectra with an error of less than 20%. The smearing of baryon acoustic oscillations is at least three times weaker, but still present.
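At the heart of such a log-normal/Gaussian conversion is the classic correlation-function map xi_G = ln(1 + xi_LN) (Coles & Jones 1991); the full spectrum conversion additionally Fourier-transforms P(k) to xi(r) and back. The sketch below shows only the correlation-function step, under that standard convention, and is not the paper's formalism.

```python
import numpy as np

def xi_gauss_from_lognormal(xi_ln):
    """Correlation function of the underlying Gaussian (log) field
    from that of the log-normal field:  xi_G = ln(1 + xi_LN)."""
    return np.log1p(np.asarray(xi_ln, dtype=float))

def xi_lognormal_from_gauss(xi_g):
    """Inverse map:  xi_LN = exp(xi_G) - 1."""
    return np.expm1(np.asarray(xi_g, dtype=float))
```

The two maps are exact inverses, so a round trip returns the input correlation function unchanged.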
HALM: A Hybrid Asperity Likelihood Model for Italy
NASA Astrophysics Data System (ADS)
Gulia, L.; Wiemer, S.
2009-04-01
The Asperity Likelihood Model (ALM), first developed and currently being tested for California, hypothesizes that small-scale spatial variations in the b-value of the Gutenberg-Richter relationship play a central role in forecasting future seismicity (Wiemer and Schorlemmer, SRL, 2007). The physical basis of the model is the concept that the local b-value is inversely dependent on the applied shear stress. Thus, low b-values (b < 0.7) characterize the locked patches of faults (asperities) from which future mainshocks are more likely to be generated, whereas the high b-values (b > 1.1) found, for example, in creeping sections of faults suggest a lower seismic hazard. To test this model in a reproducible and prospective way suitable for the requirements of the CSEP initiative (www.cseptesting.org), the b-value variability is mapped on a grid. First, using the entire dataset above the overall magnitude of completeness, the regional b-value is estimated. This value is then compared to the one locally estimated at each grid node for a number of radii, and we use the local value if its likelihood score, corrected for the degrees of freedom using the Akaike Information Criterion, suggests we should do so. We are currently calibrating the ALM model for implementation in the Italian testing region, the first region within the CSEP EU Testing Center (eu.cseptesting.org) for which fully prospective tests of earthquake likelihood models will commence in Europe. We are also developing a modified approach, a 'hybrid' between a grid-based approach and a zoning one: the HALM (Hybrid Asperity Likelihood Model). According to HALM, the Italian territory is divided into three distinct regions depending on the main tectonic elements, combined with knowledge derived from GPS networks, seismic profile interpretation, borehole breakouts, and the focal mechanisms of events. The local b-value variability is thus mapped using three independent overall b-values. We evaluate the performance of the two models in
Communicating likelihoods and probabilities in forecasts of volcanic eruptions
NASA Astrophysics Data System (ADS)
Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas
2014-02-01
The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision-making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements and whether verbal or numerical terms are used. We present a review of the psychology literature on how the framing of information influences the communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time-window forecast, participants viewed the likelihood of an event occurring ‘today’ as less than that in year 10. Here we show that this skew in perception also occurs for short time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike for the long-time-window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects arising from the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time-window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in
Can log-periodic power law structures arise from random fluctuations?
NASA Astrophysics Data System (ADS)
Wosnitza, Jan Henrik; Leker, Jens
2014-05-01
Recent research has established log-periodic power law (LPPL) patterns prior to the detonation of the German stock index (DAX) bubble in 1998. The purpose of this article is to explore whether a Langevin equation extracted from real world data can generate synthetic time series with comparable LPPL structures. To this end, we first estimate the stochastic process underlying the DAX log-returns during the period from mid-1997 until end-2003. The employed data set contains about 3.93×10^6 intraday DAX quotes at a sampling rate of 15 s. Our results indicate that the DAX log-returns can be described as a Markov process. As a consequence, a Langevin equation is derived. Based on this model equation, we run extensive simulations in order to generate 100 synthetic DAX trajectories each covering 3000 trading days. We find LPPL behavior in ten artificial time series. Moreover, we can establish a link between LPPL patterns and ensuing bubble bursts in seven synthetic 600-week windows. However, the LPPL components in most synthetic trajectories differ fundamentally from those LPPL structures that have previously been detected in real financial time series. Summarized, this paper demonstrates that LPPL structures are not necessarily the signature of imitative behavior among investors but can also stem from noise, even though the likelihood of this is extremely low. Thus, our findings confirm with high statistical confidence that the LPPL structures in the DAX development are rooted deeper than only in the random fluctuations of the German stock market.
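The LPPL pattern referred to above has a standard first-order functional form in the literature: the log-price approaches a critical time tc as a power law decorated by log-periodic oscillations. The sketch below evaluates that textbook form; the parameter names follow common LPPL notation and are not this paper's fitted values.

```python
import math

def lppl(t, tc, m, omega, A, B, C, phi):
    """First-order log-periodic power law for the log-price at time
    t < tc:  A + B*(tc-t)**m + C*(tc-t)**m * cos(omega*ln(tc-t) - phi)."""
    dt = tc - t
    return A + B * dt ** m + C * dt ** m * math.cos(omega * math.log(dt) - phi)
```

At one time unit before the critical time (dt = 1) the power-law and logarithm terms simplify, so the value reduces to A + B + C*cos(-phi), which makes a convenient sanity check.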
Bicknell, Jake E; Struebig, Matthew J; Davies, Zoe G; Baraloto, Christopher
2015-01-01
Over 20% of the world's tropical forests have been selectively logged, and large expanses are allocated for future timber extraction. Reduced-impact logging (RIL) is being promoted as best practice forestry that increases sustainability and lowers CO2 emissions from logging, by reducing collateral damage associated with timber extraction. RIL is also expected to minimize the impacts of selective logging on biodiversity, although this is yet to be thoroughly tested. We undertake the most comprehensive study to date to investigate the biodiversity impacts of RIL across multiple taxonomic groups. We quantified birds, bats and large mammal assemblage structures, using a before-after control-impact (BACI) design across 20 sample sites over a 5-year period. Faunal surveys utilized point counts, mist nets and line transects and yielded >250 species. We examined assemblage responses to logging, as well as partitions of feeding guild and strata (understorey vs. canopy), and then tested for relationships with logging intensity to assess the primary determinants of community composition. Community analysis revealed little effect of RIL on overall assemblages, as structure and composition were similar before and after logging, and between logging and control sites. Variation in bird assemblages was explained by natural rates of change over time, and not logging intensity. However, when partitioned by feeding guild and strata, the frugivorous and canopy bird ensembles changed as a result of RIL, although the latter was also associated with change over time. Bats exhibited variable changes post-logging that were not related to logging, whereas large mammals showed no change at all. Indicator species analysis and correlations with logging intensities revealed that some species exhibited idiosyncratic responses to RIL, whilst abundance change of most others was associated with time. Synthesis and applications. Our study demonstrates the relatively benign effect of reduced
Changes in soil nitrogen cycling under Norway spruce logging residues on a clear-cut
NASA Astrophysics Data System (ADS)
Smolander, Aino; Lindroos, Antti-Jussi; Kitunen, Veikko
2016-04-01
In Europe, forest biomass is increasingly being used as a source of energy to replace fossil fuels. In practice, this means that logging residues, consisting of green branches and stem tops, are more commonly harvested. In 2012 logging residues were harvested from about one third of clear-cuts in Finland. Our aim was to study how logging residues affect soil organic matter quality, in particular soil N cycling processes and composition of certain groups of plant secondary compounds, tannins and terpenes. Compounds in these groups were of interest because they are abundant in logging residues, and they have been shown to control soil N cycling. In connection with clear-cutting a Norway spruce stand in southern Finland, we established a controlled field experiment by building logging residue piles (40 kg/m2) on study plots. The piles consisted of fresh spruce branches and tops with green foliage. Control plots with no residues were included (0 kg/m2). Changes in soil organic matter properties have now been monitored for three growing seasons. Logging residues affected organic layer properties strongly. For example, they increased net nitrification and nitrate concentrations. There were also increases in the concentrations of certain terpenes and condensed tannins due to the residues. The significance of logging residues on soil processes and properties will be shown.
On the hydrophobicity of peptides: Comparing empirical predictions of peptide log P values.
Thompson, Sarah J; Hattotuwagama, Channa K; Holliday, John D; Flower, Darren R
2006-01-01
Peptides are of great therapeutic potential as vaccines and drugs. Knowledge of physicochemical descriptors, including the partition coefficient logP, is useful for the development of predictive Quantitative Structure-Activity Relationships (QSARs). We have investigated the accuracy of available programs for the prediction of logP values for peptides with known experimental values obtained from the literature. Eight prediction programs were tested, of which seven were fragment-based methods: XLogP, LogKow, PLogP, ACDLogP, AlogP, Interactive Analysis's LogP and MlogP; one program, QikProp, used a whole-molecule approach. The predictive accuracy of the programs was assessed using r^2 values, with ALogP being the most effective (r^2 = 0.822) and MLogP the least (r^2 = 0.090). We also examined three distinct types of peptide structure: blocked, unblocked, and cyclic. For each study (all peptides, and the blocked, unblocked and cyclic subsets) the performance of the programs, rated from best to worst, is as follows: all peptides - ALogP, QikProp, PLogP, XLogP, IALogP, LogKow, ACDLogP, and MlogP; blocked peptides - PLogP, XLogP, ACDLogP, IALogP, LogKow, QikProp, ALogP, and MLogP; unblocked peptides - QikProp, IALogP, ALogP, ACDLogP, MLogP, XLogP, LogKow and PLogP; cyclic peptides - LogKow, ALogP, XLogP, MLogP, QikProp, ACDLogP, IALogP. In summary, all programs gave better predictions for blocked peptides, while, in general, logP values for cyclic peptides were under-predicted and those of unblocked peptides were over-predicted. PMID:17597897
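The accuracy measure used to rank the programs, the squared Pearson correlation r^2 between predicted and experimental logP values, is straightforward to compute; the sketch below assumes that plain definition (the paper does not spell out its formula).

```python
def r_squared(predicted, observed):
    """Squared Pearson correlation between predicted and observed
    values, e.g. program-predicted vs. experimental logP."""
    n = len(predicted)
    mp = sum(predicted) / n
    mo = sum(observed) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
    vp = sum((p - mp) ** 2 for p in predicted)
    vo = sum((o - mo) ** 2 for o in observed)
    return cov * cov / (vp * vo)
```

A perfect linear relationship gives r^2 = 1 regardless of slope or offset, which is why r^2 measures ranking accuracy rather than absolute agreement.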
Log-periodic route to fractal functions.
Gluzman, S; Sornette, D
2002-03-01
Log-periodic oscillations have been found to decorate the usual power-law behavior describing the approach to a critical point, when the continuous scale-invariance symmetry is partially broken into a discrete scale-invariance symmetry. For Ising or Potts spins with ferromagnetic interactions on hierarchical systems, the relative magnitude of the log-periodic corrections is usually very small, of order 10^-5. In growth processes [diffusion-limited aggregation (DLA)], rupture, earthquakes, and financial crashes, log-periodic oscillations with amplitudes of the order of 10% have been reported. We suggest a "technical" explanation for this four-order-of-magnitude difference based on the properties of the "regular function" g(x) embodying the effect of the microscopic degrees of freedom summed over in a renormalization group (RG) approach, F(x) = g(x) + μ^-1 F(γx), for an observable F as a function of a control parameter x. For systems for which the RG equation has not been derived, this equation can be understood as a Jackson q-integral, which is the natural tool for describing discrete scale invariance. We classify the "Weierstrass-type" solutions of the RG into two classes characterized by the amplitudes A_n of the power-law series expansion. These two classes are separated by a novel "critical" point. Growth processes (DLA), rupture, earthquakes, and financial crashes thus seem to be characterized by oscillatory or bounded regular microscopic functions that lead to a slow power-law decay of A_n, giving strong log-periodic amplitudes. If, in addition, the phases of A_n are ergodic and mixing, the observable presents self-affine nondifferentiable properties. In contrast, the regular function of statistical-physics models with "ferromagnetic"-type interactions at equilibrium involves unbounded logarithms of polynomials of the control variable that lead to a fast exponential decay of A_n, giving weak log-periodic amplitudes and smoothed observables.
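The RG equation F(x) = g(x) + μ^-1 F(γx) above is solved by the Weierstrass-type sum F(x) = Σ_n μ^-n g(γ^n x), which converges for bounded g and μ > 1. The sketch below evaluates a truncated version of that sum and checks the functional equation numerically; the choice g = cos and the parameter values are illustrative assumptions.

```python
import math

def weierstrass_rg(x, g, mu, gamma, n_terms=60):
    """Truncated solution  F(x) = sum_n mu**(-n) * g(gamma**n * x)
    of the RG equation  F(x) = g(x) + F(gamma * x) / mu.

    For a bounded "regular function" g the geometric weights make
    the sum converge whenever mu > 1; the truncation error is of
    order mu**(-n_terms)."""
    return sum(g(gamma ** n * x) / mu ** n for n in range(n_terms))
```

Plugging the truncated sum back into the RG equation, the residual F(x) - g(x) - F(γx)/μ is a single tail term of size ~μ^-60, i.e. negligible.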
The Posterior Distribution of the Likelihood Ratio as a Measure of Evidence
NASA Astrophysics Data System (ADS)
Smith, I.; Ferrari, A.
2011-03-01
This paper deals with simple versus composite hypothesis testing under Bayesian and frequentist settings. The Posterior distribution of the Likelihood Ratio (PLR) concept was proposed in [1] for significance testing. The PLR is shown to be equal to 1 minus the p-value in a simple case. The PLR is used in [2] to calibrate p-values, Fractional Bayes Factors (FBF) and others. Dempster's equivalence result is slightly extended by adding a nuisance parameter to the test. On the other hand, in [3] the p-values and the posterior probability of the null hypothesis Pr(H0|x) (seen as a Bayesian measure of evidence against the null hypothesis) are shown to be irreconcilable. Actually, as emphasized in [4], Pr(H0|x) is a measure of the accuracy of a test, not a measure of evidence in a formal sense, because it does not involve the likelihood ratio. The PLR may give such a measure of evidence and be related to a natural p-value. In this presentation, in a classical invariance framework, the PLR with inner threshold 1 will be shown to be equal to 1 minus a p-value whose test statistic is the likelihood, weighted by a term that accounts for a volume-distortion effect. Other analytical properties of the PLR will be proved in more general settings. The minimum of its support is equal to the Generalized Likelihood Ratio if H0 is nested in H1, and its moments are directly related to the (F)BF for a proper prior. Its relation to credible domains is also studied. Practical issues will also be considered. The PLR can be implemented using a simple Markov chain Monte Carlo scheme and will be applied to extrasolar planet detection using direct imaging.
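The "PLR equals 1 minus a p-value" statement can be illustrated in the simplest possible setting: x ~ N(theta, 1), H0: theta = 0, flat prior (so theta | x ~ N(x, 1)). The posterior probability that the likelihood of theta beats the null likelihood then equals 1 minus the two-sided p-value. This toy model and the direct posterior sampling below are illustrative assumptions, not the authors' setup; in realistic problems the posterior draws would come from an MCMC sampler.

```python
import math
import random

def plr(x, n_samples=200000, rng=None):
    """Monte Carlo estimate of  Pr( p(x|theta) > p(x|theta0=0) | x )
    for the toy model x ~ N(theta, 1) with a flat prior, so that the
    posterior is theta | x ~ N(x, 1).  In this simple case the value
    equals 1 minus the two-sided p-value."""
    rng = rng or random.Random(1)
    hits = 0
    for _ in range(n_samples):
        theta = rng.gauss(x, 1.0)       # posterior draw
        if (x - theta) ** 2 < x ** 2:   # log-lik(theta) > log-lik(0)
            hits += 1
    return hits / n_samples
```

For x = 1.5 the exact value is 1 - 2*(1 - Phi(1.5)) = erf(1.5/sqrt(2)), against which the Monte Carlo estimate can be checked.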