Transfer entropy as a log-likelihood ratio.
Barnett, Lionel; Bossomaier, Terry
2012-09-28
Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ² distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense. PMID:23030125
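For finite-state processes the claim above has a concrete plug-in form: the empirical transfer entropy computed from transition counts equals the log-likelihood ratio statistic divided by the sample size, so twice the sample size times the estimate is asymptotically χ²-distributed under the null. A minimal sketch for binary sequences with history length 1 (the coupling model and all names are illustrative, not taken from the paper):

```python
from collections import Counter
from math import log
import random

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(Y -> X) in nats, history length 1.

    For finite-state chains this equals the log-likelihood ratio statistic
    for the null 'Y does not help predict X', divided by the sample size.
    """
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))    # (x_next, x_prev, y_prev)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (xn, xp, yp), c in triples.items():
        p_full = c / pairs_xy[(xp, yp)]              # p(x_next | x_prev, y_prev)
        p_marg = pairs_xx[(xn, xp)] / singles[xp]    # p(x_next | x_prev)
        te += (c / n) * log(p_full / p_marg)
    return te

rng = random.Random(1)
y = [rng.randint(0, 1) for _ in range(20000)]
# x copies y's previous value 30% of the time, else is a fair coin
x = [0] + [y[t] if rng.random() < 0.3 else rng.randint(0, 1)
           for t in range(19999)]
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True: info flows Y -> X
```

With this coupling the estimate of TE(Y → X) is near its population value of about 0.046 nats, while TE(X → Y) shows only the small positive plug-in bias predicted by the χ² null distribution.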
Fast inference in generalized linear models via expected log-likelihoods
Ramirez, Alexandro D.; Paninski, Liam
2015-01-01
Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting “expected log-likelihood” (EL) can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina. PMID:23832289
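The computational gain described above is easiest to see in a Poisson regression with exponential nonlinearity and Gaussian covariates, where E[exp(θx)] has the closed form exp(θ²/2): the covariate sum in the exact log-likelihood collapses to a single term. A hedged sketch (the toy model and grid maximizer are illustrative, not the paper's estimators):

```python
import math
import random

def poisson(lam, rng):
    # Knuth's method; adequate for the moderate rates used here
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
N, theta_true = 5000, 0.8
xs = [rng.gauss(0.0, 1.0) for _ in range(N)]              # covariates ~ N(0, 1)
ys = [poisson(math.exp(theta_true * x), rng) for x in xs]

sxy = sum(y * x for x, y in zip(xs, ys))                  # sufficient statistic

def exact_ll(th):
    # exact Poisson log-likelihood (constants dropped): O(N) per evaluation
    return th * sxy - sum(math.exp(th * x) for x in xs)

def expected_ll(th):
    # E[exp(th*x)] = exp(th^2 / 2) for x ~ N(0, 1): O(1) per evaluation
    return th * sxy - N * math.exp(th * th / 2.0)

grid = [i / 200 for i in range(100, 221)]                 # theta in 0.5 .. 1.1
mle = max(grid, key=exact_ll)
mel = max(grid, key=expected_ll)
print(round(mle, 2), round(mel, 2))                       # both near 0.8
```

Each evaluation of the expected log-likelihood touches the data only through the precomputed statistic `sxy`, which is where the orders-of-magnitude savings come from when the covariate distribution is known.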
Posterior propriety for hierarchical models with log-likelihoods that have norm bounds
Michalak, Sarah E.; Morris, Carl N.
2015-07-17
Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).
MICCLLR: Multiple-Instance Learning Using Class Conditional Log Likelihood Ratio
NASA Astrophysics Data System (ADS)
El-Manzalawy, Yasser; Honavar, Vasant
Multiple-instance learning (MIL) is a generalization of the supervised learning problem where each training observation is a labeled bag of unlabeled instances. Several supervised learning algorithms have been successfully adapted for the multiple-instance learning settings. We explore the adaptation of the Naive Bayes (NB) classifier and the utilization of its sufficient statistics for developing novel multiple-instance learning methods. Specifically, we introduce MICCLLR (multiple-instance class conditional log likelihood ratio), a method for mapping each bag of instances to a single meta-instance using class conditional log likelihood ratio statistics such that any supervised base classifier can be applied to the meta-data. The results of our experiments with MICCLLR using different base classifiers suggest that no single base classifier consistently outperforms other base classifiers on all data sets. We show that a substantial improvement in performance is obtained using an ensemble of MICCLLR classifiers trained using different base learners. We also show that an extra gain in classification accuracy is obtained by applying AdaBoost.M1 to weak MICCLLR classifiers. Overall, our results suggest that the predictive performance of the three proposed variants of MICCLLR is competitive with some of the state-of-the-art MIL methods.
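One way to read the MICCLLR mapping is: fit per-feature class-conditional probabilities as a Naive Bayes model would, then summarize each bag by its instances' average per-feature log likelihood ratios, yielding one meta-instance per bag for any base classifier. The sketch below assumes binary features and assigns each instance its bag's label when fitting, a simplification that need not match the authors' estimator:

```python
from math import log

def fit_llr(bags, labels, d):
    # p[c][j] = P(feature j = 1 | class c), Laplace-smoothed, treating every
    # instance as carrying its bag's label (a simplifying sketch assumption)
    ones = {0: [1] * d, 1: [1] * d}
    tot = {0: 2, 1: 2}
    for bag, c in zip(bags, labels):
        for inst in bag:
            tot[c] += 1
            for j, v in enumerate(inst):
                ones[c][j] += v
    p = {c: [ones[c][j] / tot[c] for j in range(d)] for c in (0, 1)}

    def meta(bag):
        # map a bag to one meta-instance of per-feature mean LLRs
        m = [0.0] * d
        for inst in bag:
            for j, v in enumerate(inst):
                p1 = p[1][j] if v else 1 - p[1][j]
                p0 = p[0][j] if v else 1 - p[0][j]
                m[j] += log(p1 / p0)
        return [s / len(bag) for s in m]
    return meta

# toy data: positive bags tend to contain instances with feature 0 switched on
pos = [[[1, 0], [1, 1]], [[1, 0], [1, 0]]]
neg = [[[0, 0], [0, 1]], [[0, 1], [0, 0]]]
meta = fit_llr(pos + neg, [1, 1, 0, 0], 2)
print(meta(pos[0])[0] > 0 > meta(neg[0])[0])  # True
```

Any supervised learner can then be trained on the meta-instances, which is the point of the reduction.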
2. On-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Read, Randy J; McCoy, Airlie J
2016-03-01
The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517-525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124
An improved scheme based on log-likelihood-ratio for lattice reduction-aided MIMO detection
NASA Astrophysics Data System (ADS)
Song, Yunchao; Liu, Chen; Lu, Feng
2016-01-01
Lattice reduction (LR)-aided detectors have shown great potential in wireless communications for their low complexity and low bit-error-rate (BER) performance. The LR algorithms use the unimodular transformation to improve the orthogonality of the channel matrix. However, the LR algorithms utilize only the channel state information (CSI) and do not take account of the received signal, which is also important information for enhancing the performance of the detectors. In this paper, we make a readjustment of the received signal in the LR domain and propose a new scheme based on the log-likelihood-ratio (LLR) criterion to improve the LR-aided detectors. The motivation for using the LLR criterion is that it utilizes both the received signal and the CSI, so that it can provide exact pairwise error probabilities (PEPs) of the symbols. In the proposed scheme, we design the LLR-based transformation algorithm (TA), which uses the unimodular transformation to minimize the PEPs of the symbols under the LLR criterion. Note that the PEPs of the symbols affect the error propagation in the vertical Bell Laboratories Layered Space-Time (VBLAST) detector, and decreasing the PEPs can reduce this error propagation; thus, our LLR-based TA-aided VBLAST detectors exhibit better BER performance than previous LR-aided VBLAST detectors. Both the BER performance and the computational complexity are demonstrated through simulation results.
Geetha, Chinnaraj; Manjula, Puttabasappa
2014-03-01
The aim of the present study was to evaluate the use of the envelope difference index (EDI) and log-likelihood ratio (LLR) to quantify the independent and interactive effects of wide dynamic range compression, digital noise reduction and directionality, and to carry out self-rated quality measures. A recorded sentence embedded in speech spectrum noise at +5 dB signal to noise ratio was presented to a four channel digital hearing aid and the output was recorded with different combinations of algorithms at 30, 45 and 70 dB HL levels of presentation through a 2 cc coupler. EDI and LLR were obtained in comparison with the original signal using MATLAB software. In addition, thirty participants with normal hearing sensitivity rated the output on the loudness and clarity parameters of quality. The results revealed that the temporal changes happening at the output are independent of the number of algorithms activated together in a hearing aid. However, at a higher level of presentation, temporal cues are better preserved if all of these algorithms are deactivated. The spectral components of speech tend to be affected by the presentation level. The results also indicate the importance of quality rating, as this helps in considering whether the spectral and/or temporal deviations created in the hearing aid are desirable or not. PMID:26557357
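The EDI itself has a simple common formulation (assumed here; not necessarily the exact variant used in this study): normalize each amplitude envelope to unit mean, then take half the mean absolute difference, so identical envelopes score 0 and strongly mismatched ones approach 1:

```python
def edi(env1, env2):
    """Envelope Difference Index between two amplitude envelopes.

    Formulation assumed in this sketch: scale each envelope to unit mean,
    then EDI = mean(|e1 - e2|) / 2, so 0 means identical envelopes and
    values near 1 mean maximally different ones.
    """
    m1 = sum(env1) / len(env1)
    m2 = sum(env2) / len(env2)
    diffs = [abs(a / m1 - b / m2) for a, b in zip(env1, env2)]
    return sum(diffs) / (2 * len(diffs))

flat = [1.0] * 100                                   # unmodulated envelope
deep = [2.0 if i % 2 else 0.0 for i in range(100)]   # heavily modulated envelope
print(edi(flat, flat))            # 0.0
print(round(edi(flat, deep), 2))  # 0.5
```

A compression algorithm that flattens the envelope of the processed signal would therefore raise the EDI against the original, which is how the index captures temporal-envelope change.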
Pastor, José V; Arrègle, Jean; García, José M; Zapata, L Daniel
2007-02-20
A methodology for processing images of diesel sprays under different experimental situations is presented. The new approach has been developed for cases where the background does not follow a Gaussian distribution but shows a positive bias. In such cases, the lognormal and the gamma probability density functions have been considered for the background digital level distributions. Two different algorithms have been compared with the standard log-likelihood ratio test (LRT): a threshold defined from the cumulative distribution function of the background shows a noticeable improvement, but the best results are obtained with modified versions of the LRT algorithm adapted to non-Gaussian cases. PMID:17279134
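The simpler of the two compared approaches, a threshold taken from the background distribution's upper tail, can be sketched as follows for a lognormal background: fit μ and σ on the log intensities and classify pixels above the (1 − α) quantile as spray. All parameter values here are illustrative:

```python
import math
import random
from statistics import NormalDist

def lognormal_threshold(background, alpha=1e-3):
    """Segmentation threshold set at the (1 - alpha) quantile of a
    lognormal background model; pixels above it are classed as spray."""
    logs = [math.log(v) for v in background]
    mu = sum(logs) / len(logs)
    sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / len(logs))
    return math.exp(mu + NormalDist().inv_cdf(1.0 - alpha) * sigma)

rng = random.Random(42)
# positively biased, skewed background digital levels
bg = [math.exp(rng.gauss(3.0, 0.25)) for _ in range(100_000)]
thr = lognormal_threshold(bg)
frac = sum(v > thr for v in bg) / len(bg)
print(round(frac, 4))   # close to alpha = 0.001
```

The fraction of background pixels misclassified as spray matches the chosen α, which is the property a Gaussian-based threshold loses on skewed backgrounds.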
Kupper, Lawrence L.
2012-01-01
A common goal in environmental epidemiologic studies is to undertake logistic regression modeling to associate a continuous measure of exposure with binary disease status, adjusting for covariates. A frequent complication is that exposure may only be measurable indirectly, through a collection of subject-specific variables assumed associated with it. Motivated by a specific study to investigate the association between lung function and exposure to metal working fluids, we focus on a multiplicative-lognormal structural measurement error scenario and approaches to address it when external validation data are available. Conceptually, we emphasize the case in which true untransformed exposure is of interest in modeling disease status, but measurement error is additive on the log scale and thus multiplicative on the raw scale. Methodologically, we favor a pseudo-likelihood (PL) approach that exhibits fewer computational problems than direct full maximum likelihood (ML) yet maintains consistency under the assumed models without necessitating small exposure effects and/or small measurement error assumptions. Such assumptions are required by computationally convenient alternative methods like regression calibration (RC) and ML based on probit approximations. We summarize simulations demonstrating considerable potential for bias in the latter two approaches, while supporting the use of PL across a variety of scenarios. We also provide accessible strategies for obtaining adjusted standard errors to accompany RC and PL estimates. PMID:24027381
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
Loaiza-Brito, Oscar; Martin, Johannes; Nilles, Hans Peter; Ratz, Michael
2005-12-02
Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-
MAIL LOG, program theory, volume 2
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.
Experiment 2030. EE-2 Temperature Log and Downhole Water Sample
Grigsby, Charles O.
1983-07-29
A temperature log and downhole water sample run were conducted in EE-2 on July 13, 1983. The temperature log was taken to show any changes which had occurred in the fracture-to-wellbore intersections as a result of the Experiment 2020 pumping and to locate fluid entries for taking the water sample. The water sample was requested primarily to determine the arsenic concentration in EE-2 fluids (see memo from C. Grigsby, June 28, 1983, concerning arsenic in EE-3 samples). The temperature log was run using the thermistor in the ESS-6 water sampler.
Core log: Valles caldera No. 2A, New Mexico
Starguist, V.L.
1988-01-01
Scientific core hole VC-2A was drilled into the western ring-fracture zone at Sulphur Springs in the Valles caldera, New Mexico. VC-2A, the second scientific core hole in the caldera, was cored through a faulted and brecciated sequence of intracauldron tuffs and volcaniclastic rocks to a depth of 528 m. As of November 1, 1986, the unequilibrated bottom-hole temperature was 212°C. The rocks penetrated are intensely altered and host sub-ore grade stockwork molybdenite mineralization between 25 and 125 m. This report contains a detailed core log to aid researchers in their studies of the Valles caldera magma hydrothermal system. 3 refs., 2 figs.
A note on G2 log-aesthetic curves
NASA Astrophysics Data System (ADS)
Wo, Mei Seen; Gobithaasan R., U.; Miura, Kenjiro T.; Abbas, Muhammad
2015-12-01
Log-aesthetic curve (LAC) is a curve family composed of transcendental curves that includes the logarithmic spiral, clothoid, circle involute and Nielsen's spiral. They have linear logarithmic curvature graphs (LCGs) and are highly aesthetic. In order to implement G2 LAC in industrial design successfully, one needs guidance on whether a LAC segment satisfying given G2 Hermite data exists and is unique. This paper shows the existence and uniqueness of the solution for a single-segment G2 LAC. A LAC equation that incorporates both start and end curvatures and the end tangential angle is first derived. Then, the end points of the LAC segments are calculated using the derived LAC equation, which also represents the solution region of the LAC given a set of G2 Hermite data. The derived function is investigated for existence and uniqueness. It is shown that the solution region is a curve that does not self-intersect anywhere; thus, the solution of a single-segment G2 LAC is always unique.
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1981-01-01
An experiment was performed in which a complete X-ray survey was made of the 8.2 steradians of the sky at galactic latitudes |b| > 20 deg, down to a limiting sensitivity of 3.1 × 10⁻¹¹ erg cm⁻² s⁻¹ in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 were identified with extragalactic objects, and 7 remain unidentified. The log N - log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N - log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.
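The Euclidean relationship mentioned for the log N - log S curve is N(>S) ∝ S^(-3/2), which follows from uniformly distributed sources of fixed luminosity: sources brighter than S lie within a radius r ∝ S^(-1/2), and their number grows as r³. A quick numerical check (purely illustrative, unrelated to the survey data):

```python
import bisect
import math
import random

rng = random.Random(7)
n, L = 200_000, 1.0
# sources uniform in a unit sphere: r = U**(1/3), so flux S = L / (4 pi U**(2/3))
fluxes = sorted(L / (4 * math.pi * rng.random() ** (2 / 3)) for _ in range(n))

def log_n_above(s):
    # log10 of the number of sources with flux above s (fluxes sorted ascending)
    return math.log10(n - bisect.bisect_right(fluxes, s))

s1, s2 = fluxes[n // 2], fluxes[9 * n // 10]   # two limits well inside the sample
slope = (log_n_above(s2) - log_n_above(s1)) / (math.log10(s2) - math.log10(s1))
print(round(slope, 2))   # close to -1.5, the Euclidean slope
```

A measured slope flatter than -3/2 would indicate departure from a uniform Euclidean source distribution, which is why the fitted value is diagnostic.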
Maximum likelihood topographic map formation.
Van Hulle, Marc M
2005-03-01
We introduce a new unsupervised learning algorithm for kernel-based topographic map formation of heteroscedastic gaussian mixtures that allows for a unified account of distortion error (vector quantization), log-likelihood, and Kullback-Leibler divergence. PMID:15802004
Assessment of geophysical logs from borehole USW G-2, Yucca Mountain, Nevada
Nelson, P.H.; Schimschal, U.
1993-05-01
Commercial logging contractors Western Atlas, Schlumberger, and Edcon obtained borehole geophysical logs at the site of a potential high-level nuclear waste repository at Yucca Mountain, Nevada. Drill hole USW G-2 was picked for this test of suitable logging tools and logging technology, both representing state-of-the-art technology by these commercial companies. Experience gained by analysis of existing core data and a variety of logs obtained earlier by Birdwell and Dresser Atlas served as a guide in choosing the logs to be obtained. Logs were obtained in the water-filled borehole in zeolitized tuff (saturated zone) and in the air-filled borehole largely in unaltered welded tuff (unsaturated zone).
Drovandi, Christopher C; McCutchan, Roy A
2016-06-01
In this article we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel exact-approximate algorithm, which we refer to as alive SMC2. The advantages of this approach over competing methods are that it is naturally adaptive, it does not involve between-model proposals required in reversible jump Markov chain Monte Carlo, and does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series, and the cumulative number of prion disease cases in mule deer. PMID:26584211
Terwilliger, Thomas C.
2001-01-01
The recently developed technique of maximum-likelihood density modification [Terwilliger (2000), Acta Cryst. D56, 965–972] allows a calculation of phase probabilities based on the likelihood of the electron-density map to be carried out separately from the calculation of any prior phase probabilities. Here, it is shown that phase-probability distributions calculated from the map-likelihood function alone can be highly accurate and that they show minimal bias towards the phases used to initiate the calculation. Map-likelihood phase probabilities depend upon expected characteristics of the electron-density map, such as a defined solvent region and expected electron-density distributions within the solvent region and the region occupied by a macromolecule. In the simplest case, map-likelihood phase-probability distributions are largely based on the flatness of the solvent region. Though map-likelihood phases can be calculated without prior phase information, they are greatly enhanced by high-quality starting phases. This leads to the technique of prime-and-switch phasing for removing model bias. In prime-and-switch phasing, biased phases such as those from a model are used to prime or initiate map-likelihood phasing, then final phases are obtained from map-likelihood phasing alone. Map-likelihood phasing can be applied in cases with solvent content as low as 30%. Potential applications of map-likelihood phasing include unbiased phase calculation from molecular-replacement models, iterative model building, unbiased electron-density maps for cases where 2Fo − Fc or σA-weighted maps would currently be used, structure validation and ab initio phase determination from solvent masks, non-crystallographic symmetry or other knowledge about expected electron density. PMID:11717488
List-Mode Likelihood: EM Algorithm and Image Quality Estimation Demonstrated on 2-D PET
Barrett, Harrison H.
2010-01-01
Using a theory of list-mode maximum-likelihood (ML) source reconstruction presented recently by Barrett et al. [1], this paper formulates a corresponding expectation-maximization (EM) algorithm, as well as a method for estimating noise properties at the ML estimate. List-mode ML is of interest in cases where the dimensionality of the measurement space impedes a binning of the measurement data. It can be advantageous in cases where a better forward model can be obtained by including more measurement coordinates provided by a given detector. Different figures of merit for the detector performance can be computed from the Fisher information matrix (FIM). This paper uses the observed FIM, which requires a single data set, thus avoiding costly ensemble statistics. The proposed techniques are demonstrated for an idealized two-dimensional (2-D) positron emission tomography (PET) detector. We compute from simulation data the improved image quality obtained by including the time of flight of the coincident quanta. PMID:9688154
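The EM update for list-mode ML reconstruction takes the familiar MLEM form λ_j ← (λ_j / s_j) Σ_events A(i_e, j) / Σ_k A(i_e, k) λ_k, where s_j is the sensitivity of source element j. A toy 1-D sketch of that update (the system matrix and data are invented; for speed this sketch groups events sharing a bin, which full list-mode data would not allow):

```python
import random

# toy system matrix: A[i][j] = P(event lands in bin i | emission from pixel j)
A = [[0.7, 0.2, 0.0],
     [0.2, 0.5, 0.1],
     [0.1, 0.2, 0.3],
     [0.0, 0.1, 0.6]]
sens = [sum(A[i][j] for i in range(4)) for j in range(3)]   # all 1.0 here

rng = random.Random(3)
true_lam = [5.0, 1.0, 3.0]
events = []                            # list-mode data: one bin index per event
for _ in range(50_000):
    u = rng.random() * sum(true_lam)   # pick emitting pixel proportional to true_lam
    j = 0
    while u > true_lam[j]:
        u -= true_lam[j]
        j += 1
    u, i = rng.random(), 0             # pick detector bin from A[:, j]
    while u > A[i][j]:
        u -= A[i][j]
        i += 1
    events.append(i)

counts = [events.count(b) for b in range(4)]   # events sharing a bin share terms
lam = [1.0, 1.0, 1.0]                          # flat initial estimate
for _ in range(200):                           # MLEM iterations
    acc = [0.0, 0.0, 0.0]
    for i, c in enumerate(counts):
        denom = sum(A[i][k] * lam[k] for k in range(3))
        for j in range(3):
            acc[j] += c * A[i][j] / denom
    lam = [lam[j] * acc[j] / sens[j] for j in range(3)]

total = sum(lam)
print([round(v / total, 2) for v in lam])      # near [0.56, 0.11, 0.33]
```

The recovered intensities approach the true proportions 5:1:3; each EM iteration provably increases the list-mode likelihood, which is the property the paper builds on.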
Functional conservation between mammalian MGRN1 and plant LOG2 ubiquitin ligases
Guerra, Damian D.; Pratelli, Réjane; Kraft, Edward; Callis, Judy; Pilot, Guillaume
2014-01-01
Plant LOSS OF GDU 2 (LOG2) and mammalian MAHOGUNIN RING FINGER 1 (MGRN1) proteins are RING-type E3 ligases sharing similarity N-terminal to the RING domain. Deletion of this region disrupts the interaction of LOG2 with the plant membrane protein GLUTAMINE DUMPER 1 (GDU1). Phylogenetic analysis identified two clades of LOG2/MGRN1-like proteins in vertebrates and plants. The ability of MGRN1 to functionally replace LOG2 was tested. MGRN1 ubiquitylates GDU1 in vitro and can partially substitute for LOG2 in the plant, partially restoring amino acid resistance to a GDU1-myc over-expression, log2-1 background. Altogether, these results suggest a conserved function for the N-terminal domain in evolution. Structured summary of protein interactions: GDU1 physically interacts with rnMGRN1 by two hybrid (1, 2); rnMGRN1 binds to GDU1 by pull down; GDU6 physically interacts with rnMGRN1 by two hybrid (1, 2); GDU3 physically interacts with hsMGRN1 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU7 by two hybrid (1, 2); GDU5 physically interacts with hsMGRN1 by two hybrid (1, 2); hsMGRN1 physically interacts with GDU1 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU3 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU5 by two hybrid (1, 2); hsMGRN1 physically interacts with GDU7 by two hybrid (1, 2); GDU6 physically interacts with hsMGRN1 by two hybrid (1, 2); LOG2 binds to GDU1 by pull down (1, 2); hsMGRN1 physically interacts with GDU4 by two hybrid (1, 2); rnMGRN1 physically interacts with GDU2 by two hybrid (1, 2); GDU4 physically interacts with rnMGRN1 by two hybrid (1, 2); LOG2 physically interacts with GDU1 by two hybrid (1, 2, 3, 4); GDU2 physically interacts with hsMGRN1 by two hybrid (1, 2). PMID:24036454
In situ fluid typing and quantification with 1D and 2D NMR logging.
Sun, Boqin
2007-05-01
In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to data acquisition and inversion algorithm enhancement of NMR logging tools. T2 distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T2 peaks arising from different fluids with similar apparent T2 relaxation times. Nevertheless, the shapes of T2 distributions from different fluid components are often different and can be predetermined. Inversion with predetermined T2 distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which results in proton population distribution as a function of T2 relaxation time and fluid diffusion coefficient (or T1 relaxation time). Since diffusion coefficients (or T1 relaxation times) for different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We shall describe these techniques with field examples. PMID:17466778
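Fluid component decomposition with predetermined T2 shapes reduces, in the simplest two-component case, to a linear least-squares fit of the measured distribution onto the known component distributions. A sketch with invented Gaussian-in-log(T2) shapes (a crude stand-in for tool-derived shapes; peak positions are illustrative):

```python
import math

def shape(center, width, grid):
    # unit-area Gaussian on a log10(T2) grid; stands in for a predetermined shape
    g = [math.exp(-0.5 * ((t - center) / width) ** 2) for t in grid]
    s = sum(g)
    return [v / s for v in g]

def component_volumes(measured, comp_a, comp_b):
    """Two-component least-squares decomposition via the normal equations."""
    aa = sum(x * x for x in comp_a)
    bb = sum(x * x for x in comp_b)
    ab = sum(x * y for x, y in zip(comp_a, comp_b))
    am = sum(x * m for x, m in zip(comp_a, measured))
    bm = sum(x * m for x, m in zip(comp_b, measured))
    det = aa * bb - ab * ab
    va = (am * bb - bm * ab) / det
    vb = (bm * aa - am * ab) / det
    return max(va, 0.0), max(vb, 0.0)    # crude non-negativity clamp

grid = [i * 0.02 for i in range(200)]    # grid over log10(T2 / ms)
water = shape(1.0, 0.3, grid)            # water-like peak near 10 ms (illustrative)
oil = shape(2.5, 0.2, grid)              # oil-like peak near 300 ms (illustrative)
measured = [0.7 * w + 0.3 * o for w, o in zip(water, oil)]
va, vb = component_volumes(measured, water, oil)
print(round(va, 2), round(vb, 2))        # 0.7 0.3
```

The recovered volume fractions are exact here because the shapes are linearly independent and the mixture is noise-free; overlapping peaks and noise are precisely what makes the real inversion ill-conditioned.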
The Phylogenetic Likelihood Library
Flouri, T.; Izquierdo-Carrasco, F.; Darriba, D.; Aberer, A.J.; Nguyen, L.-T.; Minh, B.Q.; Von Haeseler, A.; Stamatakis, A.
2015-01-01
We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2–10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). PMID:25358969
Use of well logs to characterize fluid flow in the Maljamar CO/sub 2/ Pilot
Albright, J.C.
1984-09-01
The Maljamar CO/sub 2/ Pilot in Lea County, New Mexico, is a 5-acre inverted five spot. Two zones are being flooded. They are a Grayburg dolomitic sand at 3,700 feet and a San Andres dolomite at 4,050 feet. Two logging observation wells, completed with fiberglass casing through the section of interest, are located in line with the center injector and one of the corner producers. Nine months of freshwater injection in the center well was followed by nine months of brine. A series of induction logs monitored the passing of the fresh water/brine interface providing data for a preliminary characterization of flow in the zones. The brine also established a uniform salinity at the observation wells for saturation determination. Gamma emitting tracers were injected into each zone of the center well as part of a well-to-well tracer study. Frequent gamma ray logs were run in the observation wells to see whether the movement of the tracers could be detected and used to characterize water movement. The results were very encouraging and provided better vertical and time resolution than the induction logs. The numerous responding layers in each zone could be classified by tracer arrival times into only a few basic types. Injection of CO/sub 2/ and follow-up brine has been monitored with a series of induction and neutron logs to follow the changes in water and CO/sub 2/ saturation as the flood progressed.
NASA Astrophysics Data System (ADS)
He, Yi; Liwo, Adam; Scheraga, Harold A.
2015-12-01
Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much information as possible from the original all-atom representation of the biomolecular system, but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force field optimization method, which is based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves, is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.
Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods.
McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando
2010-01-01
Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown. Whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u that appears in the plane strain model cannot be measured; it is unreliably computed from measured data and can be shown to be an order-one quantity in units of kPa. We therefore present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first-order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349
The Likelihood Function and Likelihood Statistics
NASA Astrophysics Data System (ADS)
Robinson, Edward L.
2016-01-01
The likelihood function is a necessary component of Bayesian statistics but not of frequentist statistics. The likelihood function can, however, serve as the foundation for an attractive variant of frequentist statistics sometimes called likelihood statistics. We will first discuss the definition and meaning of the likelihood function, giving some examples of its use and abuse, most notably in the so-called prosecutor's fallacy. Maximum likelihood estimation is the aspect of likelihood statistics familiar to most people. When data points are known to have Gaussian probability distributions, maximum likelihood parameter estimation leads directly to least-squares estimation. When the data points have non-Gaussian distributions, least-squares estimation is no longer appropriate. We will show how the maximum likelihood principle leads to logical alternatives to least squares estimation for non-Gaussian distributions, taking the Poisson distribution as an example. The likelihood ratio is the ratio of the likelihoods of, for example, two hypotheses or two parameters. Likelihood ratios can be treated much like un-normalized probability distributions, greatly extending the applicability and utility of likelihood statistics. Likelihood ratios are prone to the same complexities that afflict posterior probability distributions in Bayesian statistics. We will show how meaningful information can be extracted from likelihood ratios by the Laplace approximation, by marginalizing, or by Markov chain Monte Carlo sampling.
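The Poisson point can be made concrete. For a toy model where counts y_i have Poisson means a·x_i (synthetic data, assumed for illustration), maximizing the Poisson log-likelihood yields a closed-form estimator that differs from the least-squares one:

```python
import numpy as np

# Synthetic counts: y_i ~ Poisson(a * x_i) with a = 2 (assumed toy model)
rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = rng.poisson(2.0 * x).astype(float)

# Least-squares estimate of a (implicitly assumes Gaussian errors)
a_ls = (x @ y) / (x @ x)

# Poisson maximum likelihood: setting d/da sum_i [y_i*log(a*x_i) - a*x_i] = 0
# gives the closed-form solution a = sum(y) / sum(x)
a_ml = y.sum() / x.sum()

print(a_ls, a_ml)  # the two estimators generally disagree for Poisson data
```

For Gaussian errors the two estimators would coincide; the discrepancy here is exactly the point made in the abstract about least squares being inappropriate for non-Gaussian data.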
Rolling Deck to Repository (R2R): A Research Cruise Prototype Event Logging System
NASA Astrophysics Data System (ADS)
Chandler, C. L.; Dorsk, A.
2009-12-01
Data gathered aboard research vessels coordinated by the University-National Oceanographic Laboratory System (UNOLS) represent an important component of the overall oceanographic data collection. The nascent Rolling Deck to Repository (R2R) project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. The ultimate vision of R2R is to assist in transforming the academic fleet into an integrated global observing system. One of the coordinated subprojects within the R2R project is development of a shipboard, scientific event logging system that would incorporate best practice guidelines, a metadata schema and new and existing applications to generate a scientific sampling event log in the form of a digital text file. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the subsequent integration of data sets from individual investigators. A prototype event logger application, based on ELOG, has been developed and tested and results and lessons learned from this development effort will be shared.
Rolling Deck to Repository (R2R): Research Cruise Event Logging System Update
NASA Astrophysics Data System (ADS)
Maffei, A. R.; Chandler, C. L.; Stolp, L.
2010-12-01
Data gathered aboard research vessels coordinated by the University-National Oceanographic Laboratory System (UNOLS) represent an important component of the overall oceanographic data collection. The Rolling Deck to Repository (R2R) project aims to improve access to basic shipboard data and ultimately reduce the work required to provide that access. The ultimate vision of R2R is to assist in transforming the academic fleet into an integrated global observing system. One of the coordinated subprojects within the R2R project is the development of a shipboard scientific event logging system that incorporates best practice guidelines, controlled vocabularies, a cruise metadata schema, and a scientific event log. The event log application will use new and existing components to generate a digital text file with a fleet-wide agreed upon format. A cruise event logging system enables researchers to record digitally all scientific sampling events and assign a unique event identifier to each entry. Decades of work conducted within large coordinated ocean research programs (JGOFS, GLOBEC, WOCE and RIDGE) have shown that creation of a shipboard sampling event log can facilitate greatly the ingestion of these data into oceanographic repositories and subsequent integration of data sets from individual investigators. An event logger application, based on ELOG, has been developed and is being tested by scientists at sea. An important aspect of this project is the development of a controlled vocabulary and ontology for documenting the scientific sampling events that occur during a research cruise. The controlled vocabulary development is being influenced by similar efforts in other communities. In addition, a Web application is being developed for oceanographers to use to pre-configure the R2R event logger application with vocabulary terms and entry forms appropriate to the specific needs of their upcoming research cruise. Results and lessons learned from the R2R event logger
ERIC Educational Resources Information Center
Fennell, Mary L.; And Others
This document is part of a series of chapters described in SO 011 759. This chapter reports the results of Monte Carlo simulations designed to analyze problems of using maximum likelihood estimation (MLE: see SO 011 767) in research models which combine longitudinal and dynamic behavior data in studies of change. Four complications--censoring of…
Glenn, W.E.; Hulen, J.B.; Nielson, D.L.
1981-02-01
Utah State Geothermal Well 9-1 in the Roosevelt Hot Springs KGRA, Beaver County, Utah, has been donated by Phillips Petroleum Company for calibration and testing of well-logging equipment in the hot, corrosive, geothermal environment. It is the second Calibration/Test Well (C/T-2) in the Geothermal Log Interpretation Program. A study of cuttings and well logs from Well C/T-2 was completed. This synthesis and data presentation contains most of the subsurface geologic information needed to effect the total evaluation of geophysical logs acquired in this geothermal calibration/test well, C/T-2.
Emulation of petroleum well-logging D-T2 correlations on a standard benchtop spectrometer.
Mitchell, J; Fordham, E J
2011-10-01
An experimental protocol is described that allows two-dimensional (2D) nuclear magnetic resonance (NMR) correlations of apparent diffusion coefficient D(app) and effective transverse relaxation time T(2,eff) to be acquired on a bench-top spectrometer using pulsed field gradients (PFG) in such a manner as to emulate D(app)-T(2,eff) correlations acquired using a well-logging tool with a fixed field gradient (FFG). This technique allows laboratory-scale NMR measurements of liquid-saturated cored rock to be compared directly to logging data obtained from the well by virtue of providing a comparable acquisition protocol and data format, and hence consistent data processing. This direct comparison supports the interpretation of the well-logging data, including a quantitative determination of the oil/brine saturation. The D-T(2) pulse sequence described here uses two spin echoes (2SE) with a variable echo time to encode for diffusion. The diffusion and relaxation contributions to the signal decay are then deconvolved using a 2D numerical inversion. This measurement allows shorter relaxation time components to be probed than in conventional diffusion measurements. A brief discussion of the numerical inversion algorithms available for inverting these non-rectangular data is included. The PFG-2SE sequence described is well suited to laboratory-scale studies of porous media and short T(2) samples in general. PMID:21875819
A new inversion method for (T2, D) 2D NMR logging and fluid typing
NASA Astrophysics Data System (ADS)
Tan, Maojin; Zou, Youlong; Zhou, Cancan
2013-02-01
One-dimensional nuclear magnetic resonance (1D NMR) logging technology has some significant limitations in fluid typing. However, not only can two-dimensional nuclear magnetic resonance (2D NMR) provide some accurate porosity parameters, but it can also identify fluids more accurately than 1D NMR. In this paper, based on the relaxation mechanism of (T2, D) 2D NMR in a gradient magnetic field, a hybrid inversion method that combines least-squares-based QR decomposition (LSQR) and truncated singular value decomposition (TSVD) is examined in the 2D NMR inversion of various fluid models. The forward modeling and inversion tests are performed in detail with different acquisition parameters, such as magnetic field gradients (G) and echo spacing (TE) groups. The simulated results are discussed and described in detail, the influence of the above-mentioned observation parameters on the inversion accuracy is investigated and analyzed, and the observation parameters in multi-TE activation are optimized. Furthermore, the hybrid inversion can be applied to quantitatively determine the fluid saturation. To study the effects of noise level on the hybrid method and inversion results, the numerical simulation experiments are performed using different signal-to-noise-ratios (SNRs), and the effect of different SNRs on fluid typing using three fluid models are discussed and analyzed in detail.
Goodall, John
2012-05-21
Log files are typically semi-structured or unstructured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g. intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file based on regular expressions into structured data (JSON format); (3) Output: writes structured data into commonly used formats, including Redis (a database), standard output, and syslog.
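The parse step can be sketched with a regular expression that turns a semi-structured log line into JSON. The pattern and field names below are illustrative (a common web-server access-log shape), not log-tool's actual configuration:

```python
import json
import re

# Hypothetical access-log pattern with named capture groups
APACHE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+) (?P<size>\d+)'
)

def parse_line(line):
    """Parse one log line into a JSON string, or None if it does not match."""
    m = APACHE_RE.match(line)
    return json.dumps(m.groupdict()) if m else None

line = '192.0.2.1 - - [21/May/2012:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1043'
print(parse_line(line))
```

The named groups are what make the output structured: each regular expression in the parser configuration effectively defines a record schema for the lines it matches.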
The likelihood ratio and frequency of DQ2/DQ8 haplotypes in Iranian patients with celiac disease
Khosravi, Asghar; Mansouri, Masoume; Rostami-Nejad, Mohammad; Shahbazkhani, Bijan; Ekhlasi, Golnaz; Kalantari, Ebrahim
2016-01-01
Aim: The aim of this study was to evaluate the likelihood ratio and frequency of DQ2 and DQ8 in Iranian patients with celiac disease (CD). Background: HLA DQ2 and HLA DQ8 are important mediators in the development of celiac disease. A few studies have evaluated the frequency of the HLA DQ2 and HLA DQ8 haplotypes among the Iranian population, with low sample sizes. Patients and methods: In this cross-sectional study, to predict HLA-DQ2 and DQ8 haplotypes, 141 (73 male, 78 female) confirmed CD patients and 151 healthy controls were enrolled during 2013-2014. HLA DQ2/DQ8 haplotypes were determined in cases and controls using the PCR-SSP technique. Results: DQ2 and DQ8 were positive in 80% (n=111) and 49% (n=69) of CD patients and 36% (n=61) and 13% (n=21) of the control group, respectively. Moreover, about one-third of CD patients (32.2%, n=45) carried both haplotypes, compared with only 5.3% (n=8) of the control group. The positive likelihood ratios of DQ2 and DQ8 were 1.74 (CI: 1.4-2.1) and 2.6 (CI: 1.8-2.7), respectively. Conclusion: The results of this study showed that the frequency of DQ8 in our population is higher than that reported in European countries, but close to the frequencies found in South America and the Middle East. This suggests that the HLA DQ8 pattern in Iranian CD patients is similar to that of non-European patients. PMID:26744610
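The positive likelihood ratio used above is sensitivity / (1 - specificity); a minimal sketch with illustrative counts (not the study's data):

```python
def positive_lr(tp, fn, fp, tn):
    """Positive likelihood ratio from 2x2 diagnostic counts."""
    sensitivity = tp / (tp + fn)          # P(test+ | disease)
    specificity = tn / (tn + fp)          # P(test- | no disease)
    return sensitivity / (1.0 - specificity)

# Illustrative marker: positive in 80 of 100 patients and 20 of 100 controls
print(round(positive_lr(tp=80, fn=20, fp=20, tn=80), 1))  # → 4.0
```

An LR+ of 4.0 means a positive result is four times as likely in a patient as in a control, which multiplies the pre-test odds of disease by four.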
Augmented Likelihood Image Reconstruction.
Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M
2016-01-01
The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim to reduce these artifacts by incorporating information about the shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The aforementioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, artifacts that appear temporarily are reduced with a bilateral filter, and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction. PMID:26208310
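The augmented-Lagrangian strategy for enforcing equality constraints can be illustrated on a toy problem rather than the CT log-likelihood itself; the objective, constraint, step sizes, and penalty parameter below are all assumptions for the sketch:

```python
import numpy as np

# Toy problem: minimize f(x) subject to c(x) = 0
def f(x):  return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
def c(x):  return x[0] + x[1] - 1.0            # equality constraint x0 + x1 = 1

lam, mu = 0.0, 10.0                            # multiplier and penalty (assumed)
x = np.zeros(2)
for _ in range(50):
    # Inner step: approximately minimize the augmented Lagrangian
    #   f(x) + lam*c(x) + (mu/2)*c(x)**2 by gradient descent
    for _ in range(200):
        g_f = np.array([2 * (x[0] - 2.0), 2 * (x[1] - 1.0)])
        g_c = np.array([1.0, 1.0])
        x -= 0.01 * (g_f + (lam + mu * c(x)) * g_c)
    lam += mu * c(x)                           # outer step: multiplier update

print(np.round(x, 3))  # converges to the constrained minimum at (1, 0)
```

In the reconstruction setting, f plays the role of the negative log-likelihood and c encodes the known implant shape and attenuation values; the multiplier update is what drives the constraint violation to zero without sending the penalty weight to infinity.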
Energy Science and Technology Software Center (ESTSC)
2013-08-01
This is an executable Python script which offers two different conversions for well log data: 1) conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file; 2) conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.
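As a rough illustration of this kind of conversion, the sketch below wraps a toy LAS 2.0 fragment in simple XML; the section parsing is minimal and the element names are illustrative, not the WellLog Content Model's actual schema:

```python
import xml.etree.ElementTree as ET

# Toy LAS 2.0 fragment: a ~CURVE section naming the channels, then ~ASCII data
las_ascii = """~CURVE
DEPT .M : depth
GR .GAPI : gamma ray
~ASCII
100.0 55.2
100.5 60.1
"""

def las_to_xml(text):
    """Convert the toy LAS fragment to an XML string (illustrative schema)."""
    curves, rows, section = [], [], None
    for line in (l.strip() for l in text.splitlines() if l.strip()):
        if line.startswith('~'):
            section = line[1:6].upper()           # e.g. CURVE, ASCII
        elif section == 'CURVE':
            curves.append(line.split('.')[0].strip())   # channel mnemonic
        elif section == 'ASCII':
            rows.append(line.split())
    root = ET.Element('WellLog')
    for row in rows:
        sample = ET.SubElement(root, 'Sample')
        for name, val in zip(curves, row):
            ET.SubElement(sample, name).text = val
    return ET.tostring(root, encoding='unicode')

print(las_to_xml(las_ascii))
```

A real converter would also carry across the ~Version and ~Well header sections and the curve units; this sketch only shows the curve-to-element mapping.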
Paillet, Frederick L.; Morin, R.H.
1988-01-01
A complete suite of conventional geophysical well logs was obtained in the upper part of a 3220-m-deep borehole drilled into geothermally altered alluvial sediments on the southeastern edge of the Salton Sea. Geophysical logs obtained in the State 2-14 borehole indicate that neutron porosity, gamma-gamma, and deep-induction logs provide useful information on lithologic trends with depth. The natural gamma log contains almost continuous, high-frequency fluctuations that obscure lithologic trends and that may be related to recent radioisotope redistribution and departure from radiometric equilibrium. Acoustic transit time logs give unrealistically low in situ compressional velocities ranging from 1.8 to 3.0 km/s, whereas acoustic waveform logs indicate that sediment compressional velocities range from less than 3.0 km/s shallower than 1000 m in depth to almost 5.0 km/s at depths greater than 2000 m. Analyses indicate that most log values lie between two lithologic end points: an electrically conductive claystone with moderate neutron porosity, but no effective porosity, and an electrically nonconductive, fully cemented siltstone that has small but finite porosity. -from Authors
ERIC Educational Resources Information Center
Jones, S.; And Others
1997-01-01
Discusses the use of transaction logging in Okapi-related projects to allow search algorithms and user interfaces to be investigated, evaluated, and compared. A series of examples is presented, illustrating logging software for character-based and graphical user interface systems, and demonstrating the usefulness of relational database management…
Condition and fate of logged forests in the Brazilian Amazon
Asner, Gregory P.; Broadbent, Eben N.; Oliveira, Paulo J. C.; Keller, Michael; Knapp, David E.; Silva, José N. M.
2006-01-01
The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16 ± 1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained. PMID:16901980
Model Fit after Pairwise Maximum Likelihood.
Barendse, M T; Ligtvoet, R; Timmerman, M E; Oort, F J
2016-01-01
Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations. PMID:27148136
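The pairwise idea, summing bivariate log-likelihoods over all item pairs instead of evaluating the full multivariate likelihood, can be sketched as follows; `model_prob` is a hypothetical user-supplied function returning the model-implied cell probabilities for a pair of items:

```python
import numpy as np
from itertools import combinations

def pairwise_loglik(data, model_prob):
    """Sum of bivariate multinomial log-likelihoods over all item pairs.

    data: (n, p) integer array of discrete responses coded 0..K-1.
    model_prob(i, j): model-implied (K, K) cell-probability table for items i, j.
    """
    n, p = data.shape
    K = int(data.max()) + 1
    total = 0.0
    for i, j in combinations(range(p), 2):
        table = np.zeros((K, K))                  # observed two-way table
        for a, b in zip(data[:, i], data[:, j]):
            table[a, b] += 1
        total += np.sum(table * np.log(model_prob(i, j)))
    return total

# Toy check: two binary items under an independence model with uniform cells
data = np.array([[0, 1], [1, 0], [0, 0], [1, 1]])
print(pairwise_loglik(data, lambda i, j: np.full((2, 2), 0.25)))
```

The computational gain is that each term involves only a K×K table, so the cost grows with the number of item pairs rather than with the number of possible multivariate response patterns.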
LIKELIHOOD OF THE POWER SPECTRUM IN COSMOLOGICAL PARAMETER ESTIMATION
Sun, Lei; Wang, Qiao; Zhan, Hu
2013-11-01
The likelihood function is a crucial element of parameter estimation. In analyses of galaxy overdensities and weak lensing shear, one often approximates the likelihood of the power spectrum with a Gaussian distribution. The posterior probability derived from such a likelihood deviates considerably from the exact posterior on the largest scales probed by any survey, where the central limit theorem does not apply. We show that various forms of Gaussian likelihoods can have a significant impact on the estimation of the primordial non-Gaussianity parameter f_NL from the galaxy angular power spectrum. The Gaussian plus log-normal likelihood, which has been applied successfully in analyses of the cosmic microwave background, outperforms the Gaussian likelihoods. Nevertheless, even if the exact likelihood of the power spectrum is used, the estimated parameters may still be biased. As such, the likelihoods and estimators need to be thoroughly examined for potential systematic errors.
Burns, K.L.
1987-07-01
Patterns in reflected sonic intensity recognized during examination of televiewer logs of basement gneiss at the Hot Dry Rock Site, Fenton Hill, New Mexico, are due to geological fractures and foliations and to incipient breakouts. These features are obscured by artifacts caused by wellbore ellipticity, tool off-centering, and tool oscillations. An interactive method, developed for extraction of the structural features (fractures and foliations), uses human perception as a pattern detector and a chi-square test of harmonic form as a pattern discriminator. From imagery of GT-2, 733 structures were recovered. The acceptance rate of the discriminator was 54%. Despite these positive results, the general conclusion of this study is that intensity-mode imagery from Fenton Hill is not directly invertible for geological information because of the complexity of the televiewer imaging process. Developing a forward model of the intensity-imaging process, or converting to caliper-mode imagery, or doing both, will be necessary for high-fidelity feature extraction from televiewer data.
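A planar fracture intersecting a borehole traces a sinusoid on the unwrapped televiewer image, so the harmonic-form screening can be sketched as a least-squares sinusoid fit followed by a chi-square statistic; the synthetic picks, assumed picking error, and degrees of freedom below are illustrative, not the study's procedure in detail:

```python
import numpy as np

# Synthetic fracture trace: depth as a function of azimuth around the hole,
# a sinusoid plus picking noise (all values illustrative)
az = np.deg2rad(np.arange(0, 360, 10))
depth = 1200.0 + 0.3 * np.sin(az + 0.8) \
        + np.random.default_rng(3).normal(0.0, 0.02, az.size)

# Linear least squares in the harmonic basis [1, sin(az), cos(az)]
X = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az)])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)
resid = depth - X @ coef

sigma = 0.02                              # assumed picking error (m)
chi2 = np.sum((resid / sigma) ** 2)       # compare with a chi2(n-3) quantile
print(np.round(coef, 3), round(chi2, 1))
```

A candidate feature whose chi-square statistic is consistent with chi2(n-3) passes the harmonic-form test; the fitted amplitude and phase then give the fracture dip magnitude and dip direction.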
Preliminary observations and logs of BARB 1 and BARB 2: komatiites from the Tjakastad site
NASA Astrophysics Data System (ADS)
Coetzee, Grace; Arndt, Nicholas; Wilson, Allan
2013-04-01
The BARB 1 and BARB 2 cores intersect a suite of komatiite flows and komatiitic basalts as well as fragmental rocks of the Komati Formation of the Onverwacht Group, Barberton Greenstone Belt. The cores give important and previously unattainable information on the structures, textures and contact relationships between individual komatiite flows and different lithological units within the flows. BARB 1 was drilled at -48° on a 5° azimuth to a depth of 419.9 m. This core contains a unique volcanic tumulus succession in the stratigraphically lower 100 m and the rest of the core consists of about 59 flows of spinifex-textured komatiite (1-3 m thick), massive komatiite (0.5-10 m thick), komatiitic basalt (1-9 m thick) and a single basalt layer (10 m thick), intruded by gabbro (0.5-2 m thick) and a single dolerite dyke (18 m thick). BARB 2, approximately 50 m from BARB 1 and parallel to it, was drilled at -45° on an 8° azimuth to a depth of 431.5 m. This core contains approximately 39 flows of komatiite (0.5-10 m thick) and komatiitic basalt (2-23 m thick) which contain possible selvages of pillows. Basalt flows are more numerous (0.3-4 m thick) in BARB 2 whilst gabbro (0.6-7 m thick) is less prevalent. The dolerite dyke observed in BARB 1 does not occur in BARB 2. As the Barberton strata young towards the east, the cores intersected the stratigraphy in a reverse sequence. The cores were drilled such that there exists a 141 m overlap in stratigraphy between them. The section 141 m from the base of BARB 1 should theoretically correlate with the top 141 m of BARB 2. However, this overlap is not evident in the core or in the core logs. A single gabbro layer appears to be lithologically correlatable between both holes. There is no apparent correlation between the pattern of the komatiite flows, leading to an initial conclusion that the komatiite flows were not laterally extensive or changed laterally in form over short distances. In both cores the proportion of komatiitic
CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs
Olea, R.A.
2004-01-01
The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams. © 2004 Elsevier Ltd. All rights reserved.
Energy Science and Technology Software Center (ESTSC)
2012-05-21
Log files are typically semi- or un-structured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: it will input data from files, standard input, and syslog; (2) Parser: it will parse the log file based on regular expressions into structured data (JSON format); (3) Output: it will output structured data into commonly used formats, including Redis (a database), standard output, and syslog.
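The Parser stage's regex-to-JSON idea can be sketched as follows; the pattern, field names, and sample line are illustrative (an Apache-style access-log format), not Log-tool's actual configuration:

```python
import json
import re

# Hypothetical pattern for an Apache-style access-log line; field names
# and the sample line below are illustrative, not Log-tool's own schema.
PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_line(line):
    """Parse one log line into structured data, or None if it doesn't match."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

record = parse_line(
    '127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326'
)
structured = json.dumps(record)  # newline-delimited JSON is one line per event
```

Unmatched lines returning None rather than raising lets a router send them to a fallback output for later inspection.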
Lyle, W.D.
1989-03-07
This patent describes a method for conducting a gravimetry survey of an earth formation, comprising the steps of: continuously traversing the formation with a gravity logging tool having at least two piezoelectric force transducers mounted at spaced-apart positions within the tool; exciting the piezoelectric transducers to vibrate at a characteristic resonant frequency; measuring the periods of vibration of the piezoelectric transducers as the logging tool continuously traverses the formation, the periods of vibration changing in response to the force exerted on the piezoelectric transducers by the acceleration of gravity and the acceleration due to tool motion along the formation; and determining the difference in the measured periods of vibration of the piezoelectric transducers, compensated for temperature, relating force to the periods of vibration within the formation.
Tibshirani, R.J.
1984-12-01
In this work, we extend the idea of local averaging to likelihood-based regression models. One application is in the class of generalized linear models (Nelder and Wedderburn, 1972). We enlarge this class by replacing the covariate form xβ with an unspecified smooth function s(x). This function is estimated from the data by a technique we call Local Likelihood Estimation - a type of local averaging. Multiple covariates are incorporated through a forward stepwise algorithm. In a number of real data examples, the local likelihood technique proves to be effective in uncovering non-linear dependencies. Finally, we give some asymptotic results for local likelihood estimates and provide some methods for inference.
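The local-averaging idea can be illustrated in the simplest case, a local-constant Bernoulli fit: maximizing the kernel-weighted log-likelihood over a single local parameter has a closed form, the weighted mean of the responses. The kernel, bandwidth, and data below are illustrative, not from the paper:

```python
import numpy as np

def local_constant_fit(x, y, x0, bandwidth):
    """Local-constant likelihood estimate of P(y=1 | x=x0).

    Maximizing sum_i w_i * [y_i log p + (1-y_i) log(1-p)] over the single
    local parameter p gives the kernel-weighted mean of y.
    """
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)  # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

# Simulate binary responses whose success probability is a smooth s(x)
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 2000)
p_true = 1.0 / (1.0 + np.exp(-2.0 * x))            # here s(x) is logistic
y = (rng.uniform(size=x.size) < p_true).astype(float)
p_hat = local_constant_fit(x, y, 0.0, bandwidth=0.5)  # estimate near x0 = 0
```

A local-linear variant replaces the single parameter with an intercept and slope fitted by weighted Newton steps; the local-constant case above is the zeroth-order member of that family.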
Inductive resistivity logging in steel-cased boreholes. SBIR Phase 2 progress report
Wilt, M.
1999-07-01
SBIR Phase 2 project 40145-97-I calls for the design and construction of a prototype inductive logging device to measure formation resistivity from within a steel-cased borehole. The SCIL (Steel Casing Induction Logger) tool is intended for reservoir characterization and process monitoring in an oil field environment. This report summarizes findings from the initial project period. In this phase, bench model measurements were made to test casing compensation schemes, numerical models were calculated to optimize the tool configuration and associated formation sensitivity and the preliminary design of the tool was completed. The bench tests constitute fundamental research on determining the characteristics of steel well casing and on developing means of separating the effects of the casing and the formation. This technology is crucial to the success of the project and significant progress has been made towards the goal of recovering the formation resistivity from inside the casing. Next, a series of sensitivity and tool configuration studies have been completed through partner Dr. David Alumbaugh at Sandia National Laboratories. These numerical results help to optimize the tool configuration and allow one to calculate the expected formation sensitivity. These models are preliminary to data interpretation software to be developed in the next project period. The initial hardware design of the tool has been completed, and ordering parts has begun for later manufacture and assembly. The tool, which is designed for maximum flexibility of deployment, will have a powerful transmitter, an array of three component sensors and sufficient dynamic range to operate in standard oil field steel-cased boreholes.
McGee, Steven
2002-01-01
Likelihood ratios are one of the best measures of diagnostic accuracy, although they are seldom used, because interpreting them requires a calculator to convert back and forth between “probability” and “odds” of disease. This article describes a simpler method of interpreting likelihood ratios, one that avoids calculators, nomograms, and conversions to “odds” of disease. Several examples illustrate how the clinician can use this method to refine diagnostic decisions at the bedside.
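The exact probability-odds-probability calculation that the article's shortcut avoids takes only a few lines; the pretest probability and likelihood ratio below are illustrative numbers:

```python
def posttest_probability(pretest_p, lr):
    """Exact Bayes update: probability -> odds, multiply by the
    likelihood ratio, then odds -> probability."""
    odds = pretest_p / (1.0 - pretest_p)
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Example: pretest probability 40%, positive finding with LR = 5
p = posttest_probability(0.40, 5.0)  # exact posttest probability
```

A bedside approximation of the kind the article describes replaces this arithmetic with fixed increments to the probability for benchmark LR values; the exact update above is what such a rule is calibrated against.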
Clow, G.D.; Saltus, R.W.; Waddington, E.D.
1996-01-01
We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.
Addendum 2: Logs of monitor wells drilled May 1988 through December 1992
Stout, J.; Qualheim, B.; McPherrin, R.; Barber, K.; Hedegaard, R.; McConihe, W.; Miller, T.
1993-11-01
The logs in this addendum were plotted in a new format by the same software package (LOGGER by Rockware, Denver, CO) that was used in the original publication. The scale remains the same, 1 inch = 15 feet. The header is totally automated with a subheading indexing the well-construction symbols. Geophysical curves are labeled in their respective channels, and percentage core recovery is plotted in a histogram. Lithologic symbols are plotted to scale in a channel similar to previous logs. The lithologic description also has been automated to assure consistency in terminology. Descriptions are more extensive and are referenced by leader lines to the lithologic symbol. Additional figures included for this Addendum are: a plot of all the monitoring well locations at the LLNL Main site and a plot detailing the gasoline spill area well locations in the vicinity of Building 403.
Whiley, Phillip J.; Parsons, Michael T.; Leary, Jennifer; Tucker, Kathy; Warwick, Linda; Dopita, Belinda; Thorne, Heather; Lakhani, Sunil R.; Goldgar, David E.; Brown, Melissa A.; Spurdle, Amanda B.
2014-01-01
Rare exonic, non-truncating variants in known cancer susceptibility genes such as BRCA1 and BRCA2 are problematic for genetic counseling and clinical management of relevant families. This study used multifactorial likelihood analysis and/or bioinformatically-directed mRNA assays to assess pathogenicity of 19 BRCA1 or BRCA2 variants identified following patient referral to clinical genetic services. Two variants were considered to be pathogenic (Class 5). BRCA1:c.4484G>C (p.Arg1495Thr) was shown to result in aberrant mRNA transcripts predicted to encode truncated proteins. The BRCA1:c.122A>G (p.His41Arg) RING-domain variant was found from multifactorial likelihood analysis to have a posterior probability of pathogenicity of 0.995, a result consistent with existing protein functional assay data indicating lost BARD1 binding and ubiquitin ligase activity. Of the remaining variants, seven were determined to be not clinically significant (Class 1), nine were likely not pathogenic (Class 2), and one was uncertain (Class 3). These results have implications for genetic counseling and medical management of families carrying these specific variants. They also provide additional multifactorial likelihood variant classifications as reference to evaluate the sensitivity and specificity of bioinformatic prediction tools and/or functional assay data in future studies. PMID:24489791
NASA Technical Reports Server (NTRS)
Peters, B. C., Jr.; Walker, H. F.
1975-01-01
A general iterative procedure is given for determining the consistent maximum likelihood estimates of normal distributions. In addition, a local maximum of the log-likelihood function, Newton's method, a method of scoring, and modifications of these procedures are discussed.
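As an illustration of the kind of iteration discussed (not the paper's own procedure), here is Newton's method applied to a Bernoulli log-likelihood in the logit parameterization, a case where Newton's method and Fisher's method of scoring coincide because observed and expected information are equal:

```python
import math

def newton_mle_logit(k, n, theta=0.0, iters=25):
    """Newton iterations on l(theta) = k*theta - n*log(1 + e^theta),
    the Bernoulli log-likelihood for k successes in n trials; converges
    to theta_hat = logit(k/n)."""
    for _ in range(iters):
        p = 1.0 / (1.0 + math.exp(-theta))
        score = k - n * p            # first derivative of log-likelihood
        info = n * p * (1.0 - p)     # observed = expected information here
        theta += score / info        # Newton / scoring step
    return theta

theta_hat = newton_mle_logit(30, 100)         # 30 successes out of 100
p_hat = 1.0 / (1.0 + math.exp(-theta_hat))    # back-transform to probability
```

Because the log-likelihood is concave in theta, the iteration converges to the unique maximizer from any reasonable starting point.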
Grid Logging: Best Practices Guide
Tierney, Brian L; Tierney, Brian L; Gunter, Dan
2008-04-01
The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers and Grid middleware itself. Currently, most generated log files are only useful to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API. It is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used. Other custom logging APIs, or even printf, could be used. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) Consistently structured, typed, log events; (2) A standard high-resolution timestamp; (3) Use of logging levels and categories to separate logs by detail and purpose; (4) Consistent use of global and local identifiers; and (5) Use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
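A hedged sketch combining recommendations (1), (2), (4), and (5) above; the field names and event naming are illustrative, not a schema prescribed by the guide:

```python
import json
import sys
import time
import uuid

# Global identifier shared by all events from this job (illustrative)
GLOBAL_ID = str(uuid.uuid4())

def log_event(event, level="INFO", **fields):
    """Emit one consistently structured, newline-delimited log event."""
    record = {
        "ts": "%.6f" % time.time(),  # high-resolution Unix timestamp
        "event": event,              # typed event name
        "level": level,              # logging level
        "gid": GLOBAL_ID,            # global identifier
        **fields,                    # local identifiers and details
    }
    sys.stdout.write(json.dumps(record, sort_keys=True) + "\n")
    return record

rec = log_event("transfer.start", level="DEBUG", file="data.bin", size=1048576)
```

One line per event keeps the output trivially parseable by downstream tools, which is the point of recommendation (5).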
Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data
King, K.C.
1979-01-01
The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well. The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map). Lithologic data are summarized from the sample descriptions of Smith and others (1976). Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool. Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
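The reflection-coefficient and synthetic-seismogram calculation mentioned above can be sketched as follows; the velocities, densities, and wavelet are toy values for illustration, not readings from the B-2 well:

```python
import numpy as np

def reflection_coefficients(velocity, density):
    """Normal-incidence reflection coefficients from acoustic impedance:
    R = (Z2 - Z1) / (Z2 + Z1) at each layer interface."""
    z = velocity * density
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

def synthetic_seismogram(rc, wavelet):
    """Convolve the reflectivity series with a source wavelet."""
    return np.convolve(rc, wavelet, mode="full")[: rc.size]

# Toy two-interface model (illustrative values only)
v = np.array([2000.0, 2500.0, 3000.0])   # interval velocities, m/s
rho = np.array([2.1, 2.3, 2.4])          # densities, g/cm^3
rc = reflection_coefficients(v, rho)
wavelet = np.array([1.0, -0.5])          # crude wavelet for the sketch
trace = synthetic_seismogram(rc, wavelet)
```

In practice the interval velocities come from the sonic log travel times and the reflectivity is convolved with an estimated source wavelet, which is the chain of steps the abstract summarizes.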
Atwood, J.W.; Killpack, T.J.; Glenn, W.E.; Nutter, C.
1980-11-01
WELLOG is a software system that has been developed to plot digitized well log data in a manner suitable for analysis. Multiple logs can be plotted side by side for correlation analysis, and up to three logs can be plotted on a cross plot. Data entry, editing, and modification functions are also provided by the program. Digitizing is accomplished by a TEKTRONIX 4954 (on-line) digitizing tablet, and plotting is done on a TEKTRONIX 4014 graphics terminal, a STATOS 42 electrostatic plotter, or a CALCOMP pen plotter using a device independent plotting system. This program (WELLOG.REV2) is not as system-dependent as the former version (WELLOG.REV1). The user must supply a program to digitize the data and supply subroutines to interface the program with file manipulation and plotting routines of their system. One major improvement is the use of an on-line digitizing system whereby the program accesses disk files rather than reading the data from tape. In REV2 the merge file has been automated such that the file is initialized automatically upon creation and also delete protected. The randomly spaced data capabilities have been greatly improved allowing the averaging and cross plotting of the data. Routines have been added which allow all of the cross plots excepting the Z-plot to be printed on a line printer. Dresser Atlas' A-K plot has also been added. The program is almost completely self-contained needing only a few interfacing and system subroutines.
Collett, T.S.
1999-01-01
The JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well project was designed to investigate the occurrence of in situ natural gas hydrate in the Mallik area of the Mackenzie Delta of Canada. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas-hydrate-bearing sediments. Downhole logging tool strings deployed in the Mallik 2L-38 well included the Schlumberger Platform Express with a high resolution laterolog, Array Induction Imager Tool, Dipole Shear Sonic Imager, and a Fullbore Formation MicroImager. The downhole log data obtained from the log- and core-inferred gas-hydrate-bearing sedimentary interval (897.25-1109.5 m log depth) in the Mallik 2L-38 well is depicted in a series of well displays. Also shown are numerous reservoir parameters, including gas hydrate saturation and sediment porosity log traces, calculated from available downhole well-log and core data. The gas hydrate accumulation delineated by the Mallik 2L-38 well has been determined to contain as much as 4.15 × 10^9 m3 of gas in the 1 km2 area surrounding the drill site.
An O(log^2 N) parallel algorithm for computing the eigenvalues of a symmetric tridiagonal matrix
NASA Technical Reports Server (NTRS)
Swarztrauber, Paul N.
1989-01-01
An O(log^2 N) parallel algorithm is presented for computing the eigenvalues of a symmetric tridiagonal matrix using a parallel algorithm for computing the zeros of the characteristic polynomial. The method is based on a quadratic recurrence in which the characteristic polynomial is constructed on a binary tree from polynomials whose degree doubles at each level. Intervals that contain exactly one zero are determined by the zeros of polynomials at the previous level which ensures that different processors compute different zeros. The exact behavior of the polynomials at the interval endpoints is used to eliminate the usual problems induced by finite precision arithmetic.
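For contrast with the paper's tree-based parallel construction, the classical serial approach to the same problem, bisection on a Sturm count derived from the three-term recurrence for the characteristic polynomials, can be sketched as:

```python
def count_less_than(d, e, x):
    """Sturm count: number of eigenvalues of the symmetric tridiagonal
    matrix (diagonal d, off-diagonal e) that are less than x, via the
    stable ratio form of the three-term recurrence."""
    count, q = 0, 1.0
    for i in range(len(d)):
        q = d[i] - x - (e[i - 1] ** 2 / q if i > 0 else 0.0)
        if q == 0.0:
            q = -1e-300  # nudge off the singular case
        if q < 0.0:
            count += 1
    return count

def kth_eigenvalue(d, e, k, lo, hi, tol=1e-12):
    """Bisection on the Sturm count isolates the k-th smallest eigenvalue
    inside [lo, hi], an interval containing all eigenvalues."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if count_less_than(d, e, mid) > k:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# 2x2 example [[2, -1], [-1, 2]] with eigenvalues 1 and 3
d, e = [2.0, 2.0], [-1.0]
ev0 = kth_eigenvalue(d, e, 0, 0.0, 4.0)
ev1 = kth_eigenvalue(d, e, 1, 0.0, 4.0)
```

The paper's contribution is replacing this inherently serial scan with a binary-tree evaluation of the characteristic polynomial so that the count, and hence the bisection, parallelizes.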
Likelihood and clinical trials.
Hill, G; Forbes, W; Kozak, J; MacNeill, I
2000-03-01
The history of the application of statistical theory to the analysis of clinical trials is reviewed. The current orthodoxy is a somewhat illogical hybrid of the original theory of significance tests of Edgeworth, Karl Pearson, and Fisher, and the subsequent decision theory approach of Neyman, Egon Pearson, and Wald. This hegemony is under threat from Bayesian statisticians. A third approach is that of likelihood, stemming from the work of Fisher and Barnard. This approach is illustrated using hypothetical data from the Lancet articles by Bradford Hill, which introduced clinicians to statistical theory. PMID:10760630
Likelihood Analysis for Mega Pixel Maps
NASA Technical Reports Server (NTRS)
Kogut, Alan J.
1999-01-01
The derivation of cosmological parameters from astrophysical data sets routinely involves operations counts which scale as O(N^3), where N is the number of data points. Currently planned missions, including MAP and Planck, will generate sky maps with N_d = 10^6 or more pixels. Simple "brute force" analysis, applied to such mega-pixel data, would require years of computing even on the fastest computers. We describe an algorithm which allows estimation of the likelihood function in the direct pixel basis. The algorithm uses a conjugate gradient approach to evaluate χ^2 and a geometric approximation to evaluate the determinant. Monte Carlo simulations provide a correction to the determinant, yielding an unbiased estimate of the likelihood surface in an arbitrary region surrounding the likelihood peak. The algorithm requires O(N_d^(3/2)) operations and O(N_d) storage for each likelihood evaluation, and allows for significant parallel computation.
Likelihood of meeting the EU limit values for NO 2 and PM 10 concentrations in the Netherlands
NASA Astrophysics Data System (ADS)
Velders, Guus J. M.; Diederen, Hub S. M. A.
In 2007, the European limit values for annual average nitrogen dioxide (NO 2) concentration and for daily average particulate matter (PM 10) concentration were exceeded along motorways and city streets in the Netherlands. While the road length along which the exceedance occurred is uncertain, model calculations show that the NO 2 concentration was likely to have been exceeded (chance >66%) along about 300 km and PM 10 concentration along about 75 km. In addition, the limit values were exceeded 'about as likely as not' (chance 33-66%) along a total of 1000 km for NO 2 and 1600 km for PM 10. PM 10 and NO 2 concentrations must be below the limit values everywhere in Europe, ultimately by 2011 and 2015, respectively. Since estimates of future local concentrations have an uncertainty of about 15-20%, no absolute statements can be made whether concentrations will be below the limit values within the specified time. Model calculations accounting for the effects of current and proposed national and European legislation, and using average meteorology for large-scale and local traffic contributions show strong decreases in likely limit value exceedances in the Netherlands. However, limit value exceedances are still possible (chance >33%) along about 350 km for PM 10 by 2011, and about 150 km for NO 2, by 2015. These possible exceedances depend not only on the uncertainties and on national and European policies and their effectiveness, but also on contributions by specific additional local measures. The Netherlands Government has proposed a plan, which includes local measures to meet the limit values everywhere, in time. Although not assessed here due to their specific character, such local measures could reduce exceedances. As the effects of local measures and estimates of concentrations are uncertain, continuous monitoring - possibly together with additional measures - will be needed to adhere to the limit values.
Real Time Mud Gas Logging During Drilling of DFDP-2B
NASA Astrophysics Data System (ADS)
Mathewson, L. A.; Toy, V.; Menzies, C. D.; Zimmer, M.; Erzinger, J.; Niedermann, S.; Cox, S.
2015-12-01
The Deep Fault Drilling Project (DFDP) aims to improve our understanding of the Alpine Fault Zone, a tectonically active mature fault system in New Zealand known to rupture in large events, by deep scientific drilling. The borehole DFDP-2B approached the Alpine Fault at depth, reaching a final depth of 892 m (820 m true vertical depth). Online gas analysis (OLGA) while drilling tracked changes in the composition of gases extracted from the circulating drill mud. The composition of fluids from fault zones can provide information about their origins, flow rates and -paths, fluid-rock interactions along these paths, and the permeability structure of the faulted rock mass. Apart from an atmospheric input, the gases in drilling mud derive from the pore space of rock, crushed at the drill bit, and from permeable layers intersected by the borehole. The rapid formation of mud wall cake seals the borehole from further fluid inflow, hence formation-derived gases enter mostly at the depth of the drill bit. OLGA analyses N2, O2, Ar, CO2, CH4, He, and H2 on a mass spectrometer, hydrocarbons CH4, C2H6, C3H8, i-C4H10, and n-C4H10 on a gas chromatograph, and Rn using a Lucas cell detector. Gas was sampled for offline analyses on noble gas and stable isotopes to complement the OLGA dataset. The principal formation-derived gases found in drilling mud during drilling of DFDP-2 were CO2 and CH4, with smaller components of H2 and He. High radon activity is interpreted to reflect intervals of active fluid flow through highly fractured and faulted rock. 3He/4He values in many samples were extremely air-contaminated, i.e. there was almost no excess of non-atmospheric He. The 3He/4He values measured at 236 m and 610 m, which are the only analyses with uncertainties <100%, are very similar to those measured in hot springs along the Alpine Fault, e.g. Fox River (0.64 Ra), Copland (0.42 Ra), Lower Wanganui (0.81 Ra). We will compare these data to those gathered using OLGA and discuss the
Hertel, Thomas W.; Ramankutty, Navin; Baldos, Uris Lantz C.
2014-01-01
There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing. PMID:25201962
2010-01-01
Background The development, in the last decade, of stochastic heuristics implemented in robust application software has made large phylogeny inference a key step in most comparative studies involving molecular sequences. Still, the choice of a phylogeny inference software is often dictated by a combination of parameters not related to the raw performance of the implemented algorithm(s) but rather by practical issues such as ergonomics and/or the availability of specific functionalities. Results Here, we present MetaPIGA v2.0, a robust implementation of several stochastic heuristics for large phylogeny inference (under maximum likelihood), including a Simulated Annealing algorithm, a classical Genetic Algorithm, and the Metapopulation Genetic Algorithm (metaGA) together with complex substitution models, discrete Gamma rate heterogeneity, and the possibility to partition data. MetaPIGA v2.0 also implements the Likelihood Ratio Test, the Akaike Information Criterion, and the Bayesian Information Criterion for automated selection of substitution models that best fit the data. Heuristics and substitution models are highly customizable through manual batch files and command line processing. However, MetaPIGA v2.0 also offers an extensive graphical user interface for parameters setting, generating and running batch files, following run progress, and manipulating result trees. MetaPIGA v2.0 uses standard formats for data sets and trees, is platform independent, runs in 32- and 64-bit systems, and takes advantage of multiprocessor and multicore computers. Conclusions The metaGA resolves the major problem inherent to classical Genetic Algorithms by maintaining high inter-population variation even under strong intra-population selection. Implementation of the metaGA together with additional stochastic heuristics into a single software package will allow rigorous optimization of each heuristic as well as a meaningful comparison of performances among these algorithms. MetaPIGA v2
Jerling, M; Merlé, Y; Mentré, F; Mallet, A
1994-01-01
Therapeutic drug monitoring data for nortriptyline (674 analyses from 578 patients) were evaluated with the nonparametric maximum likelihood (NPML) method in order to determine the population kinetic parameters of this drug and their relation to age, body weight and duration of treatment. Clearance of nortriptyline during monotherapy exhibited a large interindividual variability and a skewed distribution. A small, separate fraction with a very high clearance, constituting between 0.5% and 2% of the population, was seen in both men and women. This may be explained by the recent discovery of subjects with multiple copies of the gene encoding the cytochrome-P450-enzyme CYP2D6, which catalyses the hydroxylation of nortriptyline. However, erratic compliance with the prescription may also add to this finding. A separate distribution of low clearance values with a frequency corresponding to that of poor metabolizers of CYP2D6 (circa 7% in Caucasian populations) could not be detected. Concomitant therapy with drugs that inhibit CYP2D6 resulted in a major increase in the plasma nortriptyline concentrations. This was caused by a decrease in nortriptyline clearance, whereas the volume of distribution was unchanged. The demographic factors age and body weight had a minor influence on the clearance of nortriptyline which was also unaffected by the duration of treatment. PMID:7893588
NASA Astrophysics Data System (ADS)
Suh, Youngjoo; Kim, Hoirin
2014-12-01
In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
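A hedged sketch of frame-level log-likelihood scoring with per-frame weights; the acoustic-phonetic classifier that would produce the weights is omitted, and uniform weights recover the standard maximum-likelihood decision. The GMM parameters below are illustrative:

```python
import numpy as np

def gmm_frame_loglik(frames, means, variances, weights):
    """Per-frame log-likelihood under a diagonal-covariance GMM.
    frames: (T, D); means/variances: (M, D); weights: (M,)."""
    diff = frames[:, None, :] - means[None, :, :]              # (T, M, D)
    log_comp = (
        -0.5 * np.sum(diff ** 2 / variances
                      + np.log(2 * np.pi * variances), axis=2)
        + np.log(weights)
    )                                                          # (T, M)
    # Log-sum-exp over mixture components, done stably
    m = log_comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_comp - m).sum(axis=1, keepdims=True))).ravel()

def weighted_score(frames, gmm, frame_weights):
    """Accumulate frame scores under per-frame weights; the identification
    decision picks the speaker model with the highest weighted score."""
    return np.sum(frame_weights * gmm_frame_loglik(frames, *gmm))

rng = np.random.default_rng(0)
frames = rng.standard_normal((100, 4))                 # toy feature frames
gmm = (np.zeros((2, 4)), np.ones((2, 4)), np.array([0.5, 0.5]))
score = weighted_score(frames, gmm, np.ones(100))      # uniform weights
```

Discriminative training of the frame weights, rather than the uniform choice shown here, is the paper's contribution.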
Cetron, Martin
2016-03-18
Since May 2015, when Zika virus, a flavivirus transmitted primarily by Aedes aegypti mosquitoes, was reported in Brazil, the virus has rapidly spread across the Region of the Americas and the Caribbean. The association between maternal Zika virus infection and adverse fetal and reproductive outcomes, including microcephaly, prompted CDC to issue a Level 2 alert travel notice* for the 37 countries and U.S. territories (at the national and territorial level) that have reported recent Zika virus transmission as of March 11, 2016. In addition to mosquito bite precautions for all travelers, CDC advises that pregnant women postpone travel to affected countries and U.S. territories. Within a nation's borders, ecologic characteristics, which determine the distribution of mosquito vectors, can vary considerably. CDC conducted a spatial analysis, focusing on the probability of occurrence of Ae. aegypti, to support the demarcation for subnational travel alerts. Based on results of this analysis, travel that is limited to elevations higher than 2,000 m (6,562 ft) above sea level is considered to have minimal (approximately 1%) likelihood for mosquito-borne Zika virus transmission, even within countries reporting active transmission. Women who are pregnant should avoid travel to elevations <2,000 m in countries with active Zika virus transmission. PMID:26985965
SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping
NASA Technical Reports Server (NTRS)
Cowart, Hugh S.; Scott, David W.
2014-01-01
A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - A log keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access, comment, and notification features similar to those found in Social Networking Systems (SNS), b) Cross-Log Communication via Social Techniques - A concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees, and c) Communications Dashboard (CommDash) - An MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses potential for using Cross-Log Communications in both MCC-H and/or POIC environments, and considers other ways for synergizing console applications.
Doughty, Christine; Takeuchi, Shinji; Amano, Kenji; Shimo, Michito; Tsang, Chin-Fu
2004-10-04
In the flowing fluid electric conductivity (FEC) logging method, wellbore fluid is replaced with de-ionized water, following which FEC profiles in the wellbore are measured at a series of times while the well is pumped at a constant rate. Locations where fluid enters the wellbore show peaks in the FEC logs, which may be analyzed to infer inflow strengths and salinities of permeable features intersected by the wellbore. In multi-rate flowing FEC logging, the flowing FEC logging method is repeated using two or more pumping rates, which enables the transmissivities and inherent pressure heads of these features to be estimated as well. We perform multi-rate FEC logging on a deep borehole in fractured granitic rock, using three different pumping rates. Results identify 19 hydraulically conducting fractures and indicate that transmissivity, pressure head, and salinity vary significantly among them. By using three pumping rates rather than the minimum number of two, we obtain an internal consistency check on the analysis that provides a measure of the uncertainty of the results. Good comparisons against static FEC profiles and against independent chemical, geological, and hydrogeological data have further enhanced confidence in the results of the multi-rate flowing FEC logging method.
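The multi-rate step described above can be illustrated with a small sketch, assuming a hypothetical linear inflow model q = T(h - hw) for each fracture (the study's actual analysis is more involved): with inflows inferred at three pumping rates, transmissivity T and inherent head h follow from a least-squares fit, and the residual supplies the internal consistency check mentioned in the record.

```python
import numpy as np

def fit_fracture(q, hw):
    """Least-squares fit of transmissivity T and inherent head h for one
    fracture under the assumed linear inflow model
        q_j = T * (h - hw_j),
    where q_j is the inflow inferred from the FEC log at pumping rate j and
    hw_j is the wellbore head at that rate.  Rewriting as q_j = a - T*hw_j
    with a = T*h makes the problem linear in (a, T); three rates give an
    overdetermined system whose residual acts as a consistency check."""
    q, hw = np.asarray(q, float), np.asarray(hw, float)
    A = np.column_stack([np.ones_like(hw), -hw])
    (a, T), res, *_ = np.linalg.lstsq(A, q, rcond=None)
    misfit = float(np.sqrt(res[0])) if res.size else 0.0
    return T, a / T, misfit

# Synthetic check: T = 2, h = 5, wellbore heads at three pumping rates
hw = [1.0, 2.0, 3.0]
q = [2.0 * (5.0 - x) for x in hw]        # exact inflows: 8, 6, 4
T, h, misfit = fit_fracture(q, hw)       # recovers T ~ 2, h ~ 5, misfit ~ 0
```

With only two rates the system is exactly determined; the third rate is what makes the misfit informative.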
Maximum-likelihood density modification
Terwilliger, Thomas C.
2000-01-01
A likelihood-based approach to density modification is developed that can be applied to a wide variety of cases where some information about the electron density at various points in the unit cell is available. The key to the approach consists of developing likelihood functions that represent the probability that a particular value of electron density is consistent with prior expectations for the electron density at that point in the unit cell. These likelihood functions are then combined with likelihood functions based on experimental observations and with others containing any prior knowledge about structure factors to form a combined likelihood function for each structure factor. A simple and general approach to maximizing the combined likelihood function is developed. It is found that this likelihood-based approach yields greater phase improvement in model and real test cases than either conventional solvent flattening and histogram matching or a recent reciprocal-space solvent-flattening procedure [Terwilliger (1999), Acta Cryst. D55, 1863–1871]. PMID:10944333
Toyooka, Kiminori; Sato, Mayuko; Wakazaki, Mayumi; Matsuoka, Ken
2016-03-01
We developed a wide-range and high-resolution transmission electron microscope acquisition system and obtained giga-pixel images of tobacco BY-2 cells during the log and stationary phases of cell growth. We demonstrated that the distribution and ultrastructure of compartments involved in membrane traffic (i.e., Golgi apparatus, multivesicular body, and vesicle cluster) change during the log-to-stationary transition. Mitochondria, peroxisomes, and plastids were also enumerated. Electron densities of mitochondria and peroxisomes were altered during the growth-phase shift, while their numbers were reduced by nearly half. Plastid structure dramatically changed from atypical to spherical with starch granules. Nearly the same number of plastids was observed in both log and stationary phases. These results indicate that mechanisms regulating organelle populations differ from organelle to organelle. PMID:26855065
Likelihood alarm displays [for human operator]
NASA Technical Reports Server (NTRS)
Sorkin, Robert D.; Kantowitz, Barry H.; Kantowitz, Susan C.
1988-01-01
In a likelihood alarm display (LAD) information about event likelihood is computed by an automated monitoring system and encoded into an alerting signal for the human operator. Operator performance within a dual-task paradigm was evaluated with two LADs: a color-coded visual alarm and a linguistically coded synthetic speech alarm. The operator's primary task was one of tracking; the secondary task was to monitor a four-element numerical display and determine whether the data arose from a 'signal' or 'no-signal' condition. A simulated 'intelligent' monitoring system alerted the operator to the likelihood of a signal. The results indicated that (1) automated monitoring systems can improve performance on primary and secondary tasks; (2) LADs can improve the allocation of attention among tasks and provide information integrated into operator decisions; and (3) LADs do not necessarily add to the operator's attentional load.
ERIC Educational Resources Information Center
Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.
2014-01-01
The authors analyze the effectiveness of the R^2 and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…
A hybrid likelihood algorithm for risk modelling.
Kellerer, A M; Kreisheimer, M; Chmelevsky, D; Barclay, D
1995-03-01
The risk of radiation-induced cancer is assessed through the follow-up of large cohorts, such as atomic bomb survivors or underground miners who have been occupationally exposed to radon and its decay products. The models relate to the dose, age and time dependence of the excess tumour rates, and they contain parameters that are estimated in terms of maximum likelihood computations. The computations are performed with the software package EPI-CURE, which contains the two main options of person-by-person regression or of Poisson regression with grouped data. The Poisson regression is most frequently employed, but there are certain models that require an excessive number of cells when grouped data are used. One example involves computations that account explicitly for the temporal distribution of continuous exposures, as they occur with underground miners. In past work such models had to be approximated, but it is shown here that they can be treated explicitly in a suitably reformulated person-by-person computation of the likelihood. The algorithm uses the familiar partitioning of the log-likelihood into two terms, L1 and L0. The first term, L1, represents the contribution of the 'events' (tumours). It needs to be evaluated in the usual way, but constitutes no computational problem. The second term, L0, represents the event-free periods of observation. It is, in its usual form, unmanageable for large cohorts. However, it can be reduced to a simple form, in which the number of computational steps is independent of cohort size. The method requires less computing time and computer memory, but more importantly it leads to more stable numerical results by obviating the need for grouping the data. The algorithm may be most relevant to radiation risk modelling, but it can facilitate the modelling of failure-time data in general. PMID:7604154
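The L1/L0 partition described above can be sketched in a few lines, assuming an illustrative parametric hazard (the paper's dose/age/time models are richer): L1 sums the log-rates at the event times, while L0 reduces to one closed-form cumulative-hazard term per person, with no grouping of the data required.

```python
import math

def log_likelihood(theta, events, periods):
    """Person-by-person Poisson-process log-likelihood, partitioned into the
    event term L1 and the event-free term L0, for an illustrative hazard
    lambda(t) = a * exp(b*t).
    events  : times of observed events (tumours)
    periods : (start, stop) observation window for each cohort member"""
    a, b = theta
    lam = lambda t: a * math.exp(b * t)
    # closed-form cumulative hazard => L0 costs one term per person
    Lam = lambda t: a / b * (math.exp(b * t) - 1.0) if b != 0 else a * t
    L1 = sum(math.log(lam(t)) for t in events)
    L0 = -sum(Lam(t1) - Lam(t0) for t0, t1 in periods)
    return L1 + L0

# constant hazard lambda = 1: two events over 5 person-years => 0 - 5 = -5
ll = log_likelihood((1.0, 0.0), events=[1.0, 2.0], periods=[(0.0, 3.0), (0.0, 2.0)])
```

The number of terms in L0 grows with the number of persons, not with any grid of dose-age-time cells, which is the point of the reformulation.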
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Smith, Margaret H.; Barringer, Howard; Groce, Alex
2012-01-01
LogScope is a software package for analyzing log files. The intended use is for offline post-processing of such logs, after the execution of the system under test. LogScope can, however, in principle, also be used to monitor systems online during their execution. Logs are checked against requirements formulated as monitors expressed in a rule-based specification language. This language has similarities to a state machine language, but is more expressive, for example, in its handling of data parameters. The specification language is user friendly, simple, and yet expressive enough for many practical scenarios. The LogScope software was initially developed to specifically assist in testing JPL's Mars Science Laboratory (MSL) flight software, but it is very generic in nature and can be applied to any application that produces some form of logging information (which almost any software does).
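A toy monitor conveys the flavor of checking logs against state-machine-like rules with data parameters; the rule encoding below is invented for illustration and is not LogScope's actual specification language.

```python
def check_log(events, transitions, start="start", accept=("done",)):
    """Toy rule-based log monitor: transitions are (state, predicate,
    next_state) triples, and predicates may bind data parameters into a
    shared environment -- slightly more expressive than a plain FSM."""
    state, env = start, {}
    for ev in events:
        for s, pred, nxt in transitions:
            if s == state and pred(ev, env):
                state = nxt
                break
    return state in accept

# Requirement: a COMMAND must be followed by a SUCCESS carrying the same id.
rules = [
    ("start", lambda e, env: e[0] == "COMMAND" and env.update(id=e[1]) is None, "sent"),
    ("sent",  lambda e, env: e == ("SUCCESS", env["id"]), "done"),
]
ok  = check_log([("COMMAND", 7), ("SUCCESS", 7)], rules)   # True
bad = check_log([("COMMAND", 7), ("SUCCESS", 8)], rules)   # False
```

Binding the command id into `env` and reusing it in a later predicate is the "handling of data parameters" that plain state machines lack.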
The dud-alternative effect in likelihood judgment.
Windschitl, Paul D; Chambers, John R
2004-01-01
The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged likelihood of a focal outcome. This dud-alternative effect was detected for judgments involving uncertainty about trivia facts and stochastic events. Nonnumeric likelihood measures and betting measures reliably detected the effect, but numeric likelihood measures did not. Time pressure increased the magnitude of the effect. The results were consistent with a contrast-effect account: The inclusion of duds increases the perceived strength of the evidence for the focal outcome, thereby affecting its judged likelihood. PMID:14736307
High-temperature scintillation properties of orthorhombic Gd2Si2O7 aiming at well logging
NASA Astrophysics Data System (ADS)
Tsubota, Youichi; Kaneko, Junichi H.; Higuchi, Mikio; Nishiyama, Shusuke; Ishibashi, Hiroyuki
2015-06-01
Scintillation and luminescence properties of orthorhombic Gd2Si2O7:Ce (GPS:Ce) single-crystal scintillators were investigated for temperatures ranging from room temperature (RT) to 573 K. Orthorhombic GPS crystals were grown by using a top-seeded solution growth (TSSG) method. The scintillation light yield of the orthorhombic GPS at RT was ∼2.9 times higher than that of Gd2SiO5:Ce (GSO). The light yield values of the orthorhombic GPS (Ce = 2.5%) were almost unchanged for temperatures ranging from RT to 523 K, and at 523 K, were higher than twice the light yield of GSO at RT. These GPS scintillators are expected to contribute to oil exploration at greater depths.
NASA Astrophysics Data System (ADS)
Shang, Yilun
2016-08-01
How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.
Selective Logging, Fire, and Biomass in Amazonia
NASA Technical Reports Server (NTRS)
Houghton, R. A.
1999-01-01
Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests was logged one or more times in the 60-80 years before 1980. The fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.
Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst. Phys. /Argonne /Barcelona, IFAE /Baylor U. /INFN, Bologna /Bologna U. /Brandeis U. /UC, Davis /UCLA /UC, San Diego /UC, Santa Barbara
2005-12-01
This report describes a measurement of the top quark mass, M{sub top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (t{bar t}) pairs in p{bar p} collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb{sup -1}. They use the t{bar t} candidates in the ''lepton+jets'' decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M{sub top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using 63 t{bar t} candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2{sub -2.4}{sup +2.6}(stat.) {+-} 3.2(syst.) GeV/c{sup 2}, or 173.2{sub -4.0}{sup +4.1} GeV/c{sup 2}.
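The final maximization step can be sketched with toy Gaussian stand-ins for the per-event likelihoods (the real per-event likelihoods come from the differential cross section and jet-to-parton transfer functions): the joint log-likelihood is summed event by event and scanned over a mass grid.

```python
import math

def mle_mass(event_likelihoods, grid):
    """Pick the mass on `grid` maximizing the joint log-likelihood, i.e.
    the per-event log-likelihoods summed (likelihoods multiplied) event
    by event, as in the dynamical likelihood method."""
    return max(grid, key=lambda m: sum(math.log(L(m)) for L in event_likelihoods))

# Toy stand-ins: Gaussian per-event likelihoods around the true mass
gauss = lambda mu, s: (lambda m: math.exp(-0.5 * ((m - mu) / s) ** 2))
events = [gauss(171.0, 4.0), gauss(175.0, 4.0), gauss(173.5, 4.0)]
grid = [170.0 + 0.1 * i for i in range(81)]    # 170 .. 178 GeV in 0.1 steps
m_hat = mle_mass(events, grid)                 # grid point nearest the mean of the peaks
```

For Gaussian stand-ins of equal width the scan lands on the grid point nearest the average of the per-event peaks; the real analysis replaces each `L` with a physics-derived function of M{sub top}.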
NASA Astrophysics Data System (ADS)
Reid, Beth A.
2013-06-01
This software computes likelihoods for the Luminous Red Galaxies (LRG) data from the Sloan Digital Sky Survey (SDSS). It includes a patch to the existing CAMB software (the February 2009 release) to calculate the theoretical LRG halo power spectrum for various models. The code is written in Fortran 90 and has been tested with the Intel Fortran 90 and GFortran compilers.
NASA Astrophysics Data System (ADS)
Cherkasova, T.; Timkin, T.; Savinova, O.
2015-02-01
The nanomineralogic investigation of ore minerals in metasomatites (garnet skarns) of the Ribny Log-2 gold ore in the Topolninsk ore deposit (Gorny Altai) revealed native silicon (Si) impurities of 1-5 nm within the grains of native bismuth (Bi). Polished sections were examined using a Tescan Vega 3 scanning electron microscope (SEM) with an Oxford energy-dispersive spectrometer at the Department of Geology and Mineral Exploration, Institute of Natural Resources, Tomsk Polytechnic University.
Approximate maximum likelihood estimation of scanning observer templates
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Samuelson, Frank W.; Wunderlich, Adam; Popescu, Lucretiu M.; Eckstein, Miguel P.; Boone, John M.
2015-03-01
In localization tasks, an observer is asked to give the location of some target or feature of interest in an image. Scanning linear observer models incorporate the search implicit in this task through convolution of an observer template with the image being evaluated. Such models are becoming increasingly popular as predictors of human performance for validating medical imaging methodology. In addition to convolution, scanning models may utilize internal noise components to model inconsistencies in human observer responses. In this work, we build a probabilistic mathematical model of this process and show how it can, in principle, be used to obtain estimates of the observer template using maximum likelihood methods. The main difficulty of this approach is that a closed form probability distribution for a maximal location response is not generally available in the presence of internal noise. However, for a given image we can generate an empirical distribution of maximal locations using Monte-Carlo sampling. We show that this probability is well approximated by applying an exponential function to the scanning template output. We also evaluate log-likelihood functions on the basis of this approximate distribution. Using 1,000 trials of simulated data as a validation test set, we find that a plot of the approximate log-likelihood function along a single parameter related to the template profile achieves its maximum value near the true value used in the simulation. This finding holds regardless of whether the trials are correctly localized or not. In a second validation study evaluating a parameter related to the relative magnitude of internal noise, only the incorrectly localized images produce a maximum in the approximate log-likelihood function that is near the true value of the parameter.
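The approximation described above (probability of a maximal location as an exponential function of the scanning-template output) amounts to a softmax over candidate locations. A minimal sketch, with array shapes assumed purely for illustration:

```python
import numpy as np

def approx_loglik(alpha, template_out, responses):
    """Approximate log-likelihood of maximal-location responses, treating
    the probability of each candidate location as proportional to an
    exponential of the scanning-template output there (a softmax).
    template_out : (n_images, n_locations) template outputs
    responses    : index of the location chosen for each image"""
    z = alpha * np.asarray(template_out, float)
    logp = z - np.log(np.sum(np.exp(z), axis=1, keepdims=True))
    return float(sum(logp[i, r] for i, r in enumerate(responses)))

outs = np.array([[0.0, 1.0],    # two images, two candidate locations
                 [2.0, 0.0]])
ll = approx_loglik(1.0, outs, responses=[1, 0])
```

Plotting `approx_loglik` along a parameter such as `alpha` (or a parameter shaping the template itself) is the kind of one-dimensional scan the validation studies describe.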
Well log and 2D seismic data character of the Wilcox Group in south-central Louisiana
Enomoto, Catherine B.
2014-01-01
The Wilcox Group is productive in updip areas of Texas and Louisiana from fluvial, deltaic, and near-shore marine shelf sandstones. The reported presence of porous sandstones at 29,000 feet within the Wilcox Group containing about 200 feet of gas in the Davy Jones 1 discovery well in the offshore Louisiana South Marsh Island area illustrates a sand-rich system developed during the Paleocene and early Eocene. This study describes some of the well log and reflection seismic data characteristics of the slope and basin-floor reservoirs with gas-discovery potential that may be in the area between the producing trend onshore Louisiana and the offshore discovery.
Mail LOG: Program operating instructions
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT: putting new records into the data base; (2) REVISE: changing or modifying existing records in the data base; (3) SEARCH: finding special records existing in the data base; (4) ARCHIVE: storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.
CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS
Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...
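A sketch of the CeffT10 arithmetic, under the conventional SWTR reading that a segment's CT is the reaction-zone outlet residual times the T10 contact time, and that estimated Giardia log inactivation scales the tabulated CT required for 3-log inactivation. The numeric inputs below are illustrative, not values from the guidance manual's tables:

```python
def giardia_log_inactivation(c_eff_mg_per_L, t10_min, ct_3log):
    """CeffT10 method, sketched: CT for the segment is the reaction-zone
    outlet ozone residual times the T10 contact time, and the estimated
    Giardia log inactivation is 3 * CT / (tabulated CT for 3-log
    inactivation).  `ct_3log` must come from the SWTR tables for the
    actual water temperature; 0.95 below is only a placeholder."""
    return 3.0 * (c_eff_mg_per_L * t10_min) / ct_3log

# Illustrative numbers only:
li = giardia_log_inactivation(0.4, 4.0, 0.95)   # 3 * 1.6 / 0.95 log
```

Virus log inactivation follows the same pattern with 4-log table values substituted for the 3-log Giardia requirement.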
Contrastive Pessimistic Likelihood Estimation for Semi-Supervised Classification.
Loog, Marco
2016-03-01
Improvement guarantees for semi-supervised classifiers can currently only be given under restrictive conditions on the data. We propose a general way to perform semi-supervised parameter estimation for likelihood-based classifiers for which, on the full training set, the estimates are never worse than the supervised solution in terms of the log-likelihood. We argue, moreover, that we may expect these solutions to really improve upon the supervised classifier in particular cases. In a worked-out example for LDA, we take it one step further and essentially prove that its semi-supervised version is strictly better than its supervised counterpart. The two new concepts that form the core of our estimation principle are contrast and pessimism. The former refers to the fact that our objective function takes the supervised estimates into account, enabling the semi-supervised solution to explicitly control the potential improvements over this estimate. The latter refers to the fact that our estimates are conservative and therefore resilient to whatever form the true labeling of the unlabeled data takes on. Experiments demonstrate the improvements in terms of both the log-likelihood and the classification error rate on independent test sets. PMID:27046491
Scandurra, D; Albertini, F; van der Meer, R; Meier, G; Weber, D C; Bolsi, A; Lomax, A
2016-02-01
Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam and at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within +/- 1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field. PMID:26767316
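The pass-rate metric itself is simple arithmetic; a minimal sketch, assuming the +/-1% tolerance is taken relative to the nominal dose in each voxel:

```python
import numpy as np

def pass_rate(recon_dose, nominal_dose, tol=0.01):
    """Percentage of voxels whose log-file-reconstructed dose lies within
    +/-1% of the nominal planned dose (tolerance relative to nominal)."""
    recon = np.asarray(recon_dose, float)
    nominal = np.asarray(nominal_dose, float)
    within = np.abs(recon - nominal) <= tol * np.abs(nominal)
    return 100.0 * np.count_nonzero(within) / within.size

nominal = np.array([2.00, 2.00, 1.50, 1.00])
recon   = np.array([2.01, 1.97, 1.505, 1.02])   # voxels 2 and 4 miss by >1%
rate = pass_rate(recon, nominal)                 # 50.0
```

The reported 96.4% mean pass rate is this quantity averaged over delivered series, computed on the full 3D reconstructed dose rather than a toy array.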
NASA Astrophysics Data System (ADS)
Higashinaka, Ryuji; Asano, Takuya; Nakashima, Takuya; Fushiya, Kengo; Mizuguchi, Yoshikazu; Miura, Osuke; Matsuda, Tatsuma D.; Aoki, Yuji
2015-02-01
The low-temperature properties of CeOBiS2 single crystals are studied by electrical resistivity, magnetization, and specific heat measurements. Ce 4f-electrons are found to be in a well-localized state split by crystalline-electric-field (CEF) effects. The CEF ground state is a pure Jz = ±1/2 doublet, and excited doublets are located far above. At low temperatures in zero field, we observe pronounced -log T divergence in the specific heat, revealing the presence of quantum critical fluctuations of 4f magnetic moments near a quantum critical point (QCP). Considering that CeOBiS2 is a nonmetal, this phenomenon cannot be attributed to the competition between Kondo and the Ruderman-Kittel-Kasuya-Yosida (RKKY) interactions as in numerous f-electron-based strongly correlated metals, indicating an unconventional mechanism. We suggest that CeOBiS2 is the first material found to be located at a QCP among geometrically frustrated nonmetallic magnets.
Postfire logging in riparian areas.
Reeves, Gordon H; Bisson, Peter A; Rieman, Bruce E; Benda, Lee E
2006-08-01
We reviewed the behavior of wildfire in riparian zones, primarily in the western United States, and the potential ecological consequences of postfire logging. Fire behavior in riparian zones is complex, but many aquatic and riparian organisms exhibit a suite of adaptations that allow relatively rapid recovery after fire. Unless constrained by other factors, fish tend to rebound relatively quickly, usually within a decade after a wildfire. Additionally, fire and subsequent erosion events contribute wood and coarse sediment that can create and maintain productive aquatic habitats over time. The potential effects of postfire logging in riparian areas depend on the landscape context and disturbance history of a site; however, available evidence suggests two key management implications: (1) fire in riparian areas creates conditions that may not require intervention to sustain the long-term productivity of the aquatic network and (2) protection of burned riparian areas gives priority to what is left rather than what is removed. Research is needed to determine how postfire logging in riparian areas has affected the spread of invasive species and the vulnerability of upland forests to insect and disease outbreaks and how postfire logging will affect the frequency and behavior of future fires. The effectiveness of using postfire logging to restore desired riparian structure and function is therefore unproven, but such projects are gaining interest with the departure of forest conditions from those that existed prior to timber harvest, fire suppression, and climate change. In the absence of reliable information about the potential consequences of postfire timber harvest, we conclude that providing postfire riparian zones with the same environmental protections they received before they burned is justified ecologically. Without a commitment to monitor management experiments, the effects of postfire riparian logging will remain unknown and highly contentious. PMID:16922216
Rivenes, A S
2004-12-08
The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document will detail the maximum observed throughputs using the same test suite that was used for the Sun tests. A detailed description of the testing performed along with an analysis of bottlenecks encountered will be made. Issues related to Oracle and Linux will also be detailed and some recommendations based on the findings.
6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT
Anderson, David W.
1993-12-15
Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10, 1993. The second was a Caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13, 1993.
Ehrmann, Brett J.; Anderson, Robert M.; Piatt, Gretchen A.; Funnell, Martha M.; Rashid, Hira; Shedden, Kerby; Douyon, Liselle
2014-01-01
Purpose The purpose of this pilot study is to investigate the utility of, and areas of refinement for, digital photography as an educational tool for food logging in obese patients with type 2 diabetes (T2DM). Methods Thirty-three patients aged 18-70 with T2DM, BMI at least 30 kg/m2, and A1C 7.5-9% were recruited from an endocrinology clinic and randomized to a week of food logging using a digital camera (DC) or paper diary (PD), crossing over for week two. Patients then viewed a presentation about dietary effects on blood glucose, using patient DC and blood glucose entries. Outcomes of adherence (based on number of weekly entries), changes in mean blood glucose and frequency of blood glucose checks, and patient satisfaction were compared between methods. Patient feedback on the DC intervention and presentation was also analyzed. Results Thirty patients completed the study. Adherence was identical across methods. The mean difference in number of entries was not significant between methods. This difference increased and neared statistical significance (favoring DC) among patients who were adherent for at least one week (21 entries, with 2 entries per day for 5 of 7 days, n=25). Mean blood glucose did not significantly decrease in either method. Patient satisfaction was similar between interventions. Feedback indicated concerns over photograph accuracy, forgetting to use the cameras, and embarrassment using them in public. Conclusion Though comparable to PD in adherence, blood glucose changes, and patient satisfaction in this pilot trial, patient feedback suggested specific areas of refinement to maximize utility of DC-based food logging as an educational tool in T2DM. PMID:24168836
Priiatkina, S N
2002-05-01
For mapping nonlinked interacting genes relative to marker loci, log-likelihood functions were derived that permit estimation of recombination fractions by solving the maximum-likelihood (ML) equations on the basis of F2 data under various types of interaction. In some cases the recombination fraction estimates are obtained in analytical form, while in others they are calculated numerically from concrete experimental data. For the same type of epistasis, the log-likelihood functions were shown to differ depending on the functional role (suppression or epistasis) of the mapped gene. Methods for testing the correspondence of the model and the recombination fraction estimates to the experimental data are discussed. In ambiguous cases, analysis of the behavior of linked markers makes it possible to differentiate gene interaction from distorted single-locus segregation, which at some forms of interaction imitates the same phenotypic ratios. PMID:12068553
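To make the estimation procedure concrete, here is a minimal sketch (not from the paper) of grid-search maximum-likelihood estimation of a recombination fraction from F2 phenotype counts, for the simplest case of a coupling-phase dihybrid with four distinguishable classes; the class probabilities and counts are illustrative assumptions:

```python
import math

def f2_class_probs(r):
    """Phenotype-class probabilities in an F2 from a coupling-phase
    dihybrid (AB/ab), as functions of the recombination fraction r.
    psi = (1 - r)**2, since the aabb class has frequency psi/4."""
    psi = (1.0 - r) ** 2
    return [(2 + psi) / 4,  # A_B_
            (1 - psi) / 4,  # A_bb
            (1 - psi) / 4,  # aaB_
            psi / 4]        # aabb

def log_likelihood(r, counts):
    """Multinomial log-likelihood of the four phenotype counts."""
    return sum(n * math.log(p) for n, p in zip(counts, f2_class_probs(r)))

def estimate_r(counts, step=1e-4):
    """Grid-search MLE of r on the open interval (0, 0.5)."""
    grid = [i * step for i in range(1, int(0.5 / step))]
    return max(grid, key=lambda r: log_likelihood(r, counts))
```

For counts exactly proportional to the class probabilities at r = 0.2 (psi = 0.64, probabilities 0.66, 0.09, 0.09, 0.16), the estimator recovers r = 0.2 to grid resolution.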
Acoustic paramagnetic logging tool
Vail, III, William B.
1988-01-01
New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency-sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.
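The quoted excitation frequency can be checked from the proton Larmor relation f = (γ/2π)·B; for a typical Earth-field strength of about 50 µT this gives roughly 2.1 kHz, consistent with the "approx. 2 kHz" figure above. A minimal sketch (the 50 µT field value is an assumption, since the Earth's field varies between roughly 25 and 65 µT):

```python
# Proton gyromagnetic ratio divided by 2*pi, in Hz per tesla.
GAMMA_PROTON_HZ_PER_T = 42.577e6

def larmor_frequency_hz(b_field_tesla):
    """Larmor (precession) frequency of protons in a static field."""
    return GAMMA_PROTON_HZ_PER_T * b_field_tesla

# Assumed mid-range Earth field of 50 microtesla -> about 2.1 kHz.
f_hz = larmor_frequency_hz(50e-6)
```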
NASA Technical Reports Server (NTRS)
Balch, D. T.; Lombardi, J.
1985-01-01
A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor Hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, three on the BLACK HAWK rotor and six on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine whether the performance advantages that could be obtained from the use of advanced geometry tips in a main-rotor-only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.
Walsh, David O; Turner, Peter
2014-05-27
Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.
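The impedance-matching step can be illustrated with a generic narrowband L-network (a textbook circuit chosen for illustration, not the patent's matching means; the 50 Ω source, 12.5 Ω load, and operating frequency are all assumptions):

```python
import math

def l_match(r_source, r_load, f_hz):
    """Component values for a low-pass L-network matching a resistive
    source to a smaller resistive load at a single frequency.
    Returns (series inductance in henries, shunt capacitance in farads)."""
    assert r_source > r_load
    q = math.sqrt(r_source / r_load - 1.0)
    x_series = q * r_load        # series inductive reactance, ohms
    x_shunt = r_source / q       # shunt capacitive reactance, ohms
    w = 2.0 * math.pi * f_hz
    return x_series / w, 1.0 / (w * x_shunt)
```

At the design frequency the network presents the source resistance to the amplifier, which is the point of the matching means described above; off-frequency the match degrades, which is why a narrowband NMR pulse is a natural fit.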
4. Log chicken house (far left foreground), log bunkhouse (far ...
4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Maximum Likelihood Estimation in Generalized Rasch Models.
ERIC Educational Resources Information Center
de Leeuw, Jan; Verhelst, Norman
1986-01-01
Maximum likelihood procedures are presented for a general model to unify the various models and techniques that have been proposed for item analysis. Unconditional maximum likelihood estimation, proposed by Wright and Haberman, and conditional maximum likelihood estimation, proposed by Rasch and Andersen, are shown as important special cases. (JAZ)
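A minimal numerical sketch of maximum-likelihood estimation in the Rasch model, reduced to estimating a single item difficulty with person abilities treated as known (an illustrative simplification, not the unconditional or conditional procedures of Wright, Haberman, Rasch, or Andersen):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_difficulty(thetas, responses, iters=50):
    """ML estimate of one item's difficulty b, treating the person
    abilities `thetas` as known. Newton-Raphson on the score equation
    sum(x_i - P_i(b)) = 0."""
    b = 0.0
    for _ in range(iters):
        ps = [rasch_p(t, b) for t in thetas]
        score = sum(p - x for x, p in zip(responses, ps))  # dLL/db
        info = sum(p * (1.0 - p) for p in ps)              # -d2LL/db2
        b += score / info
    return b
```

At the ML solution the expected number correct equals the observed number correct, which is the defining property of the Rasch score equations.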
Structural basis for cytokinin production by LOG from Corynebacterium glutamicum.
Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin
2016-01-01
"Lonely guy" (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as a lysine decarboxylase (LDC). However, the facts that C. glutamicum lacks an LDC and that Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly a LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a "PGGXGTXXE" motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprises residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425
Taylor, J.A.; Taylor, K.G.
1987-02-03
A method is described of logging earth formations traversed by a well bore and utilizing a logging tool having a neutron source and a short spaced and a long spaced thermal neutron detector which produce an independent response as a function of depth of the logging tool in a well bore. The method comprises: moving the logging tool through a well bore to locate a section of the earth formations which has minimum porosity and obtaining measurement responses from each of the long and short spaced detectors; normalizing the responses of the long and short spaced detectors by matching the sensitivity of response of the long spaced detector to the sensitivity of response of the short spaced detector for an earth formation which has minimum porosity so that the normalized responses track one another in an earth formation which has minimum porosity; and moving the tool over the length of the well bore to be surveyed while recording the normalized responses of the long and short spaced neutron detectors as a function of depth.
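The normalization step described above can be sketched as computing a single gain factor over the minimum-porosity calibration interval and applying it to the long-spaced response so the two detectors track one another there (function names and data are illustrative, not from the patent):

```python
def normalize_long_to_short(short_counts, long_counts, min_poro_slice):
    """Scale the long-spaced detector response so it tracks the
    short-spaced response over a minimum-porosity calibration interval.
    `min_poro_slice` selects that interval in both count lists."""
    s = short_counts[min_poro_slice]
    l = long_counts[min_poro_slice]
    k = sum(s) / sum(l)  # single gain factor from the calibration interval
    return [k * v for v in long_counts]
```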
ERIC Educational Resources Information Center
Butler, Kevin
2010-01-01
A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…
ERIC Educational Resources Information Center
Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina
2010-01-01
The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…
Energy Science and Technology Software Center (ESTSC)
2007-04-04
A Perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
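The original module is written in Perl; a comparable sketch in Python, assuming the standard PBS accounting-record layout `date;record-type;id;key=value ...`, might look like this (the field handling is illustrative, not the ESTSC module's API):

```python
from datetime import datetime

def parse_pbs_records(lines, record_type=None, since=None):
    """Parse PBS accounting-log lines of the form
    'MM/DD/YYYY HH:MM:SS;<type>;<job id>;key=value ...',
    optionally filtering on record type and a datetime lower bound."""
    records = []
    for line in lines:
        fields = line.rstrip("\n").split(";", 3)
        if len(fields) != 4:
            continue  # skip malformed lines
        stamp = datetime.strptime(fields[0], "%m/%d/%Y %H:%M:%S")
        rtype, job_id, message = fields[1], fields[2], fields[3]
        if record_type is not None and rtype != record_type:
            continue
        if since is not None and stamp < since:
            continue
        records.append({"time": stamp, "type": rtype, "id": job_id,
                        "attrs": dict(kv.split("=", 1)
                                      for kv in message.split()
                                      if "=" in kv)})
    return records
```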
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)
Coal-log pipeline system development
Liu, H.
1991-12-01
Project tasks include: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower to produce logs with adequate strength to eliminate breakage during pipeline transportation, under conditions experienced in long distance pipeline systems. Prior to conducting any testing and demonstration, grantee shall perform an information search and make full determination of all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced. (3) Conduct economic analysis of coal-log pipeline, based upon the work to date. Refine and complete the economic model. (VC)
Maximum likelihood density modification by pattern recognition of structural motifs
Terwilliger, Thomas C.
2004-04-13
An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log-likelihood of a set of structure factors {F_h} using a local log-likelihood function LL(ρ(x)) = ln[p(ρ(x)|PROT)p_PROT(x) + p(ρ(x)|SOLV)p_SOLV(x) + p(ρ(x)|H)p_H(x)], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) refers to the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x; and p(ρ(x)|H) is the probability distribution for electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.
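The local log-likelihood is the log of a prior-weighted mixture of per-region densities for ρ(x). A minimal numeric sketch, assuming Gaussian conditional densities p(ρ|region) (the Gaussian form and all parameter values here are assumptions for illustration, not the patent's distributions):

```python
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def local_log_likelihood(rho, priors, params):
    """ln of a mixture of per-region densities for the electron density
    `rho` at one grid point. `priors` maps region name -> p_REGION(x);
    `params` maps region name -> (mu, sigma) of an assumed Gaussian
    conditional density p(rho | region)."""
    total = sum(priors[r] * gauss(rho, *params[r]) for r in priors)
    return math.log(total)
```

A density value typical of the protein region scores higher than an implausible one, which is what drives the density-modification update.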
Log-Concavity and Strong Log-Concavity: a review
Saumard, Adrien; Wellner, Jon A.
2016-01-01
We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
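The discrete case is easy to experiment with: a nonnegative sequence is log-concave when a_k² ≥ a_{k-1}·a_{k+1} for all interior k, and by the preservation result reviewed above, the convolution of two log-concave sequences is again log-concave. A small check using binomial-coefficient rows (chosen for illustration):

```python
def is_log_concave(seq):
    """Check discrete log-concavity: a_k^2 >= a_{k-1} * a_{k+1}
    for all interior k (sequence assumed nonnegative)."""
    return all(seq[k] ** 2 >= seq[k - 1] * seq[k + 1]
               for k in range(1, len(seq) - 1))

def convolve(a, b):
    """Discrete convolution of two sequences."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] += x * y
    return out
```

Convolving the rows for (1+x)³ and (1+x)² yields the row for (1+x)⁵, which is again log-concave.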
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2012 CFR
2012-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2014 CFR
2014-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2013 CFR
2013-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2011 CFR
2011-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2010 CFR
2010-10-01
... by station licensees shall be retained by them for a period of 2 years. However, logs involving... inspection of logs pursuant to § 73.1226, availability to FCC of station logs and records. (2) Reproduction... any application; or placed in the station's local public inspection file as part of an application;...
Energy saving and endurance log for a log building
Dolata, G.
1987-03-17
A log is described for construction of a log building which comprises: an elongated peeled log of substantially uniform diameter along its length with parallel end faces, a bottom surface of the log having a concave surface configuration centered on a diametrical line of the log, a rounded top surface directly opposite from the concave bottom surface which mates with a concave surface of another log when placed upon the rounded top surface, a vertically extending longitudinal slot in the top surface of the log that extends the length of the log, a vertically extending longitudinal slot along at least one side of the log with the slot extending vertically substantially parallel with the diametrical line with the slot being formed outwardly of the concave surface, the log including at least one butt end, the butt end including an end slot along the diametrical line which extends from a top of the log down through the butt end to the concave surface; and the butt includes at least one short, longitudinally extending arcuate groove near an outer surface of the log which extends from a line juxtaposed the end slot down to at least one longitudinal slot in the log.
Pitts, R.W.
1980-02-19
A well logging system includes a logging tool adapted to be passed through a borehole traversing an earth formation. The logging tool contains a sensor sensing a condition of the earth formation and providing electrical pulses corresponding in number and peak amplitude to the sensed condition. A first electrical pulse from the sensor occurring during each predetermined time period of a plurality of predetermined time periods, is stretched and then converted to parallel digital signals. A register receives the parallel digital signals and provides a serial digital signal in response to the shift pulses. A network provides an electrical synchronization pulse each time period prior to the occurrence of the shift pulses. A light emitting diode converts the synchronization pulses and the serial digital signals to corresponding light pulses. A cable including a fiber optic conductor transmits the light pulses uphole to the surface. Surface electronics includes a light-to-electrical converter for providing corresponding electrical pulses in accordance with the light pulses, so that the light-to-electrical converter provides a synchronization pulse followed by a serial digital signal each time period. Another circuit provides a set of shift pulses in response to the synchronizing pulse from the light-to-electrical converter, and an output circuit provides parallel output digital signals corresponding to the sensed condition in accordance with the shift pulses and the serial digital signals from the light-to-electrical converter.
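The framing scheme — a synchronization pulse followed by the serialized sample each time period — can be sketched abstractly as bit lists (the sync pattern and word length here are illustrative assumptions, not the patent's encoding):

```python
SYNC = [1, 1, 1, 0]  # illustrative synchronization pattern

def encode_frame(value, nbits=8):
    """One telemetry frame: a sync pattern followed by the sample
    serialized MSB-first, as a shift register would clock it out."""
    bits = [(value >> (nbits - 1 - i)) & 1 for i in range(nbits)]
    return SYNC + bits

def decode_frame(frame, nbits=8):
    """Recover the sample at the surface: check sync, then reassemble
    the serial bits into a parallel word."""
    assert frame[:len(SYNC)] == SYNC, "lost synchronization"
    value = 0
    for bit in frame[len(SYNC):len(SYNC) + nbits]:
        value = (value << 1) | bit
    return value
```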
West, H.I. Jr.; Glasgow, J.E.
1983-12-01
The feasibility of using oil-well borehole logging techniques for assaying the carbonate content of a potential shot site has been investigated. The procedure makes use of the detection of the 4439-keV gamma ray of carbon and the 6130-keV gamma ray of oxygen produced by the inelastic scattering of 14-MeV neutrons in the borehole medium. For the needs of the containment program, a sensitivity of detection of CO2 to less than or equal to 0.02 weight fraction must be achieved. Laboratory measurements indicate that only with considerable development effort could a tool be made that would come close to achieving this goal.
NASA Astrophysics Data System (ADS)
Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.
2015-12-01
The Bengal Basin, with its prolific gas-rich province, provides needed energy to Bangladesh, and the present energy situation demands more hydrocarbon exploration. Only the 'Semutang' field has been discovered in the high-amplitude structures, while the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt. The area, however, has some major thrust faults which have strongly breached the reservoir zone. The major objectives of this research are interpretation of gas horizons and faults, followed by velocity modeling and structural and property modeling to obtain reservoir properties; it is necessary to properly identify the faults and reservoir heterogeneities. 3D modeling is widely used to reveal the subsurface structure in faulted zones where planning and development drilling is a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults and to map the structure at reservoir level. Variance attributes were used to identify faults, and a velocity model was built for domain conversion. Synthetics were prepared from the two wells where sonic and density logs are available. The well-to-seismic tie at the reservoir zone shows a good match with the Direct Hydrocarbon Indicator on the seismic section. Vsh, porosity, water saturation, and permeability have been calculated, and various cross-plots among the porosity logs are shown. Structural modeling was used to create zones and layering in accordance with the minimum sand thickness. The fault model shows the possible fault network responsible for several dry wells. Facies models were constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models were prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells in other parts of the field and to observe heterogeneities in the reservoir. An average porosity map for each gas zone was constructed. The outcomes of the research
12. Upstream view showing thelower log pond log chute in ...
12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY
Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation
NASA Astrophysics Data System (ADS)
Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.
2015-11-01
We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loeve and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
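The signal-to-noise eigenvector compression described above can be sketched as a whitened eigenproblem: with noise covariance N = LLᵀ, the modes are eigenvectors of L⁻¹SL⁻ᵀ, and only the highest-S/N modes are kept. This is an illustrative reconstruction of the idea, not the authors' pipeline:

```python
import numpy as np

def sn_compress(S, N, d, nmodes):
    """Compress data vector d into its top signal-to-noise modes.
    S and N are signal and noise covariance matrices; with N = L L^T,
    the S/N modes are eigenvectors of Linv @ S @ Linv.T applied to the
    noise-whitened data."""
    L = np.linalg.cholesky(N)
    Linv = np.linalg.inv(L)
    M = Linv @ S @ Linv.T
    evals, evecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    keep = evecs[:, -nmodes:]          # top-S/N eigenvectors
    return keep.T @ (Linv @ d), evals[-nmodes:]
```

Evaluating a likelihood in the compressed basis then costs O(nmodes³) rather than O(npix³), which is the source of the speedup reported above.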
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Agricultural Employment of H-2A Aliens in the United States, 75 FR 6884 (Feb. 12, 2010). The effect of... rulemaking (NPRM). 74 FR 45906 (Sept. 4, 2009). After considering comments from the public on the subject... errors and ambiguities. J. Plain Language The Department drafted this rule in plain language....
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.; Goodrich, J.H.; Kumar, R.M.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturations and computed log water saturations agreed fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log- versus core-derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of the oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.
NASA Astrophysics Data System (ADS)
Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.
2015-12-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
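For the log-normal maximum-likelihood step, the MLEs of (μ, σ) are simply the mean and population standard deviation of the logged maxima, and a tail probability then converts to an exceedance rate. A minimal sketch (the input data below are arbitrary, not the paper's -Dst series):

```python
import math

def lognormal_ml_fit(maxima):
    """ML fit of a log-normal to positive maxima: the MLEs of (mu, sigma)
    are the sample mean and population standard deviation of the logs."""
    logs = [math.log(x) for x in maxima]
    mu = sum(logs) / len(logs)
    var = sum((v - mu) ** 2 for v in logs) / len(logs)
    return mu, math.sqrt(var)

def exceedance_prob(threshold, mu, sigma):
    """P(X > threshold) for the fitted log-normal, via erfc."""
    z = (math.log(threshold) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * math.erfc(z)
```

Multiplying the exceedance probability by the observed event rate gives an events-per-century estimate of the kind quoted above; bootstrap resampling of the maxima would then supply confidence limits.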
Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete
2015-01-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst≥850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a −Dst≥880 nT (greater than Carrington), with a wide 95% confidence interval of [490, 1187] nT.
SedLog: A shareware program for drawing graphic logs and log data manipulation
NASA Astrophysics Data System (ADS)
Zervas, Dimitrios; Nichols, Gary J.; Hall, Robert; Smyth, Helen R.; Lüthje, Charlotta; Murtagh, Fionn
2009-10-01
SedLog is a free multi-platform software package for creating graphic sediment logs providing an intuitive graphical user interface. The graphic sediment logs generated by SedLog can be exported as PDF, Scalable Vector Graphics (SVG), or JPEG for use by other drawing applications or for publications. Log data can be imported and exported in Comma Separated Values (CSV) format. The logs can also be printed to any paper size the user wants. Zoom In, Zoom Out, Fit page, Fit Height and Fit Width facilities are also provided to enable the user to customise the workspace size.
Continuous gravity gradient logging
Fitch, J.L.; Lyle, W.D. Jr.
1986-07-29
A method is described for conducting a gravimetry survey of an earth formation, comprising the steps of: (a) continuously traversing the earth formation with a gravity logging tool having a column of fluid within the tool, (b) measuring a first pressure difference along a first interval within the column of fluid, (c) measuring a second pressure difference along a second interval within the column of fluid, (d) differencing the first and second pressure differences to determine the gravity gradient along the earth formation between the first and second intervals.
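The method rests on the hydrostatic relation ΔP = ρgΔz: each measured pressure difference yields a local value of g, and differencing the two intervals yields the gradient. A minimal sketch (the fluid density and interval geometry are illustrative assumptions, not the patent's specifications):

```python
def gravity_from_pressure(dp_pascal, rho, dz):
    """Local gravity from the hydrostatic relation dP = rho * g * dz."""
    return dp_pascal / (rho * dz)

def gravity_gradient(dp1, dp2, rho, dz, separation):
    """Vertical gravity gradient from two pressure differences measured
    over equal fluid-column intervals dz whose centers are `separation`
    meters apart along the borehole."""
    g1 = gravity_from_pressure(dp1, rho, dz)
    g2 = gravity_from_pressure(dp2, rho, dz)
    return (g2 - g1) / separation
```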
Oliver, R.L.; Lindblom, S.R.; Covell, J.R.
1991-02-01
The Rocky Mountain 1 (RM1) Underground Coal Gasification (UCG) site consisted of two different module configurations: the controlled retracting injection point (CRIP) and elongated linked well (ELW) configurations. The postburn coring of the RM1 UCG site was designed in two phases to fulfill seven objectives outlined in Western Research Institute's Annual Project Plan for 1989 (Western Research Institute 1989). The seven objectives were to (1) delineate the areal extent of the cavities, (2) identify the extent of roof collapse, (3) obtain samples of all major cavity rock types, (4) characterize outflow channels and cavity stratigraphy, (5) characterize the area near CRIP points and ignition points, (6) further define the structural geology of the site, and (7) identify the vertical positioning of the horizontal process wells within the coal seam. Phase 1 of the coring was completed during the summer of 1989 and served to partially accomplish all seven objectives. A detailed description of Phase 1 results was presented in a separate report (Lindblom et al. 1990). Phase 2, completed during the summer of 1990, was designed to complete the seven objectives; more specifically, to further define the areal extent and location of the cavities, to evaluate the outflow channels for both modules, and to further characterize the structural geology in the ELW module area.
Weekly Log Record Sort (WLSORT).
ERIC Educational Resources Information Center
Foote, Thomas
Computer routines to sort the weekly log records submitted by teachers participating in the Southwest Regional Laboratory's communications skills monitoring program are described. Written in Univac FORTRAN V, Weekly Log Record Sort (WLSORT) sorts log records on magnetic tape to enable subsequent computer programs to interpret the input data by…
NASA Astrophysics Data System (ADS)
Isyaku, Aminu A.; Rust, Derek; Teeuw, Richard; Whitworth, Malcolm
2016-09-01
Structural and stratigraphic mapping within the Bornu Basin in northeast Nigeria has commonly been carried out using traditional field geological methods. However, such traditional approaches remain inadequate in this semi-arid region, characterised by topographically flat terrain and a lack of continuous bedrock outcrops, which are mostly concealed beneath sand cover. Previous studies in the north-eastern part of the basin, carried out using ditch cuttings from a few wells and disconnected seismic data, were largely inadequate, and the resulting stratigraphic analyses were more often generalised. This paper presents an integrated structural and stratigraphic study of the basin using combined subsurface geophysical datasets. The Combined Log Pattern (CLP) method is a well log analysis technique that utilises various well log data, including gamma ray, resistivity, bulk density and sonic logs, to identify lithology and stratigraphic boundaries of subsurface formations. This method is applied to constrain the subsurface stratigraphy of the north-eastern part of the Bornu Basin bordering Lake Chad. In addition to qualitative combined well log analysis, the time-depth relationship of the sonic log and seismic data was quantitatively determined by tying a well to an intersecting seismic section to validate the stratigraphic facies horizons identified. Four well log facies and their environments of deposition were characterised from the combined well log analysis of the different log types. The Cretaceous basement structural features are found to have controlled the deposition of the overlying formations in the basin. Without intact core data, the shallower wells were found to have bottomed over subsurface horst features, while deeper wells penetrated the basal facies contained mainly within the grabens. The main subsurface structural lineaments in the area include NW-SE, NE-SW and NNW-SSE trending faults, which mainly formed the horst and graben features. Some stratigraphic formations
NASA Astrophysics Data System (ADS)
Bloshanskaya, S. K.; Bloshanskii, I. L.; Y Roslova, T.
1998-06-01
For an arbitrary open set \Omega \subset I^2 = [0,1)^2 and an arbitrary function f \in L\log^+L\log^+\log^+L(I^2) such that f = 0 on \Omega, the double Fourier series of f with respect to the trigonometric system \Psi = \mathscr{E} and the Walsh-Paley system \Psi = W is shown to converge to zero (over rectangles) almost everywhere on \Omega. Thus, it is proved that generalized localization almost everywhere holds on arbitrary open subsets of the square I^2 for the double trigonometric Fourier series and the Walsh-Fourier series of functions in the class L\log^+L\log^+\log^+L (in the case of summation over rectangles). It is also established that such localization breaks down on arbitrary sets that are not dense in I^2, in the classes \Phi_\Psi(L)(I^2) for the orthonormal system \Psi = \mathscr{E} and an arbitrary function \Phi such that \Phi_{\mathscr{E}}(u) = o(u\log^+\log^+u) as u \to \infty, or for \Phi_W(u) = u(\log^+\log^+u)^{1-\varepsilon}, 0 < \varepsilon < 1.
Schweder, Tore
2003-12-01
Maximum likelihood estimates of abundance are obtained from repeated photographic surveys of a closed stratified population with naturally marked and unmarked individuals. Capture intensities are assumed log-linear in stratum, year, and season. In the chosen model, an approximate confidence distribution for total abundance of bowhead whales, with an accompanying likelihood reduced of nuisance parameters, is found from a parametric bootstrap experiment. The confidence distribution depends on the assumed study protocol. A confidence distribution that is exact (except for the effect of discreteness) is found by conditioning in the unstratified case without unmarked individuals. PMID:14969476
NASA Astrophysics Data System (ADS)
Halabe, Udaya B.; Pyakurel, Sandeep
2007-03-01
There has been a lack of an effective NDE technique to locate internal defects within wooden logs. The few available elastic wave propagation based techniques are limited to predicting E values. Other techniques such as X-rays have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, the saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and the repair cost. Also, if the internal defects such as knots and decayed areas could be identified in logs, the sawing blade can be oriented to exclude the defective portion and optimize the volume of high valued lumber that can be obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within the logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodology. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using 900 MHz antenna. Also, GPR was able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate 3D cylindrical volume. The actual location of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from 3D cylindrical volume data were useful in understanding the extent of defects inside the log.
Salvage logging, ecosystem processes, and biodiversity conservation.
Lindenmayer, D B; Noss, R F
2006-08-01
We summarize the documented and potential impacts of salvage logging--a form of logging that removes trees and other biological material from sites after natural disturbance. Such operations may reduce or eliminate biological legacies, modify rare postdisturbance habitats, influence populations, alter community composition, impair natural vegetation recovery, facilitate the colonization of invasive species, alter soil properties and nutrient levels, increase erosion, modify hydrological regimes and aquatic ecosystems, and alter patterns of landscape heterogeneity. These impacts can be assigned to three broad and interrelated effects: (1) altered stand structural complexity; (2) altered ecosystem processes and functions; and (3) altered populations of species and community composition. Some impacts may be different from or additional to the effects of traditional logging that is not preceded by a large natural disturbance because the conditions before, during, and after salvage logging may differ from those that characterize traditional timber harvesting. The potential impacts of salvage logging often have been overlooked, partly because the processes of ecosystem recovery after natural disturbance are still poorly understood and partly because potential cumulative effects of natural and human disturbance have not been well documented. Ecologically informed policies regarding salvage logging are needed prior to major natural disturbances so that when they occur, ad hoc and crisis-mode decision making can be avoided. These policies should lead to salvage-exemption zones and limits on the amounts of disturbance-derived biological legacies (e.g., burned trees, logs) that are removed where salvage logging takes place. Finally, we believe new terminology is needed. The word salvage implies that something is being saved or recovered, whereas from an ecological perspective this is rarely the case. PMID:16922212
Refining clinical diagnosis with likelihood ratios.
Grimes, David A; Schulz, Kenneth F
Likelihood ratios can refine clinical diagnosis on the basis of signs and symptoms; however, they are underused in patient care. A likelihood ratio is the percentage of ill people with a given test result divided by the percentage of well individuals with the same result. Ideally, abnormal test results should be much more typical in ill individuals than in those who are well (high likelihood ratio), and normal test results should be more frequent in well people than in sick people (low likelihood ratio). Likelihood ratios near unity have little effect on decision-making; by contrast, high or low ratios can greatly shift the clinician's estimate of the probability of disease. Likelihood ratios can be calculated not only for dichotomous (positive or negative) tests but also for tests with multiple levels of results, such as creatine kinase or ventilation-perfusion scans. When combined with an accurate clinical diagnosis, likelihood ratios from ancillary tests improve diagnostic accuracy in a synergistic manner. PMID:15850636
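The arithmetic behind these ratios is simple; a minimal Python sketch (the sensitivity and specificity values are hypothetical, not from the paper):

```python
def likelihood_ratios(sensitivity, specificity):
    """LR+ = P(test+ | ill) / P(test+ | well); LR- = P(test- | ill) / P(test- | well)."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

def posttest_probability(pretest_prob, lr):
    """Shift a pretest probability through a likelihood ratio via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1.0 + posttest_odds)

# Hypothetical test: sensitivity 0.90, specificity 0.95
lr_pos, lr_neg = likelihood_ratios(0.90, 0.95)  # LR+ ~ 18, LR- ~ 0.105
p = posttest_probability(0.20, lr_pos)          # 0.20 pretest rises to ~0.82
```

A high LR+ (here ~18) moves a modest pretest probability sharply upward, which is the "great shift" the abstract describes.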
Coal log pipeline: Development status of the first commercial system
Marrero, T.R.
1996-12-31
The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline, and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at the pipeline exit, safety, and environmental friendliness. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% of the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. The coal logs at their destination come out of the pipeline onto a moving conveyor which transports the logs to a crusher or stockpile. Coal logs are crushed to match the size of existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with and without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90°C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of the coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than approximately 80 km. CLP is much more economical than a coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.
Swift, T.E.; Goodrich, J.H.; Kumar, R.M.; McCoy, R.L.; Wilhelm, M.H.; Glascock, M.R.
1982-01-01
The coring, logging and testing of Bennett Ranch Unit well No. 310 was a cooperative effort between Texas Pacific, owner of the well, and Gruy Federal, Inc. The requirements of the contract, which are summarized in Enclosure 1, Appendix A, include drilling and coring activities. The pressure-coring and associated logging and testing programs in selected wells are intended to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report presents detailed information on the first such project. This project demonstrates the usefulness of integrating pressure core, log and production data to realistically evaluate a reservoir for carbon dioxide flood. The engineering of tests and analysis of such experimental data requires original thinking, but the reliability of the results is higher than data derived from conventional tests.
Model Fit after Pairwise Maximum Likelihood
Barendse, M. T.; Ligtvoet, R.; Timmerman, M. E.; Oort, F. J.
2016-01-01
Maximum likelihood factor analysis of discrete data within the structural equation modeling framework rests on the assumption that the observed discrete responses are manifestations of underlying continuous scores that are normally distributed. As maximizing the likelihood of multivariate response patterns is computationally very intensive, the sum of the log-likelihoods of the bivariate response patterns is maximized instead. Little is yet known about how to assess model fit when the analysis is based on such a pairwise maximum likelihood (PML) of two-way contingency tables. We propose new fit criteria for the PML method and conduct a simulation study to evaluate their performance in model selection. With large sample sizes (500 or more), PML performs as well as the robust weighted least squares analysis of polychoric correlations. PMID:27148136
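The PML objective, a sum of multinomial log-likelihoods over two-way contingency tables, can be sketched as follows; the toy tables and model cell probabilities are illustrative, not from the study:

```python
import numpy as np

def pairwise_loglik(tables, probs):
    """Pairwise log-likelihood: sum over variable pairs of the multinomial
    log-likelihood of each two-way contingency table.
    tables[p] holds observed counts; probs[p] the model cell probabilities."""
    total = 0.0
    for n, p in zip(tables, probs):
        n = np.asarray(n, dtype=float)
        p = np.asarray(p, dtype=float)
        total += np.sum(n * np.log(p))
    return total

# Toy example: two item pairs, each a 2x2 table (counts and model probabilities)
tables = [np.array([[30, 10], [10, 50]]), np.array([[40, 10], [20, 30]])]
probs  = [np.array([[0.3, 0.1], [0.1, 0.5]]), np.array([[0.4, 0.1], [0.2, 0.3]])]
ll = pairwise_loglik(tables, probs)
```

In an actual PML fit, `probs` would be implied by the factor model parameters (thresholds and polychoric-type correlations), and `ll` would be maximized over those parameters.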
Smith, H. D. Jr.; Schultz, W. E.
1985-01-08
This invention relates to an improved method for determining the oil saturation of subsurface earth formations in the vicinity of a well borehole. High energy neutrons irradiate the subsurface earth formations and gamma rays caused by inelastic scatter with the subsurface earth formation constituent materials are measured. For a chosen borehole depth, gamma ray logs are taken in different situations: first, with the formation fluid water and oil mixture in an undisturbed state; second, after flushing the formation with alcohol to displace the formation water and oil mixture; and, finally, after flushing the alcohol from the formation with water to obtain a measurement with no oil in the formation. The gamma ray measurements obtained are then used to determine the oil saturation without requiring knowledge of the porosity of the earth formation, borehole conditions or formation type. When the original oil content of the formation is at a naturally flushed, or residual, oil saturation, the present invention may be used to determine the residual oil saturation.
Maximum-Likelihood Detection Of Noncoherent CPM
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Simon, Marvin K.
1993-01-01
Simplified detectors proposed for use in maximum-likelihood-sequence detection of symbols in alphabet of size M transmitted by uncoded, full-response continuous phase modulation over radio channel with additive white Gaussian noise. Structures of receivers derived from particular interpretation of maximum-likelihood metrics. Receivers include front ends, structures of which depend only on M, analogous to those in receivers of coherent CPM. Parts of receivers following front ends have structures, complexity of which would depend on N.
3. Log bunkhouse (far left), log chicken house (left of ...
3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Quasi-likelihood for Spatial Point Processes
Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus
2014-01-01
Fitting regression models for intensity functions of spatial point processes is of great interest in ecological and epidemiological studies of association between spatially referenced events and geographical or environmental covariates. When Cox or cluster process models are used to accommodate clustering not accounted for by the available covariates, likelihood based inference becomes computationally cumbersome due to the complicated nature of the likelihood function and the associated score function. It is therefore of interest to consider alternative more easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation which in practice is solved numerically. The derivation of the optimal estimating function has close similarities to the derivation of quasi-likelihood for standard data sets. The approximate solution is further equivalent to a quasi-likelihood score for binary spatial data. We therefore use the term quasi-likelihood for our optimal estimating function approach. We demonstrate in a simulation study and a data example that our quasi-likelihood method for spatial point processes is both statistically and computationally efficient. PMID:26041970
Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon
NASA Astrophysics Data System (ADS)
Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin
2014-04-01
The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms.
Logging concessions enable illegal logging crisis in the Peruvian Amazon.
Finer, Matt; Jenkins, Clinton N; Sky, Melissa A Blue; Pine, Justin
2014-01-01
The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US-Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552
Cox, Murray P.; Mendez, Fernando L.; Karafet, Tatiana M.; Pilkington, Maya Metni; Kingan, Sarah B.; Destro-Bisol, Giovanni; Strassmann, Beverly I.; Hammer, Michael F.
2008-01-01
A 2.4-kb stretch within the RRM2P4 region of the X chromosome, previously sequenced in a sample of 41 globally distributed humans, displayed both an ancient time to the most recent common ancestor (e.g., a TMRCA of ∼2 million years) and a basal clade composed entirely of Asian sequences. This pattern was interpreted to reflect a history of introgressive hybridization from archaic hominins (most likely Asian Homo erectus) into the anatomically modern human genome. Here, we address this hypothesis by resequencing the 2.4-kb RRM2P4 region in 131 African and 122 non-African individuals and by extending the length of sequence in a window of 16.5 kb encompassing the RRM2P4 pseudogene in a subset of 90 individuals. We find that both the ancient TMRCA and the skew in non-African representation in one of the basal clades are essentially limited to the central 2.4-kb region. We define a new summary statistic called the minimum clade proportion (pmc), which quantifies the proportion of individuals from a specified geographic region in each of the two basal clades of a binary gene tree, and then employ coalescent simulations to assess the likelihood of the observed central RRM2P4 genealogy under two alternative views of human evolutionary history: recent African replacement (RAR) and archaic admixture (AA). A molecular-clock-based TMRCA estimate of 2.33 million years is a statistical outlier under the RAR model; however, the large variance associated with this estimate makes it difficult to distinguish the predictions of the human origins models tested here. The pmc summary statistic, which has improved power with larger samples of chromosomes, yields values that are significantly unlikely under the RAR model and fit expectations better under a range of archaic admixture scenarios. PMID:18202385
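A literal reading of the pmc definition can be sketched as follows; the region labels and clade memberships are hypothetical, and the authors' actual estimator may differ in detail:

```python
def minimum_clade_proportion(clade_a, clade_b, region):
    """pmc: for the two basal clades of a binary gene tree, compute the
    proportion of sampled individuals from `region` in each clade and
    return the smaller of the two proportions."""
    prop_a = sum(1 for r in clade_a if r == region) / len(clade_a)
    prop_b = sum(1 for r in clade_b if r == region) / len(clade_b)
    return min(prop_a, prop_b)

# Hypothetical tree: one basal clade mostly Asian, the other mixed.
# The African proportion is 0.1 in the first clade and 0.6 in the second.
pmc = minimum_clade_proportion(["Asia"] * 9 + ["Africa"],
                               ["Africa"] * 6 + ["Asia"] * 4,
                               "Africa")
```

A low pmc for Africa, as in this toy tree, is the kind of skew the abstract reports is unlikely under the recent-African-replacement model.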
Log interpretation of shaly sandstones
Baker, J.F.
1988-01-01
The determination of water saturation from electrical resistivity measurements to evaluate the potential of reservoirs is a fundamental tool of the oil industry. Shaly sandstones are difficult to evaluate because clays are conductive and they lower the resistivity of the rock. A review of shaly-sandstone research concerning "volume-of-shale" equations reveals three theoretical categories: (1) laminated clay equations, (2) dispersed clay equations, and (3) equations that assume that the effect of the clays on the conductivity measurement is directly related to water saturation. A new model for predicting the relative amounts of laminated and dispersed shales and accounting for their effects according to their abundance can be used for any sandstone, clean or shaly. Equations representing each of the three theoretical categories and the new equation were tested on cored Wilcox sandstones from two wells. Cores were analyzed to determine the volume and distribution of clays and to correlate porosity with the well logs.
Well Logging with Californium-252
Boulogne, A.R.
2003-01-06
Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.
Code of Federal Regulations, 2013 CFR
2013-10-01
... expressed in Coordinated Universal Time (UTC). (2) “ON WATCH” must be entered by the operator beginning a... until the claim or complaint has been satisfied or barred by statute limiting the time for filing suits... log by the operator's signature. (2) The date and time of making an entry must be shown opposite...
Code of Federal Regulations, 2012 CFR
2012-10-01
... expressed in Coordinated Universal Time (UTC). (2) “ON WATCH” must be entered by the operator beginning a... until the claim or complaint has been satisfied or barred by statute limiting the time for filing suits... log by the operator's signature. (2) The date and time of making an entry must be shown opposite...
Code of Federal Regulations, 2014 CFR
2014-10-01
... expressed in Coordinated Universal Time (UTC). (2) “ON WATCH” must be entered by the operator beginning a... until the claim or complaint has been satisfied or barred by statute limiting the time for filing suits... log by the operator's signature. (2) The date and time of making an entry must be shown opposite...
NASA Astrophysics Data System (ADS)
Schuchardt, A.; Morche, D.
2015-12-01
Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design to decipher the short- to long-term (c. 10^-2 to 10^2 years) effects of large wood on sediment (dis)connectivity. The study areas are two basins in mountain ranges in Germany and Austria. In Austria the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are: • to identify important triggers for LW transport from slopes into the channels; • to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements; • to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts; • to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and to measure bed load up- and downstream of log jams using a portable Helley
Boldman, K G; Van Vleck, L D
1991-12-01
Estimation of (co)variance components by derivative-free REML requires repeated evaluation of the log-likelihood function of the data. Gaussian elimination of the augmented mixed model coefficient matrix is often used to evaluate the likelihood function, but it can be costly for animal models with large coefficient matrices. This study investigated the use of a direct sparse matrix solver to obtain the log-likelihood function. The sparse matrix package SPARSPAK was used to reorder the mixed model equations once and then repeatedly to solve the equations by Cholesky factorization to generate the terms required to calculate the likelihood. The animal model used for comparison contained 19 fixed levels, 470 maternal permanent environmental effects, and 1586 direct and 1586 maternal genetic effects, resulting in a coefficient matrix of order 3661 with .3% nonzero elements after including numerator relationships. Compared with estimation via Gaussian elimination of the unordered system, utilization of SPARSPAK required 605 and 240 times less central processing unit time on mainframes and personal computers, respectively. The SPARSPAK package also required less memory and provided solutions for all effects in the model. PMID:1787202
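The key computational idea is that a single Cholesky factorization of the coefficient matrix yields both the log-determinant needed by the REML log-likelihood and the solutions of the mixed model equations. A dense toy sketch in Python (SPARSPAK itself operates on sparse, fill-reducing reordered systems; the matrix here is illustrative):

```python
import numpy as np

def logdet_and_solve(C, rhs):
    """Factor C = L L' once, then reuse L for both the log-determinant
    (log det C = 2 * sum(log(diag(L)))) and the system solution."""
    L = np.linalg.cholesky(C)
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    y = np.linalg.solve(L, rhs)    # forward substitution
    x = np.linalg.solve(L.T, y)    # back substitution
    return logdet, x

# Toy symmetric positive definite coefficient matrix and right-hand side
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
rhs = np.array([1.0, 2.0, 3.0])
logdet, sol = logdet_and_solve(C, rhs)
```

For the animal model in the study, each derivative-free REML likelihood evaluation repeats this factorization on a reordered sparse matrix of order 3661, which is where the reported speedups over Gaussian elimination of the unordered system come from.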
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.
2013-12-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Particularly dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
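The headline statistic (roughly 80% of the load moving in 5% of the monitoring period) is a load-duration calculation; a sketch on synthetic data (the lognormal record is illustrative, not the Chilean measurements):

```python
import numpy as np

def load_in_top_fraction(flux, frac=0.05):
    """Fraction of total sediment load carried during the highest-flux
    `frac` of the record."""
    flux = np.sort(np.asarray(flux, dtype=float))[::-1]  # descending
    k = max(1, int(round(frac * flux.size)))
    return flux[:k].sum() / flux.sum()

# Synthetic heavy-tailed flux record (illustration only, not the study data)
rng = np.random.default_rng(0)
flux = rng.lognormal(mean=0.0, sigma=2.0, size=10_000)
share = load_in_top_fraction(flux, 0.05)  # a few extreme events dominate
```

For a uniform record the top 5% of time steps would carry exactly 5% of the load; heavy-tailed records like this one concentrate the load in rare events, which is what the QRF simulations resolve at 3-minute resolution.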
A new approach for deriving pseudovelocity logs from resistivity logs
Dos Santos, W.L.B.; Ulrych, T.J.; De Lima, O.A.L.
1988-01-01
This paper describes a method of generating pseudovelocity logs using measurements of electrical resistivity. A theoretical relation between electrical resistivity and transit time, which is applicable to a wide range of lithologies, has been developed. The application of this relation using a method which defines lithoresistivity zones as lithological intervals related to the same formation and showing small resistivity variations, has been tested in the Reconcavo sedimentary basin in Bahia, Brazil. A comparison of derived pseudovelocity logs with actual sonic logs for five wells shows the validity of the present approach.
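The paper develops its own resistivity-to-transit-time relation; as a stand-in, the classical Faust (1953) power-law form illustrates the idea of a pseudovelocity log. The constant 1948 is the commonly quoted value for ft/s units and should be treated as illustrative, not as the authors' calibration:

```python
def faust_velocity(depth_ft, resistivity_ohmft, gamma=1948.0):
    """Faust (1953) empirical relation: Vp (ft/s) = gamma * (Z * R)**(1/6),
    with depth Z in feet and formation resistivity R."""
    return gamma * (depth_ft * resistivity_ohmft) ** (1.0 / 6.0)

def pseudo_transit_time(depth_ft, resistivity_ohmft):
    """Pseudo-sonic transit time in microseconds per foot."""
    return 1.0e6 / faust_velocity(depth_ft, resistivity_ohmft)

# Illustrative point: a resistive interval at depth
dt = pseudo_transit_time(6000.0, 10.0)  # transit time in us/ft
```

Applied interval by interval within lithoresistivity zones, such a relation converts a resistivity log into the pseudovelocity log that the paper compares against actual sonic logs.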
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1982-01-01
An all-sky survey of X-ray sources was performed, complete to a limiting sensitivity of 3.1 × 10^-11 ergs/sq cm/s in the 2-10 keV band. The complete sample has allowed construction of luminosity functions based on a flux-limited sample for clusters of galaxies and active galactic nuclei. Integration of the best-fit luminosity functions indicates that clusters of galaxies contribute about 4% of the 2-10 keV DXRB, and active galactic nuclei about 20%. It is predicted that many of the objects seen in the deep survey should be local, relatively low luminosity active galactic nuclei and clusters of galaxies.
Likelihoods for fixed rank nomination networks.
Hoff, Peter; Fosdick, Bailey; Volfovsky, Alex; Stovel, Katherine
2013-12-01
Many studies that gather social network data use survey methods that lead to censored, missing, or otherwise incomplete information. For example, the popular fixed rank nomination (FRN) scheme, often used in studies of schools and businesses, asks study participants to nominate and rank at most a small number of contacts or friends, leaving the existence of other relations uncertain. However, most statistical models are formulated in terms of completely observed binary networks. Statistical analyses of FRN data with such models ignore the censored and ranked nature of the data and could potentially result in misleading statistical inference. To investigate this possibility, we compare Bayesian parameter estimates obtained from a likelihood for complete binary networks with those obtained from likelihoods that are derived from the FRN scheme, and therefore accommodate the ranked and censored nature of the data. We show analytically and via simulation that the binary likelihood can provide misleading inference, particularly for certain model parameters that relate network ties to characteristics of individuals and pairs of individuals. We also compare these different likelihoods in a data analysis of several adolescent social networks. For some of these networks, the parameter estimates from the binary and FRN likelihoods lead to different conclusions, indicating the importance of analyzing FRN data with a method that accounts for the FRN survey design. PMID:25110586
NASA Astrophysics Data System (ADS)
Major, G. A.; Fretwell, H. M.; Dugdale, S. B.; Alam, M. A.
1998-11-01
A novel method for reconstructing the Fermi surface from experimental two-dimensional angular correlation of positron annihilation radiation (2D-ACAR) projections is proposed. In this algorithm, the 3D electron momentum-density distribution is expanded in terms of a basis of wavelet-like functions. The parameters of the model, the wavelet coefficients, are determined by maximizing the likelihood function corresponding to the experimental data and the projections calculated from the model. In contrast to other expansions, the wavelet expansion requires only a relatively small number of model parameters to represent the relevant parts of the 3D distribution, thus keeping computation times reasonably short. Unlike other reconstruction methods, this algorithm takes full account of the statistical information content of the data and therefore may help to reduce the amount of time needed for data acquisition. An additional advantage of the wavelet expansion may be the possibility of retrieving the Fermi surface directly from the wavelet coefficients rather than indirectly from the reconstructed 3D distribution.
Bonner, B; Hutchings, L; Kasameyer, P
2006-06-14
We devise a strategy for analysis of Vp and Vs microearthquake tomography results in the Salton Sea geothermal field to identify important features of the geothermal reservoir. We first interpret rock properties in the State 2-14 borehole based upon logged core through the reservoir. Then, we interpret seismic recordings in the well (Daley et al., 1988) to develop the strategy. We hypothesize that mapping Poisson's ratio has two applications for the Salton Sea geothermal reservoir: (1) to map the top of the reservoir, and (2) as a diagnostic for permeable zones. Poisson's ratio can be obtained from Vp and Vs. In the State 2-14 borehole, Poisson's ratio calculated from large-scale averages (~150 m) shows a monotonic decrease with depth to about 1300 m, at which point it increases with depth. Our model is that the monotonic decrease is due to compaction, and the increase below 1300 m is due to the rocks being hydrothermally altered. We hypothesize that we can map the depth to alteration by identifying the transition from decreasing to increasing values and thus map the top of the reservoir, which is associated with a known increase in sulfide, chlorite, and epidote alteration that may be indicative of hydrothermal activity. We also observe (from the Daley et al. plots) an anomalous drop in Poisson's ratio at a depth of about 900 m, within a sandstone formation. The sandstone has a P-wave velocity significantly higher than the siltstone above it but a lower velocity in the lower half of the formation relative to the upper half. We interpret the relative decrease in velocity to be due to fracturing and chemical alteration caused by permeability. We conclude that using Vp and Vs tomography results to obtain images of Poisson's ratio has the potential to identify significant features in the geothermal reservoir in this geologic setting. Seismic attenuation tomography results (mapped as Qp and Qs) should also be useful for evaluating geothermal reservoirs, but that is not
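The Poisson's-ratio mapping described above rests on the standard isotropic-elasticity relation between the ratio and the P- and S-wave velocities; a minimal sketch (the formula is textbook elasticity, not taken from the report):

```python
import math

def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (isotropic elasticity):
    nu = (Vp^2 - 2*Vs^2) / (2*(Vp^2 - Vs^2))."""
    vp2, vs2 = vp * vp, vs * vs
    return (vp2 - 2.0 * vs2) / (2.0 * (vp2 - vs2))

# For Vp/Vs = sqrt(3), Poisson's ratio is exactly 0.25.
print(poissons_ratio(math.sqrt(3.0), 1.0))  # -> 0.25
```

Applied to Vp and Vs tomography voxels, the same formula yields the Poisson's-ratio volume whose depth trends the authors interpret.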
MAIL LOG, program summary and specifications
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The summary and specifications of the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records in the data base; and (4) archive - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
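The four modes described above amount to a small create/modify/query/store workflow over a record data base; a toy sketch, with entirely hypothetical record fields and keys, since the 1979 summary does not specify the data-base layout:

```python
class MailLog:
    """Toy sketch of MAIL LOG's four modes: input, revise, search, archive.
    Record fields and ids are hypothetical illustrations."""

    def __init__(self):
        self.records = {}    # active data base, keyed by record id
        self.archived = {}   # records put away by the archive mode

    def input(self, rec_id, fields):          # mode 1: add a new record
        if rec_id in self.records:
            raise KeyError(f"record {rec_id} already exists; use revise")
        self.records[rec_id] = dict(fields)

    def revise(self, rec_id, **changes):      # mode 2: modify an existing record
        self.records[rec_id].update(changes)

    def search(self, **criteria):             # mode 3: find matching records
        return [rid for rid, rec in self.records.items()
                if all(rec.get(k) == v for k, v in criteria.items())]

    def archive(self, rec_id):                # mode 4: store a record away
        self.archived[rec_id] = self.records.pop(rec_id)

log = MailLog()
log.input(1, {"to": "SPADS", "status": "new"})
log.revise(1, status="sent")
print(log.search(status="sent"))  # -> [1]
log.archive(1)
```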
Collaborative double robust targeted maximum likelihood estimation.
van der Laan, Mark J; Gruber, Susan
2010-01-01
Collaborative double robust targeted maximum likelihood estimators represent a fundamental further advance over standard targeted maximum likelihood estimators of a pathwise differentiable parameter of a data generating distribution in a semiparametric model, introduced in van der Laan, Rubin (2006). The targeted maximum likelihood approach involves fluctuating an initial estimate of a relevant factor (Q) of the density of the observed data, in order to make a bias/variance tradeoff targeted towards the parameter of interest. The fluctuation involves estimation of a nuisance parameter portion of the likelihood, g. TMLE has been shown to be consistent and asymptotically normally distributed (CAN) under regularity conditions, when either one of these two factors of the likelihood of the data is correctly specified, and it is semiparametric efficient if both are correctly specified. In this article we provide a template for applying collaborative targeted maximum likelihood estimation (C-TMLE) to the estimation of pathwise differentiable parameters in semi-parametric models. The procedure creates a sequence of candidate targeted maximum likelihood estimators based on an initial estimate for Q coupled with a succession of increasingly non-parametric estimates for g. In a departure from current state of the art nuisance parameter estimation, C-TMLE estimates of g are constructed based on a loss function for the targeted maximum likelihood estimator of the relevant factor Q that uses the nuisance parameter to carry out the fluctuation, instead of a loss function for the nuisance parameter itself. Likelihood-based cross-validation is used to select the best estimator among all candidate TMLE estimators of Q(0) in this sequence. A penalized-likelihood loss function for Q is suggested when the parameter of interest is borderline-identifiable. We present theoretical results for "collaborative double robustness," demonstrating that the collaborative targeted maximum
Enhanced carbon-oxygen log interpretations using supplemental log curves
Wyatt, D.F. Jr.; Jacobson, L.A.; Fox, P.
1994-12-31
Supplemental measurements from induced nuclear spectrometry tools are examined to demonstrate what additional information they provide about the well and reservoir conditions. Logs in shut-in wells from Indonesia provide examples of oxygen activation measurements showing cross-flow from one reservoir to another via open perforations. Leaking squeezed perforations were also observed. An example from Alaska shows radioactive scale build-up in the casing which spectral analysis identifies as a mixture of uranium and thorium salts. Another log, where the casing fluid was replaced with crude oil, demonstrates a technique for identifying cement channels. Logs from Nigeria comparing oil saturation estimates before and after a squeeze operation illustrate the effect of casing fluid flushing of the formation through open perforations. Understanding the diagnostic character of these curves leads to higher confidence in the overall log interpretation process.
Maximum likelihood clustering with dependent feature trees
NASA Technical Reports Server (NTRS)
Chittineni, C. B. (Principal Investigator)
1981-01-01
The decomposition of mixture density of the data into its normal component densities is considered. The densities are approximated with first order dependent feature trees using criteria of mutual information and distance measures. Expressions are presented for the criteria when the densities are Gaussian. By defining different types of nodes in a general dependent feature tree, maximum likelihood equations are developed for the estimation of parameters using fixed point iterations. The field structure of the data is also taken into account in developing maximum likelihood equations. Experimental results from the processing of remotely sensed multispectral scanner imagery data are included.
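The fixed-point maximum-likelihood iteration mentioned above is, in its simplest (independent-feature, one-dimensional) form, the familiar EM update for a Gaussian mixture; a sketch under that simplification, not the paper's dependent-feature-tree version:

```python
import math
import random

def em_gmm_1d(xs, n_iter=200):
    """Fixed-point (EM) iteration of the ML equations for a two-component
    1-D Gaussian mixture: E-step responsibilities, M-step parameter updates."""
    mu = [min(xs), max(xs)]          # crude initialization from the data spread
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior probability that each point came from each component
        resp = []
        for x in xs:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: responsibility-weighted means, variances, mixing proportions
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            w[k] = nk / len(xs)
    return w, mu, var

random.seed(0)
xs = [random.gauss(-2.0, 0.5) for _ in range(300)] + \
     [random.gauss(3.0, 0.5) for _ in range(300)]
w, mu, var = em_gmm_1d(xs)
```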
Park, Chung Gyoo; Son, Joon-Ki; Lee, Byung-Ho; Cho, Jin Hoon; Ren, Yonglin
2014-12-01
Bursaphelenchus xylophilus, commonly known as the pinewood nematode, is a quarantine pest in Japan and is most often associated with beetles of the genus Monochamus, the pine sawyers, particularly Monochamus alternatus. Long-distance dispersal of the nematode and its vectors has led to widespread losses in pine forests. Two fumigation trials were conducted for treatment of logs naturally infested with both M. alternatus and B. xylophilus. The logs were treated with ethanedinitrile or metam sodium at low temperature (-7-25.7°C and -3.7-23.1°C) for 3-d exposure in winter and early spring. Fumigation with ethanedinitrile at concentrations of 48, 68, 97 and 158 g/m(3) resulted in 34.6-58.3, 91.5-97.2, 100, and 100% mortality for M. alternatus and 88.4, 77.9, 96.4, and 98.0% mortality for B. xylophilus, respectively. With metam sodium fumigation at a dose rate of 1162 g/m(3), 100% of M. alternatus and 97.4% of B. xylophilus were killed. These results suggest that 97 g/m(3) of ethanedinitrile is adequate for complete control of M. alternatus in pine wood, while 158 g/m(3) controls >98% of B. xylophilus and is required for its eradication in low-temperature winter or spring fumigation. Therefore, ethanedinitrile has great potential for treatment of fresh pine logs to manage the nematodes and the vector insects at low temperature. PMID:26470069
Mohaghegh, S.; Balan, B.; Ameri, S.
1995-12-31
The ultimate test for any technique that bears the claim of permeability prediction from well log data is accurate and verifiable prediction of permeability for wells from which only the well log data is available. So far, all the available models and techniques have been tried on data that includes both well logs and the corresponding permeability values. This approach at best is nothing more than linear or nonlinear curve fitting. The objective of this paper is to test the capability of the most promising of these techniques in independent prediction of permeability (where corresponding permeability values are not available or have not been used in development of the model) in a heterogeneous formation. These techniques are "Multiple Regression" and "Virtual Measurements using Artificial Neural Networks." For the purposes of this study, several wells from a heterogeneous formation in West Virginia were selected. Well log data and corresponding permeability values for these wells were available. One well was set aside for testing, the techniques were applied to the remaining data, and a permeability model for the field was developed. The model was then applied to the well that was separated from the rest of the data earlier and the results were compared. This approach tests the generalization power of each technique. The results show that although Multiple Regression provides acceptable results for wells that were used during model development (good curve fitting), it lacks a consistent generalization capability, meaning that it does not perform as well with data it has not been exposed to (the data from the well that was put aside). On the other hand, the Virtual Measurement technique provides a steady generalization power. This technique is able to perform the permeability prediction task even for wells with no prior exposure to their permeability profile.
Franco-Pedroso, Javier; Ramos, Daniel; Gonzalez-Rodriguez, Joaquin
2016-01-01
In forensic science, trace evidence found at a crime scene and on a suspect has to be evaluated from the measurements performed on it, usually in the form of multivariate data (for example, several chemical compounds or physical characteristics). In order to assess the strength of that evidence, the likelihood ratio framework is being increasingly adopted. Several methods have been derived in order to obtain likelihood ratios directly from univariate or multivariate data by modelling both the variation appearing between observations (or features) coming from the same source (within-source variation) and that appearing between observations coming from different sources (between-source variation). In the widely used multivariate kernel likelihood ratio, the within-source distribution is assumed to be normally distributed and constant among different sources, and the between-source variation is modelled through a kernel density function (KDF). In order to better fit the observed distribution of the between-source variation, this paper presents a different approach in which a Gaussian mixture model (GMM) is used instead of a KDF. As will be shown, this approach provides better-calibrated likelihood ratios as measured by the log-likelihood-ratio cost (Cllr) in experiments performed on freely available forensic datasets involving different types of trace evidence: inks, glass fragments, and car paints. PMID:26901680
NASA Astrophysics Data System (ADS)
Fu, Qiang; Luk, Wai-Shing; Tao, Jun; Zeng, Xuan; Cai, Wei
In this paper, a novel intra-die spatial correlation extraction method referred to as MLEMTC (Maximum Likelihood Estimation for Multiple Test Chips) is presented. In the MLEMTC method, a joint likelihood function is formulated by multiplying the set of individual likelihood functions for all test chips. This joint likelihood function is then maximized to extract a unique group of parameter values of a single spatial correlation function, which can be used for statistical circuit analysis and design. Moreover, to deal with the purely random component and measurement error contained in measurement data, the spatial correlation function combined with the correlation of white noise is used in the extraction, which significantly improves the accuracy of the extraction results. Furthermore, an LU decomposition based technique is developed to calculate the log-determinant of the positive definite matrix within the likelihood function, which solves the numerical stability problem encountered in the direct calculation. Experimental results have shown that the proposed method is efficient and practical.
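The log-determinant trick described above can be illustrated with a Cholesky factorization (appropriate here because correlation matrices are positive definite; the paper itself uses an LU decomposition): summing logarithms of the factor's diagonal avoids forming the possibly over- or under-flowing determinant directly.

```python
import math

def cholesky(a):
    """Lower-triangular Cholesky factor L of a symmetric positive definite
    matrix a, so that a = L * L^T."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

def log_det_spd(a):
    """log|A| for SPD A via Cholesky: log|A| = 2 * sum(log(diag(L))).
    Summing logs is numerically stable where det(A) itself would overflow
    or underflow for large matrices."""
    L = cholesky(a)
    return 2.0 * sum(math.log(L[i][i]) for i in range(len(a)))

A = [[4.0, 2.0], [2.0, 3.0]]          # det(A) = 8
print(abs(log_det_spd(A) - math.log(8.0)) < 1e-9)  # -> True
```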
New materials for fireplace logs
NASA Technical Reports Server (NTRS)
Kieselback, D. J.; Smock, A. W.
1971-01-01
Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. Insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as aggregate.
Growing local likelihood network: Emergence of communities
NASA Astrophysics Data System (ADS)
Chen, S.; Small, M.
2015-10-01
In many real situations, networks grow only via local interactions. New nodes are added to the growing network with information only pertaining to a small subset of existing nodes. Multilevel marketing, social networks, and disease models can all be depicted as growing networks based on local (network path-length) distance information. In these examples, all nodes whose distance from a chosen center is less than d form a subgraph. Hence, we grow networks with information only from these subgraphs. Moreover, we use a likelihood-based method, where at each step we modify the networks by changing their likelihood to be closer to the expected degree distribution. Combining the local information and the likelihood method, we grow networks that exhibit novel features. We discover that the likelihood method, over certain parameter ranges, can generate networks with highly modulated communities, even when global information is not available. Communities and clusters are abundant in real-life networks, and the method proposed here provides a natural mechanism for the emergence of communities in scale-free networks. In addition, the algorithmic implementation of network growth via local information is substantially faster than global methods and allows for the exploration of much larger networks.
Numerical likelihood analysis of cosmic ray anisotropies
Hojvat, Carlos, et al.
2003-07-02
A numerical likelihood approach to the determination of cosmic ray anisotropies is presented which offers many advantages over other approaches. It allows a wide range of statistically meaningful hypotheses to be compared even when full sky coverage is unavailable, can be readily extended in order to include measurement errors, and makes maximum unbiased use of all available information.
Efficient Bit-to-Symbol Likelihood Mappings
NASA Technical Reports Server (NTRS)
Moision, Bruce E.; Nakashima, Michael A.
2010-01-01
This innovation is an efficient algorithm designed to perform bit-to-symbol and symbol-to-bit likelihood mappings that represent a significant portion of the complexity of an error-correction code decoder for high-order constellations. Recent implementation of the algorithm in hardware has yielded an 8-percent reduction in overall area relative to the prior design.
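The mapping being accelerated can be stated directly: the LLR of each bit is the log ratio of summed symbol likelihoods over labels whose bit is 0 versus 1. A brute-force sketch of that exact mapping (the innovation's efficient algorithm itself is not described in the summary; the likelihood values below are hypothetical):

```python
import math

def bit_llrs(symbol_likelihoods, bits_per_symbol=2):
    """Exact symbol-to-bit likelihood mapping: for each bit position b, the LLR
    is log( sum of p(y|s) over symbols s with bit b = 0 /
            sum of p(y|s) over symbols s with bit b = 1 )."""
    llrs = []
    for b in range(bits_per_symbol):
        p0 = sum(p for s, p in enumerate(symbol_likelihoods) if not (s >> b) & 1)
        p1 = sum(p for s, p in enumerate(symbol_likelihoods) if (s >> b) & 1)
        llrs.append(math.log(p0 / p1))
    return llrs

# Four symbol likelihoods p(y | s) for labels s = 0..3 (hypothetical values):
llrs = bit_llrs([0.70, 0.15, 0.10, 0.05])
```

This direct form costs a sum over all 2^m symbols per bit; the hardware algorithm reduces exactly this work for high-order constellations.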
Woody, Michael S; Lewis, John H; Greenberg, Michael J; Goldman, Yale E; Ostap, E Michael
2016-07-26
We present MEMLET (MATLAB-enabled maximum-likelihood estimation tool), a simple-to-use and powerful program for utilizing maximum-likelihood estimation (MLE) for parameter estimation from data produced by single-molecule and other biophysical experiments. The program is written in MATLAB and includes a graphical user interface, making it simple to integrate into the existing workflows of many users without requiring programming knowledge. We give a comparison of MLE and other fitting techniques (e.g., histograms and cumulative frequency distributions), showing how MLE often outperforms other fitting methods. The program includes a variety of features. 1) MEMLET fits probability density functions (PDFs) for many common distributions (exponential, multiexponential, Gaussian, etc.), as well as user-specified PDFs without the need for binning. 2) It can take into account experimental limits on the size of the shortest or longest detectable event (i.e., instrument "dead time") when fitting to PDFs. The proper modification of the PDFs occurs automatically in the program and greatly increases the accuracy of fitting the rates and relative amplitudes in multicomponent exponential fits. 3) MEMLET offers model testing (i.e., single-exponential versus double-exponential) using the log-likelihood ratio technique, which shows whether additional fitting parameters are statistically justifiable. 4) Global fitting can be used to fit data sets from multiple experiments to a common model. 5) Confidence intervals can be determined via bootstrapping utilizing parallel computation to increase performance. Easy-to-follow tutorials show how these features can be used. This program packages all of these techniques into a simple-to-use and well-documented interface to increase the accessibility of MLE fitting. PMID:27463130
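Feature 2 above, correcting a fit for the instrument dead time, has a closed form in the single-exponential case: conditioning on t >= t_min gives the density k*exp(-k*(t - t_min)), whose MLE is 1/(mean(t) - t_min). A sketch of that special case only, not MEMLET's general machinery:

```python
import random

def mle_exponential_with_dead_time(times, t_min):
    """ML rate estimate for exponential dwell times observed only above an
    instrument dead time t_min. The conditional density is k*exp(-k*(t - t_min)),
    so the maximum-likelihood estimate is 1 / (mean(t) - t_min)."""
    mean_t = sum(times) / len(times)
    return 1.0 / (mean_t - t_min)

random.seed(1)
true_rate, dead_time = 2.0, 0.3
# By memorylessness, dwell times seen above the dead time are t_min + Exp(k):
data = [dead_time + random.expovariate(true_rate) for _ in range(5000)]
k_hat = mle_exponential_with_dead_time(data, dead_time)
naive = 1.0 / (sum(data) / len(data))   # ignores the dead time; biased low
```

The naive estimate underestimates the rate because the dead time inflates the mean dwell time, which is exactly the bias the PDF modification in MEMLET removes.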
Scientific LogAnalyzer: a web-based tool for analyses of server log files in psychological research.
Reips, Ulf-Dietrich; Stieger, Stefan
2004-05-01
Scientific LogAnalyzer is a platform-independent interactive Web service for the analysis of log files. Scientific LogAnalyzer offers several features not available in other log file analysis tools--for example, organizational criteria and computational algorithms suited to aid behavioral and social scientists. Scientific LogAnalyzer is highly flexible on the input side (unlimited types of log file formats), while strictly keeping a scientific output format. Features include (1) free definition of log file format, (2) searching and marking dependent on any combination of strings (necessary for identifying conditions in experiment data), (3) computation of response times, (4) detection of multiple sessions, (5) speedy analysis of large log files, (6) output in HTML and/or tab-delimited form, suitable for import into statistics software, and (7) a module for analyzing and visualizing drop-out. Several methodological features specifically needed in the analysis of data collected in Internet-based experiments have been implemented in the Web-based tool and are described in this article. A regression analysis with data from 44 log file analyses shows that the size of the log file and the domain name lookup are the two main factors determining the duration of an analysis. It is less than a minute for a standard experimental study with a 2 x 2 design, a dozen Web pages, and 48 participants (ca. 800 lines, including data from drop-outs). The current version of Scientific LogAnalyzer is freely available for small log files. Its Web address is http://genpsylab-logcrunsh.unizh.ch/. PMID:15354696
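Feature 3, response-time computation, can be sketched for a hypothetical "timestamp participant page" log format (Scientific LogAnalyzer's actual input formats are user-defined, so this layout is purely illustrative):

```python
def response_times(log_lines):
    """Per-participant response times (seconds between successive requests),
    parsed from a hypothetical 'epoch_seconds participant page' log format."""
    last_seen, deltas = {}, {}
    for line in log_lines:
        ts, who, _page = line.split()
        t = float(ts)
        if who in last_seen:
            deltas.setdefault(who, []).append(t - last_seen[who])
        last_seen[who] = t
    return deltas

log = [
    "100.0 p1 index.html",
    "103.5 p1 page2.html",
    "104.0 p2 index.html",
    "109.5 p1 page3.html",
]
print(response_times(log))  # -> {'p1': [3.5, 6.0]}
```

Participants with a single request (p2 here) yield no response times, which is how drop-out after the first page would appear in such data.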
Likelihood approaches for proportional likelihood ratio model with right-censored data.
Zhu, Hong
2014-06-30
Regression methods for survival data with right censoring have been extensively studied under semiparametric transformation models such as the Cox regression model and the proportional odds model. However, their practical application could be limited because of possible violation of model assumption or lack of ready interpretation for the regression coefficients in some cases. As an alternative, in this paper, the proportional likelihood ratio model introduced by Luo and Tsai is extended to flexibly model the relationship between survival outcome and covariates. This model has a natural connection with many important semiparametric models such as generalized linear model and density ratio model and is closely related to biased sampling problems. Compared with the semiparametric transformation model, the proportional likelihood ratio model is appealing and practical in many ways because of its model flexibility and quite direct clinical interpretation. We present two likelihood approaches for the estimation and inference on the target regression parameters under independent and dependent censoring assumptions. Based on a conditional likelihood approach using uncensored failure times, a numerically simple estimation procedure is developed by maximizing a pairwise pseudo-likelihood. We also develop a full likelihood approach, and the most efficient maximum likelihood estimator is obtained by a profile likelihood. Simulation studies are conducted to assess the finite-sample properties of the proposed estimators and compare the efficiency of the two likelihood approaches. An application to survival data for bone marrow transplantation patients of acute leukemia is provided to illustrate the proposed method and other approaches for handling non-proportionality. The relative merits of these methods are discussed in concluding remarks. PMID:24500821
Odic, Darko; Im, Hee Yeon; Eisinger, Robert; Ly, Ryan; Halberda, Justin
2016-06-01
A simple and popular psychophysical model--usually described as overlapping Gaussian tuning curves arranged along an ordered internal scale--is capable of accurately describing both human and nonhuman behavioral performance and neural coding in magnitude estimation, production, and reproduction tasks for most psychological dimensions (e.g., time, space, number, or brightness). This model traditionally includes two parameters that determine how a physical stimulus is transformed into a psychological magnitude: (1) an exponent that describes the compression or expansion of the physical signal into the relevant psychological scale (β), and (2) an estimate of the amount of inherent variability (often called internal noise) in the Gaussian activations along the psychological scale (σ). To date, linear slopes on log-log plots have traditionally been used to estimate β, and a completely separate method of averaging coefficients of variation has been used to estimate σ. We provide a respectful, yet critical, review of these traditional methods, and offer a tutorial on a maximum-likelihood estimation (MLE) and a Bayesian estimation method for estimating both β and σ [PsiMLE(β,σ)], coupled with free software that researchers can use to implement it without a background in MLE or Bayesian statistics (R-PsiMLE). We demonstrate the validity, reliability, efficiency, and flexibility of this method through a series of simulations and behavioral experiments, and find the new method to be superior to the traditional methods in all respects. PMID:25987306
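The traditional β estimate that the authors review, the slope on a log-log plot, can be sketched in a few lines (this is the baseline method the paper critiques, not the PsiMLE estimator itself):

```python
import math

def log_log_slope(stimuli, responses):
    """Traditional power-law exponent estimate: the least-squares slope of
    log(response) against log(stimulus)."""
    lx = [math.log(s) for s in stimuli]
    ly = [math.log(r) for r in responses]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    sxx = sum((x - mx) ** 2 for x in lx)
    sxy = sum((x - mx) * (y - my) for x, y in zip(lx, ly))
    return sxy / sxx

# Noise-free responses r = c * s**beta recover beta exactly:
stimuli = [1.0, 2.0, 4.0, 8.0, 16.0]
beta = 0.8
responses = [3.0 * s ** beta for s in stimuli]
print(round(log_log_slope(stimuli, responses), 6))  # -> 0.8
```

With internal noise present, this slope and a separately averaged coefficient of variation estimate β and σ independently, which is precisely the decoupling the joint MLE approach avoids.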
Predicting reservoir wettability via well logs
NASA Astrophysics Data System (ADS)
Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang
2016-06-01
Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and the piecewise power functions, capillary pressure curves are constructed from conventional logs and a nuclear magnetic resonance (NMR) log respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radius obtained from the two kinds of capillary pressure curve is calculated to reflect wettability, and a quantitative relationship between the ratio and reservoir wettability is then established. According to the low-permeability core sample capillary pressure curve, NMR T2 spectrum and contact angle experimental data from the bottom of the Upper Triassic reservoirs in western Ordos Basin, China, two kinds of constructed capillary pressure curve models and a predictive wettability model are calibrated. The wettability model is verified through the Amott wettability index and the saturation exponent from resistivity measurement, and their determined wettability levels are comparable, indicating that the proposed model is quite reliable. In addition, the model shows good results when applied in the field study. Thus, the model proposed in this paper for quantitatively predicting reservoir wettability provides an effective tool for formation evaluation, field development and the improvement of oil recovery.
Well Logging and Logging Analysis of UHP metamorphic Rocks in CCSD Main Hole (0-2000m)
NASA Astrophysics Data System (ADS)
Pan, H.; Niu, Y.; Wang, W.; Zhu, L.; Xu, D.; Wu, H.; Li, S.; Luo, M.
2004-12-01
CCSD logging engineering brings together many modern high technologies and employs various advanced logging tools to survey the borehole wall continuously, yielding physical, chemical, geometrical, and other in-situ information along the borehole profile. Well logging is therefore one of the most important and pivotal technologies in the CCSD project. The main logging methods in CCSD-MH (0-2000 m) are laterolog (Rd, Rs), gamma ray (GR), natural gamma spectrometry (U, Th, K), density (DEN), photoelectric cross-section index (Pe), compensated neutron (CNL), multipole array acoustic (Vp, Vs, Vst), simultaneous acoustic-resistivity imaging (Star-II), temperature (T), magnetic susceptibility (MS), three-component borehole magnetic, and redox potential logs. The various metamorphic rocks can be classified from the logging curves, and their physical parameters can be acquired by analyzing the log responses of the different metamorphic rocks and by statistics. Logging cross-plots allow study of the clustering of the metamorphic rocks' physical properties. Five lithologic segments can be distinguished from the logging curves. The GR, Th, U, and K values of segment 1 are lower than those of segments 3, 4, and 5 and higher than those of segment 2, while its DEN and Pe values are higher than those of segments 3, 4, and 5. The main rocks in segments 1-5 are eclogites, serpentinites, paragneiss, orthogneiss, and eclogites (containing silicon and muscovite), respectively. Eclogite generally contains rutile, silicon, muscovite, and other minerals that respond clearly on the log curves, and rutile-, ilmenite-, and pyrite-mineralized layers occur. Using the DEN, Pe, and susceptibility log values, these mineralized layers can be well demarcated; on a rutile-mineralized layer, for example, the log responses show distinctly high density and Pe. The key datum for the synthetic seismic record is wave impedance. In this paper, the AC and DEN curves are used to calculate the
cosmoabc: Likelihood-free inference for cosmology
NASA Astrophysics Data System (ADS)
Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.
2015-05-01
Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
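The core ABC loop that cosmoabc elaborates can be sketched as plain rejection sampling (cosmoabc itself uses a Population Monte Carlo variant with adaptive importance sampling; all names and numbers below are illustrative):

```python
import random

def abc_rejection(observed_mean, prior_draw, simulate, eps, n_trials):
    """Minimal ABC rejection sampler: draw theta from the prior, simulate mock
    data, and keep theta when the summary-statistic distance is below eps."""
    accepted = []
    for _ in range(n_trials):
        theta = prior_draw()
        mock = simulate(theta)
        if abs(sum(mock) / len(mock) - observed_mean) < eps:
            accepted.append(theta)
    return accepted

random.seed(2)
true_mu = 1.5
data = [random.gauss(true_mu, 1.0) for _ in range(100)]
obs_mean = sum(data) / len(data)
post = abc_rejection(
    observed_mean=obs_mean,
    prior_draw=lambda: random.uniform(-5.0, 5.0),   # flat prior on the parameter
    simulate=lambda mu: [random.gauss(mu, 1.0) for _ in range(100)],
    eps=0.2,
    n_trials=5000,
)
post_mean = sum(post) / len(post)   # approximate posterior mean, no likelihood used
```

Only the forward simulator and a distance between summary statistics are needed, which is what makes the approach viable when the likelihood is unknown or too expensive.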
Spectral likelihood expansions for Bayesian inference
NASA Astrophysics Data System (ADS)
Nagel, Joseph B.; Sudret, Bruno
2016-03-01
A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
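The basic construction, projecting the likelihood onto orthogonal polynomials so that the evidence and moments follow from the coefficients, can be sketched with Legendre polynomials and a uniform reference density on [-1, 1]; a toy Gaussian-shaped likelihood stands in for a real model:

```python
import math

def legendre(n, x):
    """Values of the Legendre polynomials P_0..P_n at x (three-term recurrence)."""
    p = [1.0, x]
    for k in range(1, n):
        p.append(((2 * k + 1) * x * p[k] - k * p[k - 1]) / (k + 1))
    return p[:n + 1]

def spectral_coeffs(likelihood, degree, n_grid=2000):
    """Projection coefficients c_k = (2k+1)/2 * integral of L(x)*P_k(x) over
    [-1, 1], via the trapezoid rule, so that L(x) ~ sum_k c_k * P_k(x)."""
    h = 2.0 / n_grid
    c = [0.0] * (degree + 1)
    for i in range(n_grid + 1):
        x = -1.0 + i * h
        wgt = h * (0.5 if i in (0, n_grid) else 1.0)   # trapezoid weights
        Lx = likelihood(x)
        P = legendre(degree, x)
        for k in range(degree + 1):
            c[k] += wgt * (2 * k + 1) / 2.0 * Lx * P[k]
    return c

lik = lambda x: math.exp(-0.5 * ((x - 0.3) / 0.4) ** 2)  # toy likelihood
c = spectral_coeffs(lik, degree=15)
# With a uniform reference density 1/2 on [-1, 1], the model evidence follows
# semi-analytically from the first coefficient alone: Z = c[0].
recon = sum(ck * pk for ck, pk in zip(c, legendre(15, 0.3)))  # series value at 0.3
```

No sampling is involved: the coefficients come from a linear (quadrature or least-squares) operation, which is the point of the spectral formulation.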
Stepwise Signal Extraction via Marginal Likelihood
Du, Chao; Kao, Chu-Lan Michael
2015-01-01
This paper studies the estimation of stepwise signals. To determine the number and locations of change-points of the stepwise signal, we formulate a maximum marginal likelihood estimator, which can be computed with a quadratic cost using dynamic programming. We carry out an extensive investigation on the choice of the prior distribution and study the asymptotic properties of the maximum marginal likelihood estimator. We propose to treat each possible set of change-points equally and adopt an empirical Bayes approach to specify the prior distribution of segment parameters. A detailed simulation study is performed to compare the effectiveness of this method with other existing methods. We demonstrate our method on single-molecule enzyme reaction data and on DNA array CGH data. Our study shows that this method is applicable to a wide range of models and offers appealing results in practice. PMID:27212739
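The dynamic program behind such estimators is short. The sketch below substitutes a penalized least-squares segment cost for the paper's marginal likelihood criterion (a simplification, not the authors' method) but shows the quadratic-cost recursion and the change-point recovery.

```python
def best_segmentation(y, penalty):
    """Dynamic-programming segmentation of a stepwise signal.

    Each segment is scored by its sum of squared errors around the segment
    mean plus a per-segment penalty (a stand-in for the marginal likelihood
    score). Runs in O(n^2), matching the quadratic cost noted in the abstract.
    """
    n = len(y)
    # Prefix sums give O(1) segment-cost queries.
    s1 = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, v in enumerate(y):
        s1[i + 1] = s1[i] + v
        s2[i + 1] = s2[i] + v * v

    def sse(i, j):  # squared error of segment y[i:j] about its mean
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / (j - i)

    best = [0.0] + [float("inf")] * n
    back = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(j):
            cost = best[i] + sse(i, j) + penalty
            if cost < best[j]:
                best[j], back[j] = cost, i

    # Trace back to recover the change-point locations.
    cps, j = [], n
    while j > 0:
        cps.append(back[j])
        j = back[j]
    return sorted(cps)[1:]  # drop the leading index 0

signal = [0.0] * 20 + [5.0] * 20 + [2.0] * 20
print(best_segmentation(signal, penalty=1.0))  # -> [20, 40]
```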
Lienkaemper, James J.; Williams, Patrick L.; Sickler, Robert R.; Fumal, Thomas E.
2005-01-01
This publication makes available a detailed trench log (sheets 1 and 2) of a 110-m trench we excavated in 2004 across a tectonic sag pond in the Hayward fault zone. Also included are revised stratigraphic unit descriptions from this fifth field season of subsurface investigation of the Hayward fault at Tyson's Lagoon (Tule Pond). Preliminary findings based on fieldwork done in 2000 have been published (Lienkaemper and others: data archive, 2002a; report, 2002b), as were the logs and data for 2001-2003 (Lienkaemper and others, 2003, L03). A continuous exposure of the geologic section across the entire pond made in 2004 (Fig. 1, 04A) has revealed some critical miscorrelations of units made in the original on-line version of L03; hence users of these earlier trench data should only use the 2005 revised version 2.0 of L03 for correlation purposes. Lienkaemper, Williams, and Sickler interpreted the geology and logged the trenches. Fumal did most of the trench photography. The Hayward fault is recognized to be among the most hazardous in the United States (Working Group on California Earthquake Probabilities, 2003). Establishing a chronology of prehistoric or paleoearthquakes is of immediate use in resolving the likelihood of future large earthquakes on the Hayward fault. This document makes available geologic evidence for historical and prehistoric surface-rupturing earthquakes preserved at the site. A second, formal report on our conclusions based on these data is in preparation.
Likelihood-Based Climate Model Evaluation
NASA Technical Reports Server (NTRS)
Braverman, Amy; Cressie, Noel; Teixeira, Joao
2012-01-01
Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. The approach quantifies the likelihood that a (summary statistic computed from a) set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, one can go further and obtain a posterior distribution of models given the observations.
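That final step is Bayes' rule over a finite model set. A minimal sketch with made-up likelihood values (the model names and numbers are illustrative only):

```python
# Hypothetical likelihoods of the observed summary statistic under two
# climate models, and a uniform prior over the models.
likelihoods = {"model_A": 0.020, "model_B": 0.005}
prior = {"model_A": 0.5, "model_B": 0.5}

# Bayes' rule: posterior weight of each model is proportional to
# likelihood * prior, normalized by the total evidence.
unnormalized = {m: likelihoods[m] * prior[m] for m in prior}
evidence = sum(unnormalized.values())
posterior = {m: w / evidence for m, w in unnormalized.items()}
print(posterior)  # model_A is favored 4:1
```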
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2013-09-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~ 0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~ 80% of the total sediment load was transported during rare but high magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRFs outperform traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2014-03-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
Evaluating network models: A likelihood analysis
NASA Astrophysics Data System (ADS)
Wang, Wen-Qiang; Zhang, Qian-Ming; Zhou, Tao
2012-04-01
Many models are put forward to mimic the evolution of real networked systems. A well-accepted way to judge their validity is to compare the modeling results with real networks subject to several structural features. Even for a specific real network, we cannot fairly evaluate the goodness of different models, since there are too many structural features and no criterion for selecting and weighting them. Motivated by the studies on link prediction algorithms, we propose a unified method to evaluate network models via the comparison of the likelihoods of the currently observed network driven by different models, with the assumption that the higher the likelihood is, the more accurate the model is. We test our method on the real Internet at the Autonomous System (AS) level, and the results suggest that the Generalized Linear Preferential (GLP) model outperforms the Tel Aviv Network Generator (Tang), while both models outperform the Barabási-Albert (BA) and Erdős-Rényi (ER) models. Our method can be further applied in determining the optimal values of parameters that correspond to the maximal likelihood. The experiment indicates that the parameters obtained by our method can better capture the characteristics of newly added nodes and links in the AS-level Internet than the original methods in the literature.
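For the simplest model in such a comparison, the likelihood of the observed network is available in closed form. A hedged sketch for Erdős-Rényi (the node and edge counts below are invented for illustration):

```python
import math

def er_log_likelihood(n_nodes, n_edges, p):
    """Log-likelihood of an observed simple graph under the Erdos-Renyi
    G(n, p) model: each of the n(n-1)/2 node pairs is an independent
    Bernoulli(p) edge."""
    n_pairs = n_nodes * (n_nodes - 1) // 2
    return n_edges * math.log(p) + (n_pairs - n_edges) * math.log(1.0 - p)

# Hypothetical observed network: 100 nodes, 300 edges.
n, m = 100, 300
p_mle = m / (n * (n - 1) / 2)            # MLE of the edge probability
ll_mle = er_log_likelihood(n, m, p_mle)
ll_wrong = er_log_likelihood(n, m, 0.5)  # a badly mis-specified parameter

# Higher likelihood = better fit, the ranking principle used in the paper.
print(ll_mle > ll_wrong)  # True
```

Richer generators (BA, GLP) require computing the likelihood of the observed growth history rather than a closed form, but the comparison principle is the same.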
Maximum likelihood continuity mapping for fraud detection
Hogden, J.
1997-05-01
The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction--important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
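A continuity map itself is beyond a short sketch, but the N-gram baseline it is compared against fits in a few lines. The sketch below (toy procedure codes, invented data) scores sequences with a smoothed bigram model, flagging low-likelihood histories as anomalous:

```python
import math
from collections import Counter

# Toy training set of "medical histories" (sequences of procedure codes).
training = [
    ["exam", "xray", "cast"],
    ["exam", "xray", "cast"],
    ["exam", "blood", "exam"],
]

# Bigram counts with add-one smoothing over the observed vocabulary.
vocab = {tok for seq in training for tok in seq}
bigrams = Counter((a, b) for seq in training for a, b in zip(seq, seq[1:]))
unigrams = Counter(tok for seq in training for tok in seq[:-1])

def log_likelihood(seq):
    """Smoothed bigram log-likelihood of a sequence; unusually low values
    flag anomalous histories worth a closer look."""
    ll = 0.0
    for a, b in zip(seq, seq[1:]):
        ll += math.log((bigrams[(a, b)] + 1) / (unigrams[a] + len(vocab)))
    return ll

typical = log_likelihood(["exam", "xray", "cast"])
anomalous = log_likelihood(["cast", "cast", "cast"])
print(typical > anomalous)  # True
```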
Likelihood methods for point processes with refractoriness.
Citi, Luca; Ba, Demba; Brown, Emery N; Barbieri, Riccardo
2014-02-01
Likelihood-based encoding models founded on point processes have received significant attention in the literature because of their ability to reveal the information encoded by spiking neural populations. We propose an approximation to the likelihood of a point-process model of neurons that holds under assumptions about the continuous time process that are physiologically reasonable for neural spike trains: the presence of a refractory period, the predictability of the conditional intensity function, and its integrability. These are properties that apply to a large class of point processes arising in applications other than neuroscience. The proposed approach has several advantages over conventional ones. In particular, one can use standard fitting procedures for generalized linear models based on iteratively reweighted least squares while improving the accuracy of the approximation to the likelihood and reducing bias in the estimation of the parameters of the underlying continuous-time model. As a result, the proposed approach can use a larger bin size to achieve the same accuracy as conventional approaches would with a smaller bin size. This is particularly important when analyzing neural data with high mean and instantaneous firing rates. We demonstrate these claims on simulated and real neural spiking activity. By allowing a substantive increase in the required bin size, our algorithm has the potential to lower the barrier to the use of point-process methods in an increasing number of applications. PMID:24206384
Likelihood-based modification of experimental crystal structure electron density maps
Terwilliger, Thomas C.
2005-04-16
A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F.sub.h } is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F.sub.h.sup.OBS } if structure factor set {F.sub.h } was correct, and (2) the likelihood that an electron density map resulting from {F.sub.h } is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F.sub.h } is then adjusted to maximize the likelihood of {F.sub.h } for the experimental crystal structure. An improved electron density map is constructed with the maximized structure factors.
Method for induced polarization logging
Vinegar, H.J.; Waxman, M.H.
1987-04-14
A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R_0 to a survey current electrode associated with the tool.
Applying exclusion likelihoods from LHC searches to extended Higgs sectors
NASA Astrophysics Data System (ADS)
Bechtle, Philip; Heinemeyer, Sven; Stål, Oscar; Stefaniak, Tim; Weiglein, Georg
2015-09-01
LHC searches for non-standard Higgs bosons decaying into tau lepton pairs constitute a sensitive experimental probe for physics beyond the Standard Model (BSM), such as supersymmetry (SUSY). Recently, the limits obtained from these searches have been presented by the CMS collaboration in a nearly model-independent fashion - as a narrow resonance model - based on the full dataset. In addition to publishing an exclusion limit, the full likelihood information for the narrow resonance model has been released. This provides valuable information that can be incorporated into global BSM fits. We present a simple algorithm that maps an arbitrary model with multiple neutral Higgs bosons onto the narrow resonance model and derives the corresponding value for the exclusion likelihood from the CMS search. This procedure has been implemented into the public computer code HiggsBounds (version 4.2.0 and higher). We validate our implementation by cross-checking against the official CMS exclusion contours in three Higgs benchmark scenarios in the Minimal Supersymmetric Standard Model (MSSM), and find very good agreement. Going beyond validation, we discuss the combined constraints of the search and the rate measurements of the SM-like Higgs in a recently proposed MSSM benchmark scenario, where the lightest Higgs boson obtains SM-like couplings independently of the decoupling of the heavier Higgs states. Technical details for how to access the likelihood information within HiggsBounds are given in the appendix. The program is available at http://higgsbounds.hepforge.org.
Maximal likelihood correspondence estimation for face recognition across pose.
Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang
2014-10-01
Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems in previous image matching-based correspondence learning methods: 1) failure to fully exploit face-specific structure information in correspondence estimation and 2) failure to learn personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on the maximal likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using a linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the usage of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in a complex wild environment, i.e., the Labeled Faces in the Wild database. PMID:25163062
[Log-linear model used in the hybrid design of case-parents triad/control-mother dyad].
Peng, W J; Zhang, H; Li, Y X; Li, C Y; Yan, W R
2016-06-01
This study introduced the application of a log-linear model in the hybrid design of case-parents triad/control-mother dyad. Data related to the association between cleft lip with palate (CLP) and methylenetetrahydrofolate reductase (MTHFR) gene A1298C diversity were analyzed. A log-linear model based on likelihood ratio tests (LRTs) was used to analyze the relationships between mother and offspring genotypes and CLP. Our data showed that children of mothers carrying the CC genotype presented a lower risk of CLP compared with children of mothers carrying the AA genotype, with S2=0.45 (95%CI: 0.26-0.79). Offspring carrying the AC genotype presented a lower risk of CLP compared with offspring carrying the AA genotype, with R1=0.69 (95%CI: 0.48-0.97). No other types of relationships were found. The power of the hybrid design was greater than that of the case-parents study (0.86>0.78). The MTHFR A1298C polymorphism seemed to have played an important role in the etiology of cleft lip with palate. Data from the hybrid design and the log-linear model could help researchers explore the effects of genotypes from both mothers and offspring. This study design presents stronger power than regular case-parents studies and is thus suitable for studies on the etiology of diseases in early life, such as birth defects. PMID:27346122
Using Web Logs in the Science Classroom
ERIC Educational Resources Information Center
Duplichan, Staycle C.
2009-01-01
As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…
Outdoor Education Student Log Book.
ERIC Educational Resources Information Center
Garbutt, Barbara; And Others.
A student log book for outdoor education was developed to aid Oakland County (Michigan) teachers and supervisors of outdoor education in preparing student campers for their role and responsibilities in the total program. A sample letter to sixth graders explains the purpose of the booklet. General camp rules (10) are presented, followed by 6 woods…
ERIC Educational Resources Information Center
Miles, Donna
2001-01-01
In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…
Dual spectra well logging system
Nussbaum, T.W.
1982-09-07
A dual spectra well logging system includes a well logging tool which is adapted to pass through a bore hole in an earth formation. The well logging tool includes at least two sensors which sense at least one condition of the earth formation and provide corresponding pulse signals. A circuit connected to the sensors provides a combined pulse signal wherein the pulses of the pulse signal from one sensor have one polarity and the pulses of the pulse signal from the other sensor have the opposite polarity. A circuit applies the combined pulse signal to a well logging cable which conducts the combined pulse signal to the surface of the earth formation. Surface apparatus includes a network connected to the cable which provides control signals in accordance with the polarity of the pulses in the combined pulse signal. A network connected to the cable inverts the combined pulse signal and provides a combined pulse signal and an inverted combined pulse signal. A first switching network receiving the combined pulse signal passes the pulses derived from the pulses of the one polarity in accordance with the control signals to provide a first pulse signal, while a second switching network receiving the inverted combined pulse signal passes the pulses derived from the pulses of the opposite polarity in accordance with the control signals to provide a second pulse signal. An output network processes the two pulse signals to provide an indication of the earth's condition in accordance with the processed pulse signals.
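The polarity-multiplexing scheme is easy to mimic in software. A toy sketch (not the patented circuitry): encode sensor A's pulses as positive samples and sensor B's as negative on a shared channel, then separate them by sign and re-invert at the surface:

```python
# Two sensor pulse streams share one "cable": sensor A as positive pulses,
# sensor B as negative pulses, interleaved in time.
combined = [3, -2, 5, -1, 4, -6]

# Surface apparatus: route each pulse by its polarity, inverting B's pulses
# back to their original magnitudes.
sensor_a = [p for p in combined if p > 0]
sensor_b = [-p for p in combined if p < 0]

print(sensor_a, sensor_b)  # [3, 5, 4] [2, 1, 6]
```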
Statistical log analysis made practical
Mitchell, W.K.; Nelson, R.J.
1991-06-01
This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.
2015-01-01
We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response are strongly favored by log-likelihood. Estimated population sex ratio is strongly influenced by sex structure in model parameters illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
West Pembina Nisku reef log-core correlations
McFadzean, T.B.
1983-01-01
A study was initiated to correlate log parameters and core data from West Pembina Nisku (D-2) Pinnacle reefs. The primary objective was to derive basic input data for making volumetric estimates of reserves and for providing initial input data for doing enhanced recovery model studies. A secondary objective was to determine if a set of log analysis parameters could be derived that would work universally in the many pinnacle reefs of the West Pembina area. The results of the study indicate that porosity log response in the West Pembina Nisku reefs deviates a fair amount from the standards used in log analysis chart books. A multilinear regression formula for determining porosity was derived which worked well for 18 wells studied, on which there was both log and core data. A brief description of the Nisku reef geology, a description of the methods used in doing the study, and a graphic presentation of the results are included.
A note on the asymptotic distribution of likelihood ratio tests to test variance components.
Visscher, Peter M
2006-08-01
When using maximum likelihood methods to estimate genetic and environmental components of (co)variance, it is common to test hypotheses using likelihood ratio tests, since such tests have desirable asymptotic properties. In particular, the standard likelihood ratio test statistic is assumed asymptotically to follow a chi2 distribution with degrees of freedom equal to the number of parameters tested. Using the relationship between least squares and maximum likelihood estimators for balanced designs, it is shown why the asymptotic distribution of the likelihood ratio test for variance components does not follow a chi2 distribution with degrees of freedom equal to the number of parameters tested when the null hypothesis is true. Instead, the distribution of the likelihood ratio test is a mixture of chi2 distributions with different degrees of freedom. Implications for testing variance components in twin designs and for quantitative trait loci mapping are discussed. The appropriate distribution of the likelihood ratio test statistic should be used in hypothesis testing and model selection. PMID:16899155
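The practical consequence is a halved p-value when testing a single variance component. A sketch, assuming the standard 50:50 mixture of chi2(0) (a point mass at zero) and chi2(1) at the boundary:

```python
import math

def lrt_pvalue_boundary(lr_stat):
    """P-value for testing a single variance component against zero.

    Under H0 the parameter sits on the boundary of the parameter space, so
    the LRT statistic follows a 50:50 mixture of chi2(0) (point mass at 0)
    and chi2(1), not a plain chi2(1).
    """
    if lr_stat <= 0.0:
        return 1.0
    # P(chi2 with 1 df > x) = erfc(sqrt(x / 2))
    return 0.5 * math.erfc(math.sqrt(lr_stat / 2.0))

def lrt_pvalue_naive(lr_stat):
    """Naive p-value assuming a plain chi2(1) reference distribution."""
    return math.erfc(math.sqrt(lr_stat / 2.0)) if lr_stat > 0.0 else 1.0

stat = 3.0
print(lrt_pvalue_boundary(stat), lrt_pvalue_naive(stat))
# The mixture p-value is exactly half the naive one, so ignoring the
# boundary makes the test conservative and costs power.
```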
Model-free linkage analysis using likelihoods
Curtis, D.; Sham, P.C.
1995-09-01
Misspecification of transmission model parameters can produce artifactually high lod scores at small recombination fractions and in multipoint analysis. To avoid this problem, we have tried to devise a test that aims to detect a genetic effect at a particular locus, rather than attempting to estimate the map position of a locus with specified effect. Maximizing likelihoods over transmission model parameters, as well as linkage parameters, can produce seriously biased parameter estimates and so yield tests that lack power for the detection of linkage. However, constraining the transmission model parameters to produce the correct population prevalence largely avoids this problem. For computational convenience, we recommend that the likelihoods under linkage and nonlinkage are independently maximized over a limited set of transmission models, ranging from Mendelian dominant to null effect and from null effect to Mendelian recessive. In order to test for a genetic effect at a given map position, the likelihood under linkage is maximized over admixture, the proportion of families linked. Application to simulated data for a wide range of transmission models in both affected sib pairs and pedigrees demonstrates that the new method is well behaved under the null hypothesis and provides a powerful test for linkage when it is present. This test requires no specification of transmission model parameters, apart from an approximate estimate of the population prevalence. It can be applied equally to sib pairs and pedigrees, and, since it does not diminish the lod score at test positions very close to a marker, it is suitable for application to multipoint data. 24 refs., 1 fig., 4 tabs.
Estimating the Likelihood of Extreme Seismogenic Tsunamis
NASA Astrophysics Data System (ADS)
Geist, E. L.
2011-12-01
Because of high levels of destruction to coastal communities and critical facilities from recent tsunamis, estimating the likelihood of extreme seismogenic tsunamis has gained increased attention. Seismogenic tsunami generating capacity is directly related to the scalar seismic moment of the earthquake. As such, earthquake size distributions and recurrence can inform the likelihood of tsunami occurrence. The probability of extreme tsunamis is dependent on how the right-hand tail of the earthquake size distribution is specified. As evidenced by the 2004 Sumatra-Andaman and 2011 Tohoku earthquakes, it is likely that there is insufficient historical information to estimate the maximum earthquake magnitude (Mmax) for any specific subduction zone. Mmax may in fact not be a useful concept for subduction zones of significant length. Earthquake size distributions with a soft corner moment appear more consistent with global observations. Estimating the likelihood of extreme local tsunami runup is complicated by the fact that there is significant uncertainty in the scaling relationship between seismic moment and maximum local tsunami runup. This uncertainty arises from variations in source parameters specific to tsunami generation and the near-shore hydrodynamic response. The primary source effect is how slip is distributed along the fault relative to the overlying water depth. For high slip beneath deep water, shoaling amplification of the tsunami increases substantially according to Green's Law, compared to an equivalent amount of slip beneath shallow water. Both stochastic slip models and dynamic rupture models of tsunamigenic earthquakes are explored in a probabilistic context. The nearshore hydrodynamic response includes attenuating mechanisms, such as wave breaking, and amplifying mechanisms, such as constructive interference of trapped and non-trapped modes. Probabilistic estimates of extreme tsunamis are therefore site specific, as indicated by significant variations
Modelling autoimmune rheumatic disease: a likelihood rationale.
Ulvestad, E
2003-07-01
Immunoglobulins (Igs) and autoantibodies are commonly tested in sera from patients with suspected rheumatic disease. To evaluate the clinical utility of the tests in combination, we investigated sera from 351 patients with autoimmune rheumatic disease (ARD) (rheumatoid arthritis (RA), systemic lupus erythematosus (SLE) and Sjögren's syndrome (SS)) and 96 patients with nonautoimmune rheumatic disease (NAD) (fibromyalgia, osteoarthritis, etc.). Antinuclear antibodies (ANA), rheumatoid factor (RF), antibodies against DNA and extractable nuclear antigens (anti-ENA), IgG, IgA and IgM were measured for all patients. Logistic regression analysis of test results was used to calculate each patient's probability for belonging to the ARD or NAD group as well as likelihood ratios for disease. Test accuracy was investigated using receiver-operating characteristic (ROC) plots and nonparametric ROC analysis. Neither concentrations of IgG, IgA, IgM, anti-DNA nor anti-ENA gave a significant effect on diagnostic outcome. Probabilities for disease and likelihood ratios calculated by combining RF and ANA performed significantly better at predicting ARD than utilization of the diagnostic tests in isolation (P < 0.001). At a cut-off level of P = 0.73 and likelihood ratio = 1, the logistic model gave a specificity of 93% and a sensitivity of 75% for the differentiation between ARD and NAD. When compared at the same level of specificity, ANA gave a sensitivity of 37% and RF gave a sensitivity of 56.6%. Dichotomizing ANA and RF as positive or negative did not reduce the performance characteristics of the model. Combining results obtained from serological analysis of ANA and RF according to this model will increase the diagnostic utility of the tests in rheumatological practice. PMID:12828565
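The conversion from a fitted logistic model to a diagnostic likelihood ratio can be sketched as follows; the coefficients and prevalence below are invented for illustration, not the paper's estimates:

```python
import math

# Hypothetical coefficients of a fitted logistic model P(ARD | ANA, RF);
# the published model's actual estimates are not reproduced here.
B0, B_ANA, B_RF = -1.5, 1.8, 2.1

def prob_ard(ana_pos, rf_pos):
    """Posterior probability of ARD from the logistic model (inputs 0/1)."""
    z = B0 + B_ANA * ana_pos + B_RF * rf_pos
    return 1.0 / (1.0 + math.exp(-z))

def likelihood_ratio(ana_pos, rf_pos, prevalence):
    """Diagnostic likelihood ratio: posterior odds divided by pre-test odds."""
    p = prob_ard(ana_pos, rf_pos)
    posterior_odds = p / (1.0 - p)
    pretest_odds = prevalence / (1.0 - prevalence)
    return posterior_odds / pretest_odds

# A patient positive for both ANA and RF, in a clinic where 40% of
# referred patients have ARD:
print(round(likelihood_ratio(1, 1, 0.40), 2))  # 16.53
```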
Intelligence's likelihood and evolutionary time frame
NASA Astrophysics Data System (ADS)
Bogonovich, Marc
2011-04-01
This paper outlines hypotheses relevant to the evolution of intelligent life and encephalization in the Phanerozoic. If general principles are inferable from patterns of Earth life, implications could be drawn for astrobiology. Many of the outlined hypotheses, relevant data, and associated evolutionary and ecological theory are not frequently cited in astrobiological journals. Thus opportunity exists to evaluate reviewed hypotheses with an astrobiological perspective. A quantitative method is presented for testing one of the reviewed hypotheses (hypothesis i; the diffusion hypothesis). Questions are presented throughout, which illustrate that the question of intelligent life's likelihood can be expressed as multiple, broadly ranging, more tractable questions.
Score-based likelihood ratios for handwriting evidence.
Hepler, Amanda B; Saunders, Christopher P; Davis, Linda J; Buscaglia, JoAnn
2012-06-10
Score-based approaches for computing forensic likelihood ratios are becoming more prevalent in the forensic literature. When two items of evidential value are entangled via a score function, several nuances arise when attempting to model the score behavior under the competing source-level propositions. Specific assumptions must be made in order to appropriately model the numerator and denominator probability distributions. This process is fairly straightforward for the numerator of the score-based likelihood ratio, entailing the generation of a database of scores obtained by pairing items of evidence from the same source. However, this process presents ambiguities for the denominator database generation - in particular, how best to generate a database of scores between two items of different sources. Many alternatives have appeared in the literature, three of which we will consider in detail. They differ in their approach to generating denominator databases, by pairing (1) the item of known source with randomly selected items from a relevant database; (2) the item of unknown source with randomly generated items from a relevant database; or (3) two randomly generated items. When the two items differ in type, perhaps one having higher information content, these three alternatives can produce very different denominator databases. While each of these alternatives has appeared in the literature, the decision of how to generate the denominator database is often made without calling attention to the subjective nature of this process. In this paper, we compare each of the three methods (and the resulting score-based likelihood ratios), which can be thought of as three distinct interpretations of the denominator proposition. Our goal in performing these comparisons is to illustrate the effect that subtle modifications of these propositions can have on inferences drawn from the evidence evaluation procedure. The study was performed using a data set composed of cursive writing
How much to trust the senses: likelihood learning.
Sato, Yoshiyuki; Kording, Konrad P
2014-01-01
Our brain often needs to estimate unknown variables from imperfect information. Our knowledge about the statistical distributions of quantities in our environment (called priors) and currently available information from sensory inputs (called likelihood) are the basis of all Bayesian models of perception and action. While we know that priors are learned, most studies of prior-likelihood integration simply assume that subjects know about the likelihood. However, as the quality of sensory inputs changes over time, we also need to learn about new likelihoods. Here, we show that human subjects readily learn the distribution of visual cues (likelihood function) in a way that can be predicted by models of statistically optimal learning. Using a likelihood that depended on color context, we found that a learned likelihood generalized to new priors. Thus, we conclude that subjects learn about likelihood. PMID:25398975
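The prior-likelihood integration underlying such models has a closed form in the Gaussian case: the posterior weights each source by its reliability (inverse variance). A minimal sketch with hypothetical numbers:

```python
def combine(prior_mean, prior_var, like_mean, like_var):
    """Bayes-optimal fusion of a Gaussian prior over a stimulus with a
    Gaussian likelihood from a sensory cue: the posterior mean is a
    reliability-weighted average, and the posterior variance is smaller
    than either input variance."""
    w = (1 / like_var) / (1 / like_var + 1 / prior_var)
    post_mean = w * like_mean + (1 - w) * prior_mean
    post_var = 1 / (1 / like_var + 1 / prior_var)
    return post_mean, post_var

# Hypothetical numbers: a reliable cue (small variance) dominates the prior.
print(combine(prior_mean=0.0, prior_var=4.0, like_mean=2.0, like_var=1.0))
```

A less reliable cue (larger `like_var`) shifts the posterior mean back toward the prior, which is exactly the behavior a learned likelihood would modulate.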
Maximum likelihood decoding of Reed Solomon Codes
Sudan, M.
1996-12-31
We present a randomized algorithm which takes as input n distinct points (x_i, y_i), i = 1, ..., n, from F × F (where F is a field) and integer parameters t and d, and returns a list of all univariate polynomials f over F in the variable x of degree at most d which agree with the given set of points in at least t places (i.e., y_i = f(x_i) for at least t values of i), provided t = Ω(√(nd)). The running time is bounded by a polynomial in n. This immediately provides a maximum likelihood decoding algorithm for Reed Solomon Codes, which works in a setting with a larger number of errors than any previously known algorithm. To the best of our knowledge, this is the first efficient (i.e., polynomial time bounded) algorithm which provides some maximum likelihood decoding for any efficient (i.e., constant or even polynomial rate) code.
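The decoding criterion (return every degree-at-most-d polynomial agreeing with at least t points) can be made concrete with a brute-force search over a small prime field. This is emphatically not Sudan's polynomial-time algorithm, just a sketch of what the output list contains:

```python
from itertools import product

def list_decode(points, p, d, t):
    """Brute-force list decoding over GF(p): return all polynomials of
    degree <= d (as coefficient tuples, low order first) agreeing with at
    least t of the (x, y) points.  Exponential in d; Sudan's algorithm
    achieves the same output list in polynomial time."""
    out = []
    for coeffs in product(range(p), repeat=d + 1):
        agree = sum(1 for x, y in points
                    if sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p == y)
        if agree >= t:
            out.append(coeffs)
    return out

# f(x) = 1 + 2x over GF(5); the point at x = 3 is corrupted (true value 2).
pts = [(0, 1), (1, 3), (2, 0), (3, 4)]
print(list_decode(pts, p=5, d=1, t=3))   # [(1, 2)], i.e. f(x) = 1 + 2x
```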
Sensor registration using airlanes: maximum likelihood solution
NASA Astrophysics Data System (ADS)
Ong, Hwa-Tung
2004-01-01
In this contribution, the maximum likelihood estimation of sensor registration parameters, such as range, azimuth and elevation biases in radar measurements, using airlane information is proposed and studied. The motivation for using airlane information for sensor registration is that it is freely available as a source of reference and it provides an alternative to conventional techniques that rely on synchronised and correctly associated measurements from two or more sensors. In the paper, the problem is first formulated in terms of a measurement model that is a nonlinear function of the unknown target state and sensor parameters, plus sensor noise. A probabilistic model of the target state is developed based on airlane information. The maximum likelihood and also maximum a posteriori solutions are given. The Cramer-Rao lower bound is derived and simulation results are presented for the case of estimating the biases in radar range, azimuth and elevation measurements. The accuracy of the proposed method is compared against the Cramer-Rao lower bound and that of an existing two-sensor alignment method. It is concluded that sensor registration using airlane information is a feasible alternative to existing techniques.
Sensor registration using airlanes: maximum likelihood solution
NASA Astrophysics Data System (ADS)
Ong, Hwa-Tung
2003-12-01
In this contribution, the maximum likelihood estimation of sensor registration parameters, such as range, azimuth and elevation biases in radar measurements, using airlane information is proposed and studied. The motivation for using airlane information for sensor registration is that it is freely available as a source of reference and it provides an alternative to conventional techniques that rely on synchronised and correctly associated measurements from two or more sensors. In the paper, the problem is first formulated in terms of a measurement model that is a nonlinear function of the unknown target state and sensor parameters, plus sensor noise. A probabilistic model of the target state is developed based on airlane information. The maximum likelihood and also maximum a posteriori solutions are given. The Cramer-Rao lower bound is derived and simulation results are presented for the case of estimating the biases in radar range, azimuth and elevation measurements. The accuracy of the proposed method is compared against the Cramer-Rao lower bound and that of an existing two-sensor alignment method. It is concluded that sensor registration using airlane information is a feasible alternative to existing techniques.
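The full registration problem above is nonlinear in the unknown target state, but the core idea can be sketched in a deliberately simplified setting: if reference ranges are known (say, from an airlane centerline) and measurement noise is i.i.d. Gaussian, the maximum-likelihood estimate of a constant additive range bias reduces to the mean residual. All numbers below are hypothetical:

```python
import random

def ml_range_bias(true_ranges, measured_ranges):
    """Under i.i.d. Gaussian noise, the ML estimate of a constant additive
    range bias is the mean of the residuals (measured minus reference)."""
    resid = [m - t for t, m in zip(true_ranges, measured_ranges)]
    return sum(resid) / len(resid)

random.seed(0)
true_r = [1000.0 + 50.0 * k for k in range(200)]        # reference ranges, metres
bias = 12.5                                             # hypothetical sensor bias
meas = [r + bias + random.gauss(0.0, 5.0) for r in true_r]
print(round(ml_range_bias(true_r, meas), 2))            # close to 12.5
```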
CORA: Emission Line Fitting with Maximum Likelihood
NASA Astrophysics Data System (ADS)
Ness, Jan-Uwe; Wichmann, Rainer
2011-12-01
The advent of pipeline-processed data both from space- and ground-based observatories often removes the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.
CORA - emission line fitting with Maximum Likelihood
NASA Astrophysics Data System (ADS)
Ness, J.-U.; Wichmann, R.
2002-07-01
The advent of pipeline-processed data both from space- and ground-based observatories often removes the need for full-fledged data reduction software with its associated steep learning curve. In many cases, a simple tool doing just one task, and doing it right, is all one wishes. In this spirit we introduce CORA, a line fitting tool based on the maximum likelihood technique, which has been developed for the analysis of emission line spectra with low count numbers and has successfully been used in several publications. CORA uses a rigorous application of Poisson statistics. From the assumption of Poissonian noise we derive the probability for a model of the emission line spectrum to represent the measured spectrum. The likelihood function is used as a criterion for optimizing the parameters of the theoretical spectrum and a fixed point equation is derived allowing an efficient way to obtain line fluxes. As an example we demonstrate the functionality of the program with an X-ray spectrum of Capella obtained with the Low Energy Transmission Grating Spectrometer (LETGS) on board the Chandra observatory and choose the analysis of the Ne IX triplet around 13.5 Å.
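The Poisson-likelihood criterion described in the CORA abstracts can be illustrated with a toy fit: maximize the Poisson log-likelihood of observed counts under a model of flat background plus a scaled line profile. This is a grid-search sketch on invented data, not CORA's fixed-point solver:

```python
import math

def poisson_loglike(counts, mu):
    """Poisson log-likelihood (dropping the constant log(k!) term) of
    observed counts given model expectations mu."""
    return sum(k * math.log(m) - m for k, m in zip(counts, mu))

def fit_amplitude(counts, profile, background, amps):
    """Grid-search the line amplitude maximizing the Poisson
    log-likelihood of the model counts = background + amplitude * profile."""
    return max(amps, key=lambda a: poisson_loglike(
        counts, [background + a * g for g in profile]))

# Hypothetical 11-bin spectrum: flat background of 2 counts/bin plus a
# Gaussian emission line of true amplitude ~30 centred on bin 5.
profile = [math.exp(-0.5 * ((i - 5) / 1.5) ** 2) for i in range(11)]
counts = [2, 3, 6, 14, 27, 31, 25, 15, 7, 2, 2]
amps = [a / 10 for a in range(1, 601)]
print(fit_amplitude(counts, profile, background=2.0, amps=amps))
```

Because the data are low-count, this Poisson criterion is the appropriate replacement for least-squares fitting, which implicitly assumes Gaussian noise.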
Temperature compensated well logging tool
Riedesel, R.G.; Nussbaum, T.W.; Warren, W.F.
1984-01-24
A well logging tool adapted for use in a borehole traversing an earth formation includes at least one sensor sensing at least one characteristic of the earth formation and providing a corresponding characteristic signal. Another sensor senses the ambient temperature and provides a corresponding temperature signal. An output circuit provides a temperature compensated output signal corresponding to the sensed characteristic of the earth formation in accordance with the temperature signal and the characteristic signal.
Chemical logging of geothermal wells
Allen, C.A.; McAtee, R.E.
The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
Chemical logging of geothermal wells
Allen, Charles A.; McAtee, Richard E.
1981-01-01
The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
Audit Log for Forensic Photography
NASA Astrophysics Data System (ADS)
Neville, Timothy; Sorell, Matthew
We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
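One standard mechanism for making such an audit log tamper-evident is hash chaining: each entry commits to the hash of its predecessor, so any later edit breaks every subsequent link. This sketch illustrates the general technique only; it is not the architecture proposed in the paper:

```python
import hashlib, json

def append_entry(log, record):
    """Append a record to a hash-chained audit log: each entry stores the
    SHA-256 of the previous entry's content, so modifying any earlier
    entry invalidates the chain."""
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    log.append({"record": record, "prev": prev,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(log):
    """Recompute every link; return True iff the chain is intact."""
    prev = "0" * 64
    for e in log:
        body = json.dumps({"record": e["record"], "prev": prev}, sort_keys=True)
        if e["prev"] != prev or e["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = e["hash"]
    return True

log = []
append_entry(log, "IMG_0001.jpg captured 10:32:05")
append_entry(log, "IMG_0001.jpg transferred to evidence server")
print(verify(log))                                      # True
log[0]["record"] = "IMG_0001.jpg captured 09:00:00"     # tampering
print(verify(log))                                      # False
```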
Developmental Changes in Children's Understanding of Future Likelihood and Uncertainty
ERIC Educational Resources Information Center
Lagattuta, Kristin Hansen; Sayfan, Liat
2011-01-01
Two measures assessed 4-10-year-olds' and adults' (N = 201) understanding of future likelihood and uncertainty. In one task, participants sequenced sets of event pictures varying by one physical dimension according to increasing future likelihood. In a separate task, participants rated characters' thoughts about the likelihood of future events,…
ERIC Educational Resources Information Center
Wothke, Werner; Burket, George; Chen, Li-Sue; Gao, Furong; Shu, Lianghua; Chia, Mike
2011-01-01
It has been known for some time that item response theory (IRT) models may exhibit a likelihood function of a respondent's ability which may have multiple modes, flat modes, or both. These conditions, often associated with guessing of multiple-choice (MC) questions, can introduce uncertainty and bias to ability estimation by maximum likelihood…
TIOGA LAKE, HALL NATURAL AREA, LOG CABIN-SADDLEBAG, AND HORSE MEADOWS ROADLESS AREAS, CALIFORNIA.
Seitz, J.F.; Federspiel, F.E.
1984-01-01
Studies of the geology and mineral resources of the Tioga Lake, Hall Natural Area, Log Cabin-Saddlebag, and Horse Meadows Roadless Areas in California indicate that parts of the Log Cabin-Saddlebag and Hall Natural Roadless Areas have a substantiated resource potential for gold and (or) silver resources, and a probable potential for tungsten and molybdenum resources. Tioga Lake Roadless Area has little likelihood for the occurrence of mineral resources and the Horse Meadows Roadless Area has a probable potential for low-grade tungsten, gold, and (or) silver resources. The geologic terrane in the roadless areas precludes the occurrence of organic fuel resources.
Approximate maximum likelihood decoding of block codes
NASA Technical Reports Server (NTRS)
Greenberger, H. J.
1979-01-01
Approximate maximum likelihood decoding algorithms, based upon selecting a small set of candidate code words with the aid of the estimated probability of error of each received symbol, can give performance close to optimum with a reasonable amount of computation. By combining the best features of various algorithms and taking care to perform each step as efficiently as possible, a decoding scheme was developed which can decode codes which have better performance than those presently in use and yet not require an unreasonable amount of computation. The discussion of the details and tradeoffs of presently known efficient optimum and near optimum decoding algorithms leads, naturally, to the one which embodies the best features of all of them.
Groups, information theory, and Einstein's likelihood principle
NASA Astrophysics Data System (ADS)
Sicuro, Gabriele; Tempesta, Piergiulio
2016-04-01
We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts.
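The composability requirement the abstract refers to can be illustrated with the best-known nonadditive example, the Tsallis entropy. This is a standard textbook instance offered for illustration, not a formula taken from the paper:

```latex
S_q(A \cup B) \;=\; S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B),
\qquad \text{for independent systems } A, B,
```

a group law on entropy values that reduces to ordinary Shannon additivity, $S(A \cup B) = S(A) + S(B)$, in the limit $q \to 1$.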
Modelling default and likelihood reasoning as probabilistic
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
Groups, information theory, and Einstein's likelihood principle.
Sicuro, Gabriele; Tempesta, Piergiulio
2016-04-01
We propose a unifying picture where the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined with respect to the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate to each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we also show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies both in physical and social science contexts. PMID:27176234
A Maximum-Likelihood Approach to Force-Field Calibration.
Zaborowski, Bartłomiej; Jagieła, Dawid; Czaplewski, Cezary; Hałabis, Anna; Lewandowska, Agnieszka; Żmudzińska, Wioletta; Ołdziej, Stanisław; Karczyńska, Agnieszka; Omieczynski, Christian; Wirecki, Tomasz; Liwo, Adam
2015-09-28
A new approach to the calibration of the force fields is proposed, in which the force-field parameters are obtained by maximum-likelihood fitting of the calculated conformational ensembles to the experimental ensembles of training system(s). The maximum-likelihood function is composed of logarithms of the Boltzmann probabilities of the experimental conformations, calculated with the current energy function. Because the theoretical distribution is given in the form of the simulated conformations only, the contributions from all of the simulated conformations, with Gaussian weights in the distances from a given experimental conformation, are added to give the contribution to the target function from this conformation. In contrast to earlier methods for force-field calibration, the approach does not suffer from the arbitrariness of dividing the decoy set into native-like and non-native structures; however, if such a division is made instead of using Gaussian weights, application of the maximum-likelihood method results in the well-known energy-gap maximization. The computational procedure consists of cycles of decoy generation and maximum-likelihood-function optimization, which are iterated until convergence is reached. The method was tested with Gaussian distributions and then applied to the physics-based coarse-grained UNRES force field for proteins. The NMR structures of the tryptophan cage, a small α-helical protein, determined at three temperatures (T = 280, 305, and 313 K) by Hałabis et al. ( J. Phys. Chem. B 2012 , 116 , 6898 - 6907 ), were used. Multiplexed replica-exchange molecular dynamics was used to generate the decoys. The iterative procedure exhibited steady convergence. Three variants of optimization were tried: optimization of the energy-term weights alone and use of the experimental ensemble of the folded protein only at T = 280 K (run 1); optimization of the energy-term weights and use of experimental ensembles at all three temperatures (run 2
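The target function described above (Gaussian-weighted contributions of simulated decoys to the Boltzmann probability of each experimental conformation) can be sketched with a one-dimensional toy energy E(x) = k·x², fitting the single parameter k by grid search. Everything here is an invented stand-in for the UNRES setting:

```python
import math, random

def ml_target(k, decoys, experimental, kT=1.0, sigma=0.3):
    """Maximum-likelihood target for force-field calibration: the Boltzmann
    probability of each experimental conformation is estimated from the
    simulated decoys with Gaussian weights in the distance to that
    conformation.  Toy one-dimensional energy E(x) = k * x**2."""
    boltz = [math.exp(-k * x * x / kT) for x in decoys]
    z = sum(boltz)
    ll = 0.0
    for e in experimental:
        contrib = sum(w * math.exp(-0.5 * ((x - e) / sigma) ** 2)
                      for x, w in zip(decoys, boltz)) / z
        ll += math.log(contrib)
    return ll

random.seed(1)
decoys = [random.uniform(-2.0, 2.0) for _ in range(400)]      # simulated ensemble
experimental = [random.gauss(0.0, 0.5) for _ in range(50)]    # "experimental" ensemble
grid = [k / 10 for k in range(1, 81)]
k_hat = max(grid, key=lambda k: ml_target(k, decoys, experimental))
print(k_hat)   # the grid value maximizing the likelihood
```

In the real procedure this optimization alternates with fresh decoy generation until the parameters converge, as the abstract describes.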
A maximum likelihood framework for protein design
Kleinman, Claudia L; Rodrigue, Nicolas; Bonnard, Cécile; Philippe, Hervé; Lartillot, Nicolas
2006-01-01
Background The aim of protein design is to predict amino-acid sequences compatible with a given target structure. Traditionally envisioned as a purely thermodynamic question, this problem can also be understood in a wider context, where additional constraints are captured by learning the sequence patterns displayed by natural proteins of known conformation. In this latter perspective, however, we still need a theoretical formalization of the question, leading to general and efficient learning methods, and allowing for the selection of fast and accurate objective functions quantifying sequence/structure compatibility. Results We propose a formulation of the protein design problem in terms of model-based statistical inference. Our framework uses the maximum likelihood principle to optimize the unknown parameters of a statistical potential, which we call an inverse potential to contrast with classical potentials used for structure prediction. We propose an implementation based on Markov chain Monte Carlo, in which the likelihood is maximized by gradient descent and is numerically estimated by thermodynamic integration. The fit of the models is evaluated by cross-validation. We apply this to a simple pairwise contact potential, supplemented with a solvent-accessibility term, and show that the resulting models have a better predictive power than currently available pairwise potentials. Furthermore, the model comparison method presented here allows one to measure the relative contribution of each component of the potential, and to choose the optimal number of accessibility classes, which turns out to be much higher than classically considered. Conclusion Altogether, this reformulation makes it possible to test a wide diversity of models, using different forms of potentials, or accounting for other factors than just the constraint of thermodynamic stability. Ultimately, such model-based statistical analyses may help to understand the forces shaping protein sequences, and
Avian responses to selective logging shaped by species traits and logging practices.
Burivalova, Zuzana; Lee, Tien Ming; Giam, Xingli; Şekercioğlu, Çağan Hakkı; Wilcove, David S; Koh, Lian Pin
2015-06-01
Selective logging is one of the most common forms of forest use in the tropics. Although the effects of selective logging on biodiversity have been widely studied, there is little agreement on the relationship between life-history traits and tolerance to logging. In this study, we assessed how species traits and logging practices combine to determine species responses to selective logging, based on over 4000 observations of the responses of nearly 1000 bird species to selective logging across the tropics. Our analysis shows that species traits, such as feeding group and body mass, and logging practices, such as time since logging and logging intensity, interact to influence a species' response to logging. Frugivores and insectivores were most adversely affected by logging and declined further with increasing logging intensity. Nectarivores and granivores responded positively to selective logging for the first two decades, after which their abundances decrease below pre-logging levels. Larger species of omnivores and granivores responded more positively to selective logging than smaller species from either feeding group, whereas this effect of body size was reversed for carnivores, herbivores, frugivores and insectivores. Most importantly, species most negatively impacted by selective logging had not recovered approximately 40 years after logging cessation. We conclude that selective timber harvest has the potential to cause large and long-lasting changes in avian biodiversity. However, our results suggest that the impacts can be mitigated to a certain extent through specific forest management strategies such as lengthening the rotation cycle and implementing reduced impact logging. PMID:25994673
Tucker Wireline Open Hole Wireline Logging
Milliken, M.
2002-05-23
The Tucker Wireline unit ran a suite of open hole logs right behind the RMOTC logging contractor for comparison purposes. The tools included Dual Laterolog, Phased Induction, BHC Sonic, and Density-Porosity.
Likelihood approach for evaluating bioequivalence of highly variable drugs
Du, Liping; Choi, Leena
2015-01-01
Bioequivalence (BE) is required for approving a generic drug. The Two-One-Sided-Test (TOST, or the 90% confidence interval approach) has been used as the mainstream methodology to test average BE (ABE) on pharmacokinetic parameters such as the area under the blood concentration-time curve and the peak concentration. However, for highly variable drugs (%CV > 30%), it is difficult to demonstrate ABE in a standard cross-over study with the typical number of subjects using the TOST due to lack of power. Recently, the US Food and Drug Administration and the European Medicines Agency recommended similar but not identical reference scaled average bioequivalence (RSABE) approaches to address this issue. Although the power is improved, the new approaches may not guarantee a high level of confidence for the true difference between two drugs at the ABE boundaries. It is also difficult for these approaches to address the issues of population BE (PBE) and individual BE (IBE). We advocate the use of a likelihood approach for representing and interpreting BE data as evidence. Using example data from a full replicate 2 × 4 cross-over study, we demonstrate how to present evidence using the profile likelihoods for the mean difference and standard deviation ratios of the two drugs for the pharmacokinetic parameters. With this approach, we present evidence for PBE and IBE as well as ABE within a unified framework. Our simulations show that the operating characteristics of the proposed likelihood approach are comparable with the RSABE approaches when the same criteria are applied. PMID:25408492
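The profile-likelihood idea used above (plot the likelihood of the parameter of interest with nuisance parameters maximized out) can be sketched for the mean difference of log-transformed pharmacokinetic parameters under a simple normal model with common variance. The data and the model are deliberately minimal stand-ins, not the paper's full replicate cross-over model:

```python
import math

def profile_loglike_delta(x, y, delta):
    """Profile log-likelihood of delta = mean(x) - mean(y) under independent
    normal models with a common variance; the nuisance parameters (reference
    mean and variance) are maximized out analytically."""
    n, m = len(x), len(y)
    mu_y = (sum(x) - n * delta + sum(y)) / (n + m)   # MLE of mean(y) given delta
    ss = (sum((xi - (mu_y + delta)) ** 2 for xi in x)
          + sum((yi - mu_y) ** 2 for yi in y))
    var = ss / (n + m)                                # MLE of the common variance
    return -(n + m) / 2 * (math.log(2 * math.pi * var) + 1)

test = [4.61, 4.70, 4.52, 4.68, 4.59, 4.66]   # hypothetical log(AUC), test drug
ref  = [4.55, 4.63, 4.50, 4.71, 4.58, 4.60]   # hypothetical log(AUC), reference
grid = [d / 1000 for d in range(-200, 201)]
best = max(grid, key=lambda d: profile_loglike_delta(test, ref, d))
print(round(best, 3))   # grid point nearest mean(test) - mean(ref)
```

Evaluating the profile over the whole grid, rather than just reporting the maximizer, is what lets the likelihood approach present the full evidence about ABE rather than a single accept/reject decision.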
Maximum likelihood analysis of bubble incidence for mixed gas diving.
Tikuisis, P; Gault, K; Carrod, G
1990-03-01
The method of maximum likelihood has been applied to predict the incidence of bubbling in divers for both air and helium diving. Data were obtained from 108 air man-dives and 622 helium man-dives conducted experimentally in a hyperbaric chamber. Divers were monitored for bubbles using Doppler ultrasonics during the period from surfacing until approximately 2 h after surfacing. Bubble grades were recorded according to the K-M code, and the maximum value in the precordial region for each diver was used in the likelihood analysis. Prediction models were based on monoexponential gas kinetics using one and two parallel-compartment configurations. The model parameters were of three types: gas kinetics, gas potency, and compartment gain. When the potency of the gases was not distinguished, the risk criterion used was inherently based on the gas supersaturation ratio, otherwise it was based on the potential bubble volume. The two-compartment model gave a significantly better prediction than the one-compartment model only if the kinetics of nitrogen and helium were distinguished. A further significant improvement with the two-compartment model was obtained when the potency of the two gases was distinguished, thereby making the potential bubble volume criterion a better choice than the gas pressure criterion. The results suggest that when the method of maximum likelihood is applied for the prediction of the incidence of bubbling, more than one compartment should be used and if more than one is used consideration should be given to distinguishing the potencies of the inert gases. PMID:2181767
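The shape of such a maximum-likelihood fit of binary bubble outcomes can be sketched with a one-parameter dose-response model in the supersaturation excess. This hypothetical stand-in has none of the study's compartmental structure; it only shows how a risk criterion enters the likelihood:

```python
import math

def loglike(beta, data):
    """Log-likelihood of binary bubble outcomes under the toy model
    P(bubble) = 1 - exp(-beta * s), where s > 0 is the supersaturation
    excess for the dive.  A one-parameter stand-in for the study's
    multi-compartment models."""
    ll = 0.0
    for s, bubbled in data:
        p = 1.0 - math.exp(-beta * s)
        ll += math.log(p) if bubbled else math.log(1.0 - p)
    return ll

# (supersaturation excess, bubbles observed?) for hypothetical dives
dives = [(0.1, 0), (0.2, 0), (0.3, 1), (0.4, 0), (0.5, 1),
         (0.6, 1), (0.7, 1), (0.8, 1), (0.2, 1), (0.4, 1)]
grid = [b / 100 for b in range(1, 1001)]
beta_hat = max(grid, key=lambda b: loglike(b, dives))
print(beta_hat)
```

Comparing the maximized log-likelihoods of nested variants (one versus two compartments, shared versus gas-specific potency) is how the study judged whether the extra parameters were warranted.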
Limits on Log Odds Ratios for Unidimensional Item Response Theory Models
ERIC Educational Resources Information Center
Haberman, Shelby J.; Holland, Paul W.; Sinharay, Sandip
2007-01-01
Bounds are established for log odds ratios (log cross-product ratios) involving pairs of items for item response models. First, expressions for bounds on log odds ratios are provided for one-dimensional item response models in general. Then, explicit bounds are obtained for the Rasch model and the two-parameter logistic (2PL) model. Results are…
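The quantity being bounded can be computed directly for the Rasch model: marginalizing the locally independent item responses over an ability distribution induces a positive association between items, so the population log odds ratio is nonnegative. A sketch with an assumed discrete ability distribution:

```python
import math

def rasch_p(theta, b):
    """Rasch model: probability of a correct response at ability theta
    for an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def log_odds_ratio(b1, b2, thetas, weights):
    """Population log odds ratio (log cross-product ratio) for a pair of
    Rasch items, with ability marginalized over a discrete distribution.
    Local independence holds given theta, but marginalizing induces a
    positive association, so this quantity is nonnegative."""
    p11 = p10 = p01 = p00 = 0.0
    for th, w in zip(thetas, weights):
        q1, q2 = rasch_p(th, b1), rasch_p(th, b2)
        p11 += w * q1 * q2
        p10 += w * q1 * (1 - q2)
        p01 += w * (1 - q1) * q2
        p00 += w * (1 - q1) * (1 - q2)
    return math.log(p11 * p00 / (p10 * p01))

# Discrete standard-normal-like ability distribution (assumed for illustration).
thetas = [-2, -1, 0, 1, 2]
weights = [0.054, 0.244, 0.403, 0.244, 0.054]
total = sum(weights); weights = [x / total for x in weights]
print(round(log_odds_ratio(0.0, 0.5, thetas, weights), 3))   # >= 0
```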
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.87 - Logging, sampling, and testing prior to injection well operation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... installed; and (ii) A cement bond and variable density log to evaluate cement quality radially, and a... cement bond and variable density log, and a temperature log after the casing is set and cemented. (4) A... confining zone(s): (1) Fracture pressure; (2) Other physical and chemical characteristics of the...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...
40 CFR 146.66 - Logging, sampling, and testing prior to new well operation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... cement bond and variable density log, and a temperature log after the casing is set and cemented. (ii..., gamma ray, and fracture finder logs before the casing is installed; and (B) A cement bond and variable... Class I hazardous waste injection wells: (1) Fracture pressure; (2) Other physical and...