Transfer Entropy as a Log-Likelihood Ratio
NASA Astrophysics Data System (ADS)
Barnett, Lionel; Bossomaier, Terry
2012-09-01
Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
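The finite-Markov-chain case lends itself to a short illustration. Below is a minimal plug-in sketch (not the authors' code) of TE(Y → X) for order-1 discrete histories; by the paper's result, 2n times this quantity is the log-likelihood-ratio statistic, asymptotically χ²-distributed under the null of zero transfer entropy.

```python
from collections import Counter
import numpy as np

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(Y -> X), in nats, for order-1 histories
    of two discrete sequences (a sketch of the finite-Markov-chain case)."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x) - 1
    c_xyz = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_prev, y_prev)
    c_xz = Counter(zip(x[1:], x[:-1]))            # (x_next, x_prev)
    c_yz = Counter(zip(x[:-1], y[:-1]))           # (x_prev, y_prev)
    c_z = Counter(x[:-1])                         # x_prev
    te = 0.0
    for (xn, xp, yp), c in c_xyz.items():
        # p(xn|xp,yp) / p(xn|xp) expressed in counts
        te += (c / n) * np.log(c * c_z[xp] / (c_xz[xn, xp] * c_yz[xp, yp]))
    return te
```

For two independent binary sequences the estimate hovers near zero (up to the plug-in bias of roughly df/2n), while a lagged copy yields TE near log 2.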
Fast inference in generalized linear models via expected log-likelihoods.
Ramirez, Alexandro D; Paninski, Liam
2014-04-01
Generalized linear models play an essential role in a wide variety of statistical applications. This paper discusses an approximation of the likelihood in these models that can greatly facilitate computation. The basic idea is to replace a sum that appears in the exact log-likelihood by an expectation over the model covariates; the resulting "expected log-likelihood" can in many cases be computed significantly faster than the exact log-likelihood. In many neuroscience experiments the distribution over model covariates is controlled by the experimenter and the expected log-likelihood approximation becomes particularly useful; for example, estimators based on maximizing this expected log-likelihood (or a penalized version thereof) can often be obtained with orders of magnitude computational savings compared to the exact maximum likelihood estimators. A risk analysis establishes that these maximum EL estimators often come with little cost in accuracy (and in some cases even improved accuracy) compared to standard maximum likelihood estimates. Finally, we find that these methods can significantly decrease the computation time of marginal likelihood calculations for model selection and of Markov chain Monte Carlo methods for sampling from the posterior parameter distribution. We illustrate our results by applying these methods to a computationally-challenging dataset of neural spike trains obtained via large-scale multi-electrode recordings in the primate retina.
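The core substitution can be sketched for a Poisson GLM with canonical log link, where the exact log-likelihood is Σ_i [y_i x_i'θ − exp(x_i'θ)] and the expected log-likelihood replaces the second sum with n·E[exp(x'θ)]; for zero-mean Gaussian covariates with covariance Σ that expectation is exp(θ'Σθ/2) in closed form. A minimal sketch under those assumptions (not the authors' code):

```python
import numpy as np

def exact_ll(theta, X, y):
    """Exact Poisson GLM log-likelihood (canonical log link), up to a constant."""
    eta = X @ theta
    return y @ eta - np.exp(eta).sum()

def expected_ll(theta, X, y, Sigma):
    """Expected-log-likelihood sketch: the covariate sum of exp(x_i' theta)
    is replaced by n * E[exp(x' theta)], which for zero-mean Gaussian
    covariates with covariance Sigma equals n * exp(theta' Sigma theta / 2)."""
    n = len(y)
    return y @ (X @ theta) - n * np.exp(0.5 * theta @ Sigma @ theta)
```

The expected version costs one quadratic form instead of an n-term sum of exponentials, which is where the computational savings originate.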
A sampling approach to estimate the log determinant used in spatial likelihood problems
NASA Astrophysics Data System (ADS)
Pace, R. Kelley; Lesage, James P.
2009-09-01
Likelihood-based methods for modeling multivariate Gaussian spatial data have desirable statistical characteristics, but the practicality of these methods for massive georeferenced data sets is often questioned. A sampling algorithm is proposed that exploits a relationship involving log-pivots arising from the matrix decompositions used to compute the log-determinant term that appears in the model likelihood. We demonstrate that the method can successfully estimate log-determinants for large numbers of observations. Specifically, we produce a log-determinant estimate for a 3,954,400 by 3,954,400 matrix in less than two minutes on a desktop computer. The proposed method involves computations that are independent, making it amenable to out-of-core computation as well as to coarse-grained parallel or distributed processing. The proposed technique yields an estimated log-determinant and an associated confidence interval.
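The flavor of the estimator can be illustrated with a toy sketch: log det A equals the sum of the n log-pivots of a factorization, so a random sample of m log-pivots, rescaled by n/m, gives an estimate with a normal-theory confidence interval. (The paper's contribution is obtaining the pivot information cheaply; the sketch below factorizes fully and only illustrates the sampling and CI step.)

```python
import numpy as np

def logdet_sample_estimate(A, m, rng):
    """Toy sketch: log det A = 2 * sum(log diag(chol(A))). Sampling m of the
    n log-pivot terms and rescaling by n/m gives an estimate and a normal CI."""
    L = np.linalg.cholesky(A)
    logpiv = 2.0 * np.log(np.diag(L))        # the n log-pivot terms
    n = len(logpiv)
    idx = rng.choice(n, size=m, replace=False)
    s = logpiv[idx]
    est = n * s.mean()
    half = 1.96 * n * s.std(ddof=1) / np.sqrt(m)
    return est, (est - half, est + half)
```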
Utterance Verification Using State-Level Log-Likelihood Ratio with Frame and State Selection
NASA Astrophysics Data System (ADS)
Kwon, Suk-Bong; Kim, Hoirin
This paper suggests an utterance verification system using state-level log-likelihood ratios with frame and state selection. We use hidden Markov models as acoustic models and anti-phone models for speech recognition and utterance verification. The hidden Markov models have three states, and each state represents different characteristics of a phone. Thus we propose an algorithm to compute state-level log-likelihood ratios and to weight states, obtaining a more reliable confidence measure for recognized phones. Additionally, we propose a frame selection algorithm that computes the confidence measure only on frames containing proper speech in the input. In general, phone segmentation information obtained from a speaker-independent speech recognition system is not accurate, because triphone-based acoustic models are difficult to train effectively to cover diverse pronunciations and coarticulation effects. It is therefore even more difficult to find correctly matched states when obtaining state segmentation information, so a state selection algorithm is suggested for finding valid states. The proposed method using state-level log-likelihood ratios with frame and state selection shows a relative reduction in equal error rate of 18.1% compared to the baseline system using simple phone-level log-likelihood ratios.
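The confidence-measure arithmetic can be sketched independently of any recognizer: average the per-frame log-likelihood ratios (acoustic model minus anti-phone model) within each state, then combine the per-state LLRs with state weights. The function below is a hypothetical illustration of that combination; frame and state *selection*, as proposed in the paper, would first discard unreliable frames and states.

```python
import numpy as np

def phone_confidence(frame_ll, frame_ll_anti, state_of_frame, state_w):
    """State-level LLR confidence sketch: average per-frame LLRs within
    each HMM state, then take a weighted combination over states."""
    llr = np.asarray(frame_ll) - np.asarray(frame_ll_anti)
    states = np.asarray(state_of_frame)
    per_state = np.array([llr[states == s].mean()
                          for s in range(len(state_w))])
    w = np.asarray(state_w, float)
    return float(per_state @ w / w.sum())
```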
Binary Detection using Multi-Hypothesis Log-Likelihood, Image Processing
2014-03-27
I. Introduction ... it is important to compare them to another modern technique. The third objective is to compare results from another image detection method, specifically ...
Posterior propriety for hierarchical models with log-likelihoods that have norm bounds
Michalak, Sarah E.; Morris, Carl N.
2015-07-17
Statisticians often use improper priors to express ignorance or to provide good frequency properties, requiring that posterior propriety be verified. Our paper addresses generalized linear mixed models, GLMMs, when Level I parameters have Normal distributions, with many commonly-used hyperpriors. It provides easy-to-verify sufficient posterior propriety conditions based on dimensions, matrix ranks, and exponentiated norm bounds, ENBs, for the Level I likelihood. Since many familiar likelihoods have ENBs, which is often verifiable via log-concavity and MLE finiteness, our novel use of ENBs permits unification of posterior propriety results and posterior MGF/moment results for many useful Level I distributions, including those commonly used with multilevel generalized linear models, e.g., GLMMs and hierarchical generalized linear models, HGLMs. Furthermore, those who need to verify existence of posterior distributions or of posterior MGFs/moments for a multilevel generalized linear model given a proper or improper multivariate F prior as in Section 1 should find the required results in Sections 1 and 2 and Theorem 3 (GLMMs), Theorem 4 (HGLMs), or Theorem 5 (posterior MGFs/moments).
Maximum likelihood estimation of the mixture of log-concave densities.
Hu, Hao; Wu, Yichao; Yao, Weixin
2016-09-01
Finite mixture models are useful tools and can be estimated via the EM algorithm. A main drawback is the strong parametric assumption about the component densities. In this paper, a much more flexible mixture model is considered, which assumes only that each component density is log-concave. Under fairly general conditions, the log-concave maximum likelihood estimator (LCMLE) exists and is consistent. Numerical examples demonstrate that the LCMLE improves the clustering results compared with the traditional MLE for parametric mixture models.
2. One-room log cabin (right), log root cellar (center), two-room ...
2. One-room log cabin (right), log root cellar (center), two-room log cabin (left), and post-and-beam garage (background). View to southwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
A Planar Approximation for the Least Reliable Bit Log-likelihood Ratio of 8-PSK Modulation
NASA Technical Reports Server (NTRS)
Thesling, William H.; Vanderaar, Mark J.
1994-01-01
The optimum decoding of component codes in block coded modulation (BCM) schemes requires the use of the log-likelihood ratio (LLR) as the signal metric. An approximation to the LLR for the least reliable bit (LRB) in 8-PSK modulation, based on planar equations with fixed-point arithmetic, is developed that is both accurate and easily realizable for practical BCM schemes. Through an error power analysis and an example simulation it is shown that the approximation results in only 0.06 dB of degradation relative to the exact expression at an E_s/N_0 of 10 dB. It is also shown that the approximation can be realized in combinatorial logic using roughly 7300 transistors. This compares favorably to a look-up table approach in typical systems.
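The exact LRB LLR and a cheap approximation can be contrasted as follows. The sketch below uses an assumed Gray mapping and the common max-log (dual-minimum) approximation as a stand-in for the paper's planar fixed-point equations, which are not reproduced here; at moderate SNR the two track each other closely.

```python
import numpy as np

# Assumed Gray-mapped 8-PSK constellation; bit 0 is treated as the LRB here.
GRAY = [0b000, 0b001, 0b011, 0b010, 0b110, 0b111, 0b101, 0b100]
PTS = np.exp(2j * np.pi * np.arange(8) / 8)

def llr_exact(r, bit, n0):
    """Exact per-bit LLR from the full likelihood sums over the constellation."""
    d2 = np.abs(r - PTS) ** 2
    w = np.exp(-d2 / n0)
    mask = np.array([(g >> bit) & 1 for g in GRAY], dtype=bool)
    return float(np.log(w[~mask].sum()) - np.log(w[mask].sum()))

def llr_maxlog(r, bit, n0):
    """Max-log (dual-minimum) approximation: keep only the nearest symbol
    on each side of the bit decision."""
    d2 = np.abs(r - PTS) ** 2
    mask = np.array([(g >> bit) & 1 for g in GRAY], dtype=bool)
    return float((d2[mask].min() - d2[~mask].min()) / n0)
```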
Read, Randy J.; McCoy, Airlie J.
2016-01-01
The crystallographic diffraction experiment measures Bragg intensities; crystallographic electron-density maps and other crystallographic calculations in phasing require structure-factor amplitudes. If data were measured with no errors, the structure-factor amplitudes would be trivially proportional to the square roots of the intensities. When the experimental errors are large, and especially when random errors yield negative net intensities, the conversion of intensities and their error estimates into amplitudes and associated error estimates becomes nontrivial. Although this problem has been addressed intermittently in the history of crystallographic phasing, current approaches to accounting for experimental errors in macromolecular crystallography have numerous significant defects. These have been addressed with the formulation of LLGI, a log-likelihood-gain function in terms of the Bragg intensities and their associated experimental error estimates. LLGI has the correct asymptotic behaviour for data with large experimental error, appropriately downweighting these reflections without introducing bias. LLGI abrogates the need for the conversion of intensity data to amplitudes, which is usually performed with the French and Wilson method [French & Wilson (1978), Acta Cryst. A35, 517–525], wherever likelihood target functions are required. It has general applicability for a wide variety of algorithms in macromolecular crystallography, including scaling, characterizing anisotropy and translational noncrystallographic symmetry, detecting outliers, experimental phasing, molecular replacement and refinement. Because it is impossible to reliably recover the original intensity data from amplitudes, it is suggested that crystallographers should always deposit the intensity data in the Protein Data Bank. PMID:26960124
Limit Distribution Theory for Maximum Likelihood Estimation of a Log-Concave Density.
Balabdaoui, Fadoua; Rufibach, Kaspar; Wellner, Jon A
2009-06-01
We find limiting distributions of the nonparametric maximum likelihood estimator (MLE) of a log-concave density, i.e. a density of the form f_0 = exp(φ_0), where φ_0 is a concave function on ℝ. Existence, form, characterizations and uniform rates of convergence of the MLE are given by Rufibach (2006) and Dümbgen and Rufibach (2007). The characterization of the log-concave MLE in terms of distribution functions is the same (up to sign) as the characterization of the least squares estimator of a convex density on [0, ∞) as studied by Groeneboom, Jongbloed and Wellner (2001b). We use this connection to show that the limiting distributions of the MLE and its derivative are, under comparable smoothness assumptions, the same (up to sign) as in the convex density estimation problem. In particular, changing the smoothness assumptions of Groeneboom, Jongbloed and Wellner (2001b) slightly by allowing some higher derivatives to vanish at the point of interest, we find that the pointwise limiting distributions depend on the second and third derivatives at 0 of H_k, the "lower invelope" of an integrated Brownian motion process minus a drift term depending on the number of vanishing derivatives of φ_0 = log f_0 at the point of interest. We also establish the limiting distribution of the resulting estimator of the mode M(f_0) and establish a new local asymptotic minimax lower bound which shows the optimality of our mode estimator in terms of both rate of convergence and dependence of constants on population values.
Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.
Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua
2017-02-13
In literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using modeling approach, and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible to handle linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to and sometimes better than both approaches in a wide range of dose-response relationships. It yields credible intervals as well as predictive distribution for the response rate at a specific dose level for the target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.
Krishnamoorthy, K; Oral, Evrim
2015-11-26
A standardized likelihood ratio test (SLRT) for testing the equality of means of several log-normal distributions is proposed. The properties of the SLRT, an available modified likelihood ratio test (MLRT), and a generalized variable (GV) test are evaluated by Monte Carlo simulation and compared. Evaluation studies indicate that the SLRT is accurate even for small samples, whereas the MLRT can be quite liberal for some parameter values, and the GV test is in general conservative and less powerful than the SLRT. Furthermore, a closed-form approximate confidence interval for the common mean of several log-normal distributions is developed using the method of variance estimate recovery and compared with the generalized confidence interval with respect to coverage probability and precision. Simulation studies indicate that the proposed confidence interval is accurate and better than the generalized confidence interval in terms of coverage probabilities. The methods are illustrated using two examples.
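The raw likelihood-ratio statistic underlying the SLRT can be sketched for two groups. Under H0 the log-scale parameters satisfy μ_i = m − σ_i²/2 for a common log-mean m, so the constrained fit is a small numerical optimization; the SLRT proper adds a small-sample standardization not shown here. A sketch assuming SciPy is available:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

def lognormal_mean_lrt(x1, x2):
    """Raw LRT for H0: E[x1] = E[x2] for two log-normal samples (a two-group
    sketch of the k-group problem)."""
    l1, l2 = np.log(x1), np.log(x2)

    def nll(mu, s, l):
        # Negative Gaussian log-likelihood of the log-data
        return 0.5 * len(l) * np.log(2 * np.pi * s**2) \
            + ((l - mu)**2).sum() / (2 * s**2)

    # Unconstrained MLEs are closed-form (per-group mean and sd of log-data).
    full = nll(l1.mean(), l1.std(), l1) + nll(l2.mean(), l2.std(), l2)

    # Under H0 the log-scale means satisfy mu_i = m - s_i**2 / 2 for a
    # common log of the mean, m; optimize (m, s1, s2) numerically.
    def h0(p):
        m, s1, s2 = p
        return nll(m - s1**2 / 2, s1, l1) + nll(m - s2**2 / 2, s2, l2)

    res = minimize(h0, x0=[l1.mean(), l1.std(), l2.std()],
                   method="Nelder-Mead")
    lrt = 2 * (res.fun - full)
    return lrt, chi2.sf(lrt, df=1)
```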
Pastor, José V; Arrègle, Jean; García, José M; Zapata, L Daniel
2007-02-20
A methodology for processing images of diesel sprays under different experimental situations is presented. The new approach has been developed for cases where the background does not follow a Gaussian distribution but shows a positive bias. In such cases, the lognormal and the gamma probability density functions have been considered for the background digital level distributions. Two different algorithms have been compared with the standard log-likelihood ratio test (LRT): a threshold defined from the cumulative probability density function of the background shows a noticeable improvement, but the best results are obtained with modified versions of the LRT algorithm adapted to non-Gaussian cases.
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS ...
2. VIEW OF BLOCK AND TACKLE FOR MOVING CEDAR LOGS FROM POND TO JACK LADDER--AN ENDLESS CHAIN CONVEYOR THAT MOVES LOGS INTO MILL - Lester Shingle Mill, 1602 North Eighteenth Street, Sweet Home, Linn County, OR
Loaiza-Brito, Oscar; Martin, Johannes; Nilles, Hans Peter; Ratz, Michael
2005-12-02
Flux compactifications of string theory seem to require the presence of a fine-tuned constant in the superpotential. We discuss a scheme where this constant is replaced by a dynamical quantity which we argue to be a 'continuous Chern-Simons term'. In such a scheme, the gaugino condensate generates the hierarchically small scale of supersymmetry breakdown rather than adjusting its size to a constant. A crucial ingredient is the appearance of the hierarchically small quantity exp(-
MAIL LOG, program theory, volume 2
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
Information relevant to the MAIL LOG program theory is documented. The L-files for mail correspondence, design information release/report, and the drawing/engineering order are given. In addition, sources for miscellaneous external routines and special support routines are documented along with a glossary of terms.
Continuous Tamper-proof Logging using TPM2.0
2014-06-16
Paul England, Limin Jia, James Lorch, and Arunesh Sinha. July 9, 2013 (revised June 16, 2014). Auditing system logs is an important means of ensuring systems' security in situations where run-time ...
2D Log-Gabor Wavelet Based Action Recognition
NASA Astrophysics Data System (ADS)
Li, Ning; Xu, De
The frequency response of the log-Gabor function matches well the frequency response of primate visual neurons. In this letter, motion-salient regions are extracted based on the 2D log-Gabor wavelet transform of the spatio-temporal form of actions. A supervised classification technique is then used to classify the actions. The proposed method is robust to irregular segmentation of actors. Moreover, the 2D log-Gabor wavelet permits a more compact representation of actions than recent neurobiological models using Gabor wavelets.
Core log: Valles caldera No. 2A, New Mexico
Starguist, V.L.
1988-01-01
Scientific core hole VC-2A was drilled into the western ring-fracture zone at Sulphur Springs in the Valles caldera, New Mexico. VC-2A, the second scientific core hole in the caldera, was cored through a faulted and brecciated sequence of intracauldron tuffs and volcaniclastic rocks to a depth of 528 m. As of November 1, 1986, the unequilibrated bottom-hole temperature was 212 °C. The rocks penetrated are intensely altered and host sub-ore-grade stockwork molybdenite mineralization between 25 and 125 m. This report contains a detailed core log to aid researchers in their studies of the Valles caldera magma-hydrothermal system. 3 refs., 2 figs.
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1981-01-01
A complete X-ray survey was performed of the 8.2 steradians of the sky at galactic latitudes |b| ≥ 20°, down to a limiting sensitivity of 3.1 × 10⁻¹¹ erg cm⁻² s⁻¹ in the 2-10 keV band. Of the 85 detected sources, 17 were identified with galactic objects, 61 were identified with extragalactic objects, and 7 remain unidentified. The log N - log S relation for the non-galactic objects is well fit by the Euclidean relationship. The X-ray spectra of these objects were used to construct log N - log S in physical units. The complete sample of identified sources was used to construct X-ray luminosity functions, using the absolute maximum likelihood method, for clusters of galaxies and active galactic nuclei.
Functional conservation between mammalian MGRN1 and plant LOG2 ubiquitin ligases.
Guerra, Damian D; Pratelli, Réjane; Kraft, Edward; Callis, Judy; Pilot, Guillaume
2013-11-01
Plant LOSS OF GDU 2 (LOG2) and mammalian Mahogunin Ring Finger 1 (MGRN1) proteins are RING-type E3 ligases sharing similarity N-terminal to the RING domain. Deletion of this region disrupts the interaction of LOG2 with the plant membrane protein GLUTAMINE DUMPER1 (GDU1). Phylogenetic analysis identified two clades of LOG2/MGRN1-like proteins in vertebrates and plants. The ability of MGRN1 to functionally replace LOG2 was tested. MGRN1 ubiquitylates GDU1 in vitro and can partially substitute for LOG2 in the plant, partially restoring amino acid resistance in a GDU1-myc over-expression, log2-2 background. Altogether, these results suggest a conserved function for the N-terminal domain in evolution.
Assessment of geophysical logs from borehole USW G-2, Yucca Mountain, Nevada
Nelson, P.H.; Schimschal, U.
1993-05-01
Commercial logging contractors Western Atlas, Schlumberger, and Edcon obtained borehole geophysical logs at the site of a potential high-level nuclear waste repository at Yucca Mountain, Nevada. Drill hole USW G-2 was picked for this test of suitable logging tools and logging technology, both representing the state of the art at these commercial companies. Experience gained from analysis of existing core data and a variety of logs obtained earlier by Birdwell and Dresser Atlas served as a guide in choosing the logs to be obtained. Logs were obtained in a water-filled borehole in zeolitized tuff (saturated zone) and in an air-filled borehole largely in unaltered welded tuff (unsaturated zone).
NASA Astrophysics Data System (ADS)
Kojima, Yohei; Takeda, Kazuaki; Adachi, Fumiyuki
Frequency-domain equalization (FDE) based on the minimum mean square error (MMSE) criterion can provide better downlink bit error rate (BER) performance of direct sequence code division multiple access (DS-CDMA) than the conventional rake combining in a frequency-selective fading channel. FDE requires accurate channel estimation. In this paper, we propose a new 2-step maximum likelihood channel estimation (MLCE) for DS-CDMA with FDE in a very slow frequency-selective fading environment. The 1st step uses the conventional pilot-assisted MMSE-CE and the 2nd step carries out the MLCE using decision feedback from the 1st step. The BER performance improvement achieved by 2-step MLCE over pilot assisted MMSE-CE is confirmed by computer simulation.
Dreesen, Donald S.; Anderson, David W.
1991-08-14
Oil Well Perforators (OWP) was mobilized on August 5th and arrived on August 6th. OWP's grease injection head was installed on the LANL wireline and the wireline was re-headed with a LANL cable head. OWP provided the following pressure control equipment: a 40 ft long 5-1/2" API 5000 psig WP (working pressure) casing lubricator, a 5-1/2" API 3000 psig WP wireline blow-out-preventer, a 15000 psig WP grease injection head with 0.425-0.430" flow tubes and a grease injection system, and the 15000 psig WP high pressure hoses needed to connect the grease system to the grease head. The grease head included three flow sections, two active and one reserve, and two injection hoses were run to the head. All of the instruments operated throughout the logging operation. There were no problems with lowering or raising the tools. The pressure control equipment performed well and leakage at the grease head was insignificant.
NASA Astrophysics Data System (ADS)
Chiaramonte, L.; Zoback, M.; Friedmann, J.; Stamp, V.
2006-12-01
Geologic sequestration of CO2 has been proposed as a key technology for reducing greenhouse gas in the atmosphere. However, leakage of CO2 from any potential reservoir could seriously compromise the goal of long-term storage, so understanding the likelihood of leakage is a key step toward establishing the feasibility of this technology. Because it has long been known that fluid injection can lead to fault reactivation, assessing the potential for induced fault slip and leakage of CO2 from any potential storage reservoir is critical prior to injection of large quantities of CO2. We have developed a geomechanical model in advance of a proposed injection experiment at Teapot Dome, WY, in order to study in detail the processes of a CO2 leak through predicted faults. Teapot Dome is an elongated, asymmetrical, basement-cored anticline with a north-northeast axis. It is part of the Salt Creek structural trend, located on the southwestern edge of the Powder River Basin. The anticline is compartmentalized by a series of oblique-slip faults, generally oriented NE-SW, although varying in geometry, displacement, and complexity. In particular, the fault network referred to as S2 shows greater complexity and a range of geometries and azimuths that enhances its potential for leakage. Furthermore, its surface expression shows alkali springs and hydrocarbon samples within the fault veins and gouge. The S2 fault intersects the oil-bearing 2nd Wall Creek at around 650 m, which is close in terms of temperature and pressure to the conditions for supercritical-phase CO2 injection and oil miscibility. We previously developed a preliminary geomechanical model of the deeper Tensleep Fm., utilizing observations of wellbore breakouts, drilling-induced fractures, leak-off or mini-frac tests, and other available geophysical data. We used this information to evaluate the potential for injection to induce slip on the S1 fault, approximately 2 km south of the present study area. These results are compared and
Chen, Li; Voronovich, Zoya; Clark, Kenneth; Hands, Isaac; Mannas, Jonathan; Walsh, Meggen; Nikiforova, Marina N.; Durbin, Eric B.; Weiss, Heidi; Horbinski, Craig
2014-01-01
Background Several variables are associated with the likelihood of isocitrate dehydrogenase 1 or 2 (IDH1/2) mutation in gliomas, though no guidelines yet exist for when testing is warranted, especially when an R132H IDH1 immunostain is negative. Methods A cohort of 89 patients was used to build IDH1/2 mutation prediction models in World Health Organization grades II–IV gliomas, and an external cohort of 100 patients was used for validation. Logistic regression and backward model selection with the Akaike information criterion were used to develop prediction models. Results A multivariable model, incorporating patient age, glioblastoma multiforme diagnosis, and prior history of grade II or III glioma, was developed to predict IDH1/2 mutation probability. This model generated an area under the curve (AUC) of 0.934 (95% CI: 0.878, 0.978) in the external validation cohort and 0.941 (95% CI: 0.918, 0.962) in the cohort of The Cancer Genome Atlas. When R132H IDH1 immunostain information was added, AUC increased to 0.986 (95% CI: 0.967, 0.998). This model had an AUC of 0.947 (95% CI: 0.891, 0.995) in predicting whether an R132H IDH1 immunonegative case harbored a less common IDH1 or IDH2 mutation. The models were also 94% accurate in predicting IDH1/2 mutation status in gliomas from The Cancer Genome Atlas. An interactive web-based application for calculating the probability of an IDH1/2 mutation is now available using these models. Conclusions We have integrated multiple variables to generate a probability of an IDH1/2 mutation. The associated web-based application can help triage diffuse gliomas that would benefit from mutation testing in both clinical and research settings. PMID:24860178
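The prediction step described above is ordinary logistic regression: a linear score in the clinical variables mapped through the logistic function. The coefficients below are purely hypothetical placeholders (the paper's web application uses the actual fitted values, which the abstract does not report); only the shape of the calculation is illustrated.

```python
import numpy as np

def idh_mutation_probability(age, is_gbm, prior_grade_2_3, coef=None):
    """Logistic-regression probability sketch with HYPOTHETICAL coefficients:
    p = 1 / (1 + exp(-(b0 + b_age*age + b_gbm*is_gbm + b_prior*prior)))."""
    if coef is None:
        coef = {"intercept": 4.0, "age": -0.08, "gbm": -2.0, "prior": 2.5}
    z = (coef["intercept"] + coef["age"] * age
         + coef["gbm"] * is_gbm + coef["prior"] * prior_grade_2_3)
    return float(1.0 / (1.0 + np.exp(-z)))
```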
In situ fluid typing and quantification with 1D and 2D NMR logging.
Sun, Boqin
2007-05-01
In situ nuclear magnetic resonance (NMR) fluid typing has recently gained momentum due to enhancements in the data acquisition and inversion algorithms of NMR logging tools. T2 distributions derived from NMR logging contain information on bulk fluids and pore size distributions. However, the accuracy of fluid typing is greatly overshadowed by the overlap between T2 peaks arising from different fluids with similar apparent T2 relaxation times. Nevertheless, the shapes of the T2 distributions of different fluid components are often different and can be predetermined. Inversion with predetermined T2 distributions allows us to perform fluid component decomposition to yield individual fluid volume ratios. Another effective method for in situ fluid typing is two-dimensional (2D) NMR logging, which yields a proton population distribution as a function of T2 relaxation time and fluid diffusion coefficient (or T1 relaxation time). Since the diffusion coefficients (or T1 relaxation times) of different fluid components can be very different, it is relatively easy to separate oil (especially heavy oil) from the water signal in a 2D NMR map and to perform accurate fluid typing. Combining NMR logging with resistivity and/or neutron/density logs provides a third method for in situ fluid typing. We describe these techniques with field examples.
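Fluid-component decomposition with predetermined T2 shapes reduces to a nonnegative least-squares problem: the measured T2 distribution is modeled as a nonnegative combination of known component signatures. The basis shapes, centers, and component labels below are purely hypothetical placeholders:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical predetermined T2 signatures: log-normal shapes on a T2 grid.
t2 = np.logspace(-1, 1, 60)   # illustrative T2 axis

def shape(center, width):
    s = np.exp(-0.5 * ((np.log(t2) - np.log(center)) / width) ** 2)
    return s / s.sum()

basis = np.column_stack([shape(0.3, 0.3),   # "bound water" (assumed)
                         shape(2.0, 0.4),   # "free water" (assumed)
                         shape(5.0, 0.3)])  # "oil" (assumed)

def decompose(measured):
    """Fluid-component decomposition: nonnegative volumes v minimizing
    ||basis @ v - measured||."""
    v, _ = nnls(basis, measured)
    return v
```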
NASA Technical Reports Server (NTRS)
Peters, B. C., Jr.; Walker, H. F.
1976-01-01
The problem of numerically obtaining maximum-likelihood estimates of the parameters of a mixture of normal distributions is addressed. In the recent literature, a certain successive-approximations procedure based on the likelihood equations was shown empirically to be effective in numerically approximating such maximum-likelihood estimates; however, the reliability of this procedure was not established theoretically. Here, a general iterative procedure is introduced, of the generalized steepest-ascent (deflected-gradient) type, which reduces to the procedure known in the literature when the step-size is taken to be 1. It is shown that, with probability 1 as the sample size grows large, this procedure converges locally to the strongly consistent maximum-likelihood estimate whenever the step-size lies between 0 and 2. The step-size which yields optimal local convergence rates for large samples is determined in a sense by the separation of the component normal densities and is bounded below by a number between 1 and 2.
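The step-size generalization is easy to state in code: if EM(μ) is the ordinary EM update, iterate μ ← μ + ω(EM(μ) − μ), which recovers plain EM at ω = 1 and, by the paper's result, converges locally for 0 < ω < 2. A two-component, known-variance sketch (an illustration, not the authors' procedure):

```python
import numpy as np

def em_step(x, mu, sd=1.0, w=0.5):
    """One EM update of the two component means (known sd and weights)."""
    r1 = w * np.exp(-0.5 * ((x - mu[0]) / sd) ** 2)
    r2 = (1 - w) * np.exp(-0.5 * ((x - mu[1]) / sd) ** 2)
    g = r1 / (r1 + r2)   # responsibilities for component 1
    return np.array([(g * x).sum() / g.sum(),
                     ((1 - g) * x).sum() / (1 - g).sum()])

def em_relaxed(x, mu0, step=1.0, iters=200):
    """Iterate mu <- mu + step * (EM(mu) - mu); step=1 is plain EM, and any
    step in (0, 2) converges locally per the result above."""
    mu = np.asarray(mu0, float)
    for _ in range(iters):
        mu = mu + step * (em_step(x, mu) - mu)
    return mu
```

Over-relaxed steps (1 < ω < 2) reach the same fixed point and can converge faster when the components are well separated.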
He, Yi; Scheraga, Harold A.; Liwo, Adam
2015-12-28
Coarse-grained models are useful tools to investigate the structural and thermodynamic properties of biomolecules. They are obtained by merging several atoms into one interaction site. Such simplified models try to capture as much information as possible from the original all-atom representation of the biomolecular system, but the resulting parameters of these coarse-grained force fields still need further optimization. In this paper, a force-field optimization method based on maximum-likelihood fitting of the simulated to the experimental conformational ensembles and least-squares fitting of the simulated to the experimental heat-capacity curves is applied to optimize the Nucleic Acid united-RESidue 2-point (NARES-2P) model for coarse-grained simulations of nucleic acids, recently developed in our laboratory. The optimized NARES-2P force field reproduces the structural and thermodynamic data of small DNA molecules much better than the original force field.
The phylogenetic likelihood library.
Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A
2015-03-01
We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, optimization of model parameters and branch lengths, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. Using two likelihood-based phylogenetic codes as examples, we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating-point underflow is enabled, the double-precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa, the AVX version of the PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL).
Approximate likelihood for large irregularly spaced spatial data
Fuentes, Montserrat
2008-01-01
Likelihood approaches for large irregularly spaced spatial datasets are often very difficult, if not infeasible, to implement due to computational limitations. Even when we can assume normality, exact calculation of the likelihood for a Gaussian spatial process observed at n locations requires O(n³) operations. We present a version of Whittle's approximation to the Gaussian log-likelihood for spatial regular lattices with missing values and for irregularly spaced datasets. This method requires O(n log₂ n) operations and does not involve calculating determinants. We present simulations and theoretical results to show the benefits and the performance of the spatial likelihood approximation method presented here for spatially irregularly spaced datasets and lattices with missing values. We apply these methods to estimate the spatial structure of sea surface temperatures (SST) using satellite data with missing values. PMID:19079638
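The Whittle approximation replaces the exact Gaussian log-likelihood with a sum over periodogram ordinates, ℓ_W ≈ -Σ_j [log f(ω_j) + I(ω_j)/f(ω_j)]. A minimal one-dimensional sketch appears below; it uses a naive O(n²) DFT purely for clarity (the paper's computational savings come from FFT-based evaluation and the spatial extensions it develops), and the white-noise spectral density is an illustrative assumption:

```python
import cmath, math, random

def periodogram(x):
    """Naive DFT periodogram I(w_j) = |sum_t x_t e^{-i w_j t}|^2 / (2 pi n)."""
    n = len(x)
    out = []
    for j in range(1, n // 2):          # skip zero and Nyquist frequencies
        wj = 2 * math.pi * j / n
        s = sum(x[t] * cmath.exp(-1j * wj * t) for t in range(n))
        out.append((wj, abs(s) ** 2 / (2 * math.pi * n)))
    return out

def whittle_loglik(x, spec):
    """Whittle approximation: -sum_j [log f(w_j) + I(w_j)/f(w_j)]."""
    return -sum(math.log(spec(w)) + Iw / spec(w) for w, Iw in periodogram(x))

random.seed(0)
x = [random.gauss(0, 1) for _ in range(128)]
# White-noise spectral density f(w) = sigma^2 / (2 pi); profile over sigma^2
best = max((whittle_loglik(x, lambda w, s2=s2: s2 / (2 * math.pi)), s2)
           for s2 in [0.5 + 0.1 * k for k in range(16)])
# best[1] is the variance maximizing the Whittle likelihood, near 1.0
```

The estimated variance tracks the sample variance, as expected when the assumed spectral density is correct.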
Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods
McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando
2010-01-01
Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown; whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u, which is present in the plane strain model, cannot be measured, is unreliably computed from measured data, and can be shown to be an order-one quantity in units of kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first-order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from the Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349
Code of Federal Regulations, 2014 CFR
2014-10-01
... 43 Public Lands: Interior 2 2014-10-01 2014-10-01 false How soon after I know of the likelihood of... Relating to Public Lands (Continued) BUREAU OF LAND MANAGEMENT, DEPARTMENT OF THE INTERIOR MINERALS... § 3162.2-11 How soon after I know of the likelihood of drainage must I take protective action? (a)...
The Likelihood Function and Likelihood Statistics
NASA Astrophysics Data System (ADS)
Robinson, Edward L.
2016-01-01
The likelihood function is a necessary component of Bayesian statistics but not of frequentist statistics. The likelihood function can, however, serve as the foundation for an attractive variant of frequentist statistics sometimes called likelihood statistics. We will first discuss the definition and meaning of the likelihood function, giving some examples of its use and abuse - most notably in the so-called prosecutor's fallacy. Maximum likelihood estimation is the aspect of likelihood statistics familiar to most people. When data points are known to have Gaussian probability distributions, maximum likelihood parameter estimation leads directly to least-squares estimation. When the data points have non-Gaussian distributions, least-squares estimation is no longer appropriate. We will show how the maximum likelihood principle leads to logical alternatives to least squares estimation for non-Gaussian distributions, taking the Poisson distribution as an example. The likelihood ratio is the ratio of the likelihoods of, for example, two hypotheses or two parameters. Likelihood ratios can be treated much like un-normalized probability distributions, greatly extending the applicability and utility of likelihood statistics. Likelihood ratios are prone to the same complexities that afflict posterior probability distributions in Bayesian statistics. We will show how meaningful information can be extracted from likelihood ratios by the Laplace approximation, by marginalizing, or by Markov chain Monte Carlo sampling.
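The Poisson case can be made concrete: maximizing the Poisson log-likelihood over a constant rate recovers the sample mean. A small sketch follows; the grid search and the crude exponential-waiting-time Poisson sampler are illustrative conveniences, not part of the source:

```python
import math, random

def poisson_loglik(lam, counts):
    # log L(lam) = sum_i [ k_i log(lam) - lam - log(k_i!) ]
    return sum(k * math.log(lam) - lam - math.lgamma(k + 1) for k in counts)

def rpois(lam):
    """Crude Poisson sampler: count exponential arrivals in a unit interval."""
    n, t = 0, random.expovariate(lam)
    while t < 1.0:
        n += 1
        t += random.expovariate(lam)
    return n

random.seed(2)
counts = [rpois(3.0) for _ in range(500)]
grid = [0.5 + 0.01 * k for k in range(600)]
lam_hat = max(grid, key=lambda lam: poisson_loglik(lam, counts))
mean = sum(counts) / len(counts)
# lam_hat coincides (to grid resolution) with the sample mean -- the
# maximum-likelihood answer, not a transformed least-squares fit.
```

For non-Gaussian data like these counts, the Poisson log-likelihood, not a least-squares criterion, is the principled objective.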
Glenn, W.E.; Hulen, J.B.; Nielson, D.L.
1981-02-01
Utah State Geothermal Well 9-1 in the Roosevelt Hot Springs KGRA, Beaver County, Utah, has been donated by Phillips Petroleum Company for calibration and testing of well-logging equipment in the hot, corrosive, geothermal environment. It is the second Calibration/Test Well (C/T-2) in the Geothermal Log Interpretation Program. A study of cuttings and well logs from Well C/T-2 was completed. This synthesis and data presentation contains most of the subsurface geologic information needed to effect the total evaluation of geophysical logs acquired in this geothermal calibration/test well, C/T-2.
Evans, D G; Woodward, E R; Howell, S J; Verhoef, S; Howell, A; Lalloo, F
2017-04-01
To assess the need for adjustment in the likelihood of germline BRCA1/2 mutations in women with HER2+ breast cancers, we analysed primary mutation screens on women with breast cancer with unequivocal HER2 overexpression and assessed the likelihood of BRCA1/BRCA2 mutations by age, oestrogen receptor status and Manchester score. Of 1111 primary BRCA screens with confirmed HER2 status, only 4/161 (2.5%) of women with HER2 amplification had a BRCA1 mutation identified and 5/161 (3.1%) a BRCA2 mutation. Applying the pathology-adjusted Manchester score at the 10-19% and 20%+ thresholds resulted in detection rates of only 6.5% and 15%, respectively. BOADICEA examples appeared to make even less downward adjustment. There is a very low detection rate of BRCA1 and BRCA2 mutations in women with HER2-amplified breast cancers. The Manchester score and BOADICEA do not make sufficient downward adjustment for HER2 amplification. For unaffected women, assessment of breast cancer risk and BRCA1/2 probability should take into account the pathology of the most relevant close relative. Unaffected women undergoing mutation testing for BRCA1/2 should be advised that there is limited reassurance from a negative test result if their close relative had a HER2+ breast cancer.
Goodall, John
2012-05-21
Log files are typically semi-structured or unstructured. To be usable for visualization and machine learning, they need to be parsed into a standard, structured format. Log-tool is a tool for facilitating the parsing, structuring, and routing of log files (e.g., intrusion detection logs, web server logs, system logs). It consists of three main components: (1) Input: reads data from files, standard input, and syslog; (2) Parser: parses the log file into structured data (JSON format) based on regular expressions; (3) Output: writes structured data to commonly used formats and destinations, including Redis (a database), standard output, and syslog.
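The parser stage, component (2), can be sketched with a regular expression whose named groups become JSON fields. The log format and field names below are hypothetical, since the abstract does not specify the expressions Log-tool ships with:

```python
import json, re

# Named groups turn one (hypothetical) web-server log line into structured JSON.
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+) (?P<size>\d+)')

def parse_line(line):
    """Return the line as a JSON string, or None if it does not match."""
    m = LINE_RE.match(line)
    return json.dumps(m.groupdict()) if m else None

record = parse_line('127.0.0.1 - - [21/May/2012:10:00:00 +0000] '
                    '"GET /index.html HTTP/1.0" 200 4523')
```

The resulting JSON string can then be routed to any of the output destinations (Redis, standard output, syslog).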
Maximum Likelihood, Profile Likelihood, and Penalized Likelihood: A Primer
Cole, Stephen R.; Chu, Haitao; Greenland, Sander
2014-01-01
The method of maximum likelihood is widely used in epidemiology, yet many epidemiologists receive little or no education in the conceptual underpinnings of the approach. Here we provide a primer on maximum likelihood and some important extensions which have proven useful in epidemiologic research, and which reveal connections between maximum likelihood and Bayesian methods. For a given data set and probability model, maximum likelihood finds values of the model parameters that give the observed data the highest probability. As with all inferential statistical methods, maximum likelihood is based on an assumed model and cannot account for bias sources that are not controlled by the model or the study design. Maximum likelihood is nonetheless popular, because it is computationally straightforward and intuitive and because maximum likelihood estimators have desirable large-sample properties in the (largely fictitious) case in which the model has been correctly specified. Here, we work through an example to illustrate the mechanics of maximum likelihood estimation and indicate how improvements can be made easily with commercial software. We then describe recent extensions and generalizations which are better suited to observational health research and which should arguably replace standard maximum likelihood as the default method. PMID:24173548
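The shrinkage effect of penalized likelihood can be shown with a one-parameter example: a binomial proportion with a symmetric penalty equivalent to adding pseudo-counts. This is an illustration in the spirit of the primer, not the article's own worked example:

```python
import math

def loglik(p, events, n):
    # Binomial log-likelihood (up to a constant) for a proportion p
    return events * math.log(p) + (n - events) * math.log(1 - p)

def penalized(p, events, n, m=2):
    # Symmetric penalty: equivalent to adding m/2 pseudo-events of each type,
    # pulling the estimate toward 1/2
    return loglik(p, events, n) + (m / 2) * math.log(p) + (m / 2) * math.log(1 - p)

grid = [k / 1000 for k in range(1, 1000)]
events, n = 2, 10
p_ml = max(grid, key=lambda p: loglik(p, events, n))        # 2/10 = 0.20
p_pen = max(grid, key=lambda p: penalized(p, events, n))    # 3/12 = 0.25
```

With 2 events in 10 trials, the ordinary maximum likelihood estimate is 0.20 while the penalized estimate is shrunk to 0.25, the behavior that makes penalization attractive for sparse observational data.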
The likelihood ratio and frequency of DQ2/DQ8 haplotypes in Iranian patients with celiac disease
Khosravi, Asghar; Mansouri, Masoume; Rostami-Nejad, Mohammad; Shahbazkhani, Bijan; Ekhlasi, Golnaz; Kalantari, Ebrahim
2016-01-01
Aim: The aim of this study was to evaluate the likelihood ratio and frequency of DQ2 and DQ8 in Iranian patients with celiac disease (CD). Background: HLA DQ2 and HLA DQ8 are important mediators in the development of celiac disease. A few studies have evaluated the frequency of the HLA DQ2 and HLA DQ8 haplotypes in the Iranian population, with low sample sizes. Patients and methods: In this cross-sectional study, to predict HLA-DQ2 and DQ8 haplotypes, 141 (73 male, 78 female) confirmed CD patients and 151 healthy controls were enrolled during 2013-2014. HLA DQ2/DQ8 haplotypes were determined in cases and controls using the PCR-SSP technique. Results: DQ2 and DQ8 were positive in 80% (n=111) and 49% (n=69) of CD patients and in 36% (n=61) and 13% (n=21) of the control group, respectively. Moreover, about one-third of CD patients (32%, n=45) carried both DQ2 and DQ8 heterodimers, compared with only 5.3% (n=8) of the control group. In addition, the positive likelihood ratios of DQ2 and DQ8 were 1.74 (CI: 1.4-2.1) and 2.6 (CI: 1.8-2.7), respectively. Conclusion: The results of this study showed that the frequency of DQ8 in our population is higher than that reported in European countries, but close to that found in South America and the Middle East. This suggests that the prevalence of the HLA DQ8 pattern in Iranian CD patients is similar to that in non-European patients. PMID:26744610
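The positive likelihood ratios quoted above follow the standard definition LR+ = sensitivity / (1 − specificity). A minimal sketch with hypothetical counts (not the study's data):

```python
def positive_lr(tp, fn, fp, tn):
    """Positive likelihood ratio of a binary marker from 2x2 counts:
    LR+ = sensitivity / (1 - specificity)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity / (1 - specificity)

# Hypothetical marker: present in 80 of 100 cases and 30 of 100 controls
lr = positive_lr(tp=80, fn=20, fp=30, tn=70)   # 0.8 / 0.3
```

An LR+ near 2, as reported here for DQ2, means a positive result roughly doubles the odds of disease relative to the pre-test odds.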
Augmented Likelihood Image Reconstruction.
Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M
2016-01-01
The presence of high-density objects remains an open problem in medical CT imaging. Data from projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim of reducing these artifacts by incorporating information about the shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The aforementioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During the iterations, transient artifacts are reduced with a bilateral filter, and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.
Good, Jessica
2013-08-01
This is an executable python script which offers two different conversions for well log data: 1) Conversion from a BoreholeLASLogData.xls model to a LAS version 2.0 formatted XML file. 2) Conversion from a LAS 2.0 formatted XML file to an entry in the WellLog Content Model. Example templates for BoreholeLASLogData.xls and WellLogsTemplate.xls can be found in the package after download.
Paillet, Frederick L.; Morin, R.H.
1988-01-01
A complete suite of conventional geophysical well logs was obtained in the upper part of a 3220-m-deep borehole drilled into geothermally altered alluvial sediments on the southeastern edge of the Salton Sea. Geophysical logs obtained in the State 2-14 borehole indicate that neutron porosity, gamma-gamma, and deep-induction logs provide useful information on lithologic trends with depth. The natural gamma log contains almost continuous, high-frequency fluctuations that obscure lithologic trends and that may be related to recent radioisotope redistribution and departure from radiometric equilibrium. Acoustic transit time logs give unrealistically low in situ compressional velocities ranging from 1.8 to 3.0 km/s, whereas acoustic waveform logs indicate that sediment compressional velocities range from less than 3.0 km/s shallower than 1000 m in depth to almost 5.0 km/s at depths greater than 2000 m. Analyses indicate that most log values lie between two lithologic end points: an electrically conductive claystone with moderate neutron porosity, but no effective porosity, and an electrically nonconductive, fully cemented siltstone that has small but finite porosity. -from Authors
Banerjee, Buddhananda; Biswas, Atanu
2014-01-01
Use of surrogate outcome to improve the inference in biomedical problems is an area of growing interest. Here, we consider a setup where both the true and surrogate endpoints are binary and we observe all the surrogate endpoints along with a few true endpoints. In a two-treatment setup we study the surrogate-augmented Mantel-Haenszel estimator based on observations from different groups when the group effect is present. We compare the Mantel-Haenszel estimator with the one obtained by maximizing profile likelihood in a surrogate augmented setup. We observe that the performances of these estimators are very close.
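The Mantel-Haenszel estimator referred to above combines stratified 2×2 tables into one common odds ratio. A minimal sketch with toy data follows; the surrogate-augmentation developed in the paper is not reproduced here:

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel common odds-ratio over K strata, each a 2x2 table
    (a, b, c, d) with stratum size n = a + b + c + d:
    OR_MH = sum_k(a_k d_k / n_k) / sum_k(b_k c_k / n_k)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Two toy strata (e.g. two centers), each with treatment/control outcomes
tables = [(10, 5, 4, 11), (8, 7, 6, 9)]
or_mh = mantel_haenszel_or(tables)
```

Stratifying this way protects the pooled estimate from confounding by the group effect, which is the setting the paper studies.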
NASA Technical Reports Server (NTRS)
Klebesadel, R. W.; Fenimore, E. E.; Laros, J.
1983-01-01
The log N-log S data acquired by the Pioneer Venus Orbiter Gamma Burst Detector (PVO) are presented and compared to similar data from the Soviet KONUS experiment. Although the PVO data are consistent with and suggestive of a -3/2 power-law distribution, the results are not adequate at this stage of observation to differentiate between a -3/2 and a -1 power-law slope.
Condition and fate of logged forests in the Brazilian Amazon.
Asner, Gregory P; Broadbent, Eben N; Oliveira, Paulo J C; Keller, Michael; Knapp, David E; Silva, José N M
2006-08-22
The long-term viability of a forest industry in the Amazon region of Brazil depends on the maintenance of adequate timber volume and growth in healthy forests. Using extensive high-resolution satellite analyses, we studied the forest damage caused by recent logging operations and the likelihood that logged forests would be cleared within 4 years after timber harvest. Across 2,030,637 km2 of the Brazilian Amazon from 1999 to 2004, at least 76% of all harvest practices resulted in high levels of canopy damage sufficient to leave forests susceptible to drought and fire. We found that 16+/-1% of selectively logged areas were deforested within 1 year of logging, with a subsequent annual deforestation rate of 5.4% for 4 years after timber harvests. Nearly all logging occurred within 25 km of main roads, and within that area, the probability of deforestation for a logged forest was up to four times greater than for unlogged forests. In combination, our results show that logging in the Brazilian Amazon is dominated by highly damaging operations, often followed rapidly by deforestation decades before forests can recover sufficiently to produce timber for a second harvest. Under the management regimes in effect at the time of our study in the Brazilian Amazon, selective logging would not be sustained.
Wang, Zhi-qiang; Ren, Chao; Wang, De-shen; Zhang, Dong-sheng; Luo, Hui-yan; Li, Yu-hong; Xu, Rui-hua
2012-01-01
Background: In this study, we established a hypothetical tumor-lodds-metastasis (TLM) and tumor-ratio-metastasis (TRM) staging system. Moreover, we compared them with the 7th edition of the American Joint Committee on Cancer tumor-nodes-metastasis (AJCC TNM) staging system in gastric cancer patients after D2 resection. Methods: A total of 1000 gastric carcinoma patients receiving treatment in our center were selected for the analysis. Finally, 730 patients who received D2 resection were retrospectively studied. Patients were staged using the TLM, TRM and the 7th edition AJCC TNM systems. Survival analysis was performed with a Cox regression model. We used two measures to compare the TNM, TRM and TLM staging systems: the −2 log-likelihood and the hazard ratio. Results: The cut points of the lymph node ratio (LNR) were set at 0, 0–0.3, 0.3–0.6 and 0.6–1.0, and for the log odds of positive lymph nodes (LODDS) they were established as ≤−0.5, −0.5–0, 0–0.5 and >0.5. There were significant differences in survival among patients in different LODDS classifications for each pN or LNR group. When stratified by the LODDS classifications, the prognosis was highly homologous between those in the corresponding pN or LNR classifications. Multivariate analysis showed that the TLM staging system was better than the TRM or TNM system for prognostic evaluation. Conclusions: The TLM system was superior to the TRM or TNM system for prognostic assessment of gastric adenocarcinoma patients after D2 resection. PMID:22348125
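LODDS is commonly defined as the log odds of positive to negative examined nodes, with 0.5 added to each count to avoid division by zero; the abstract does not spell out its exact formula, so the definition and the base-10 logarithm below are assumptions, and the function names are illustrative:

```python
import math

def lodds(positive, examined):
    """One common LODDS definition: log10((pos + 0.5) / (neg + 0.5)).
    The 0.5 continuity correction and log base are assumptions here."""
    negative = examined - positive
    return math.log10((positive + 0.5) / (negative + 0.5))

def lodds_class(positive, examined):
    """Classify a patient by the paper's cut points: <= -0.5, -0.5-0, 0-0.5, > 0.5."""
    v = lodds(positive, examined)
    if v <= -0.5:
        return 1
    if v <= 0.0:
        return 2
    if v <= 0.5:
        return 3
    return 4
```

Unlike the raw node ratio, LODDS still discriminates among patients with zero or all-positive nodes, which is one rationale for the TLM system.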
NASA Technical Reports Server (NTRS)
Perez, Raul Garcia
1993-01-01
The Ulysses Log tells the story of some intriguing problems that we (=The Spacecraft Team) have encountered. Ulysses was launched on 6 Oct. 1990, and it made the fastest trip to Jupiter (8 Feb. 1992). It is presently going out of the ecliptic. This paper presents log entries from the following areas: (1) ingenious maneuvers; (2) telecommunication problems; and (3) surprises.
CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs
Olea, R.A.
2004-01-01
The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams.
Harrell-Williams, Leigh; Wolfe, Edward W
2014-01-01
Previous research has investigated the influence of sample size, model misspecification, test length, ability distribution offset, and generating model on the likelihood ratio difference test in applications of item response models. This study extended that research to the evaluation of dimensionality using the multidimensional random coefficients multinomial logit model (MRCMLM). Logistic regression analysis of simulated data reveals that sample size and test length have a large effect on the capacity of the LR difference test to correctly identify unidimensionality, with shorter tests and smaller sample sizes leading to smaller Type I error rates. Higher levels of simulated misfit resulted in fewer incorrect decisions than data with no or little misfit. However, Type I error rates indicate that the likelihood ratio difference test is not suitable under any of the simulated conditions for evaluating dimensionality in applications of the MRCMLM.
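The likelihood ratio difference test compares nested models by referring twice the log-likelihood gap to a chi-square distribution whose degrees of freedom equal the difference in parameter counts. A minimal sketch with hypothetical fitted log-likelihoods (df = 1; the closed-form chi-square survival function below holds only for df = 1):

```python
import math

def lr_test_stat(loglik_restricted, loglik_full):
    """Likelihood ratio statistic: 2 * (logL_full - logL_restricted)."""
    return 2.0 * (loglik_full - loglik_restricted)

def chi2_sf_df1(x):
    """Survival function of chi-square with 1 df: P(X > x) = 1 - erf(sqrt(x/2))."""
    return 1.0 - math.erf(math.sqrt(x / 2.0))

# Hypothetical fitted log-likelihoods for a unidimensional (restricted)
# and multidimensional (full) model
stat = lr_test_stat(-105.2, -102.9)   # 4.6
p = chi2_sf_df1(stat)                 # about 0.03
```

The study's point is that this nominal chi-square reference can be badly calibrated for dimensionality decisions in the MRCMLM, so such p-values should not be taken at face value there.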
Inductive resistivity logging in steel-cased boreholes. SBIR Phase 2 progress report
Wilt, M.
1999-07-01
SBIR Phase 2 project 40145-97-I calls for the design and construction of a prototype inductive logging device to measure formation resistivity from within a steel-cased borehole. The SCIL (Steel Casing Induction Logger) tool is intended for reservoir characterization and process monitoring in an oil field environment. This report summarizes findings from the initial project period. In this phase, bench model measurements were made to test casing compensation schemes, numerical models were calculated to optimize the tool configuration and associated formation sensitivity and the preliminary design of the tool was completed. The bench tests constitute fundamental research on determining the characteristics of steel well casing and on developing means of separating the effects of the casing and the formation. This technology is crucial to the success of the project and significant progress has been made towards the goal of recovering the formation resistivity from inside the casing. Next, a series of sensitivity and tool configuration studies have been completed through partner Dr. David Alumbaugh at Sandia National Laboratories. These numerical results help to optimize the tool configuration and allow one to calculate the expected formation sensitivity. These models are preliminary to data interpretation software to be developed in the next project period. The initial hardware design of the tool has been completed, and ordering parts has begun for later manufacture and assembly. The tool, which is designed for maximum flexibility of deployment, will have a powerful transmitter, an array of three component sensors and sufficient dynamic range to operate in standard oil field steel-cased boreholes.
Vos, J; Oosterwijk, J C; Gómez-García, E; Menko, F H; Jansen, A M; Stoel, R D; van Asperen, C J; Tibben, A; Stiggelbout, A M
2011-03-01
Previous studies on the counsellees' perception of DNA test results did not clarify whether counsellees were asked about their recollections or interpretations, and focused only on patients' own risks and not on the likelihood that cancer is heritable in the family. We tested differences and correlations of four perception aspects: recollections and interpretations of both cancer risks and heredity likelihood. In a retrospective study, women tested for BRCA1/2, on average, 5 years earlier completed questionnaires about their perception. Participants had received an unclassified variant (n = 76), uninformative (n = 76) or pathogenic mutation (n = 51) result in BRCA1/2. Analyses included t-tests, correlations and structural equation modelling. The counsellees' perception was shown to consist of four distinctive phenomena: recollections and interpretations of cancer risks and of heredity likelihood. This distinctiveness was suggested by significant differences between these perception variables. Moderate to strong correlations were found between these variables, suggesting that the differences between variables were consistent. The relationships between these variables were not influenced by the actually communicated DNA test results, sociodemographics, medical and pedigree information, or the framing of cancer risk questions. The largest differences between recollections and interpretations were found in the unclassified variant group and the smallest in uninformatives. Cancer risks and heredity likelihood correlated least in the pathogenic mutation group. Communication of ambiguous genetic information enlarged the differences. To understand the counsellees' perception of genetic counselling, researchers should study recollections and interpretations of cancer risks and heredity likelihood. Genetic counsellors should explicitly address the counsellees' recollections and interpretations, and be aware of possible inaccuracies.
Electoral Outcomes and Social Log-Likelihood Maxima
1979-10-01
[Garbled OCR fragment: a mixture-density expression of the form Σ_k f(x; θ_k) over a partition I_1, ..., I_m of x, with θ_k ∈ I_k, followed by footnote text citing Corollary 5.9 in Bartle [1966] together with regularity conditions on f(x; θ), and references to Arrow and Uzawa (eds.) [1950], Studies in Linear and Nonlinear Programming, Stanford, California: Stanford University Press, and Bartle, R. [1966].]
Clow, G.D.; Saltus, R.W.; Waddington, E.D.
1996-01-01
We describe a high-precision (0.1-1.0 mK) borehole-temperature (BT) logging system developed at the United States Geological Survey (USGS) for use in remote polar regions. We discuss calibration, operational and data-processing procedures, and present an analysis of the measurement errors. The system is modular to facilitate calibration procedures and field repairs. By interchanging logging cables and temperature sensors, measurements can be made in either shallow air-filled boreholes or liquid-filled holes up to 7 km deep. Data can be acquired in either incremental or continuous-logging modes. The precision of data collected by the new logging system is high enough to detect and quantify various thermal effects at the milli-Kelvin level. To illustrate this capability, we present sample data from the 3 km deep borehole at GISP2, Greenland, and from a 130 m deep air-filled hole at Taylor Dome, Antarctica. The precision of the processed GISP2 continuous temperature logs is 0.25-0.34 mK, while the accuracy is estimated to be 4.5 mK. The effects of fluid convection and the dissipation of the thermal disturbance caused by drilling the borehole are clearly visible in the data. The precision of the incremental Taylor Dome measurements varies from 0.11 to 0.32 mK, depending on the wind strength during the experiments. With this precision, we found that temperature fluctuations and multi-hour trends in the BT measurements correlate well with atmospheric-pressure changes.
Addendum 2: Logs of monitor wells drilled May 1988 through December 1992
Stout, J.; Qualheim, B.; McPherrin, R.; Barber, K.; Hedegaard, R.; McConihe, W.; Miller, T.
1993-11-01
The logs in this addendum were plotted in a new format by the same software package (LOGGER by Rockware, Denver, CO) that was used in the original publication. The scale remains the same, 1 inch = 15 feet. The header is totally automated with a subheading indexing the well-construction symbols. Geophysical curves are labeled in their respective channels, and percentage core recovery is plotted in a histogram. Lithologic symbols are plotted to scale in a channel similar to previous logs. The lithologic description also has been automated to assure consistency in terminology. Descriptions are more extensive and are referenced by leader lines to the lithologic symbol. Additional figures included in this addendum are: a plot of all the monitoring well locations at the LLNL Main site and a plot detailing the gasoline spill area well locations in the vicinity of Building 403.
Whiley, Phillip J.; Parsons, Michael T.; Leary, Jennifer; Tucker, Kathy; Warwick, Linda; Dopita, Belinda; Thorne, Heather; Lakhani, Sunil R.; Goldgar, David E.; Brown, Melissa A.; Spurdle, Amanda B.
2014-01-01
Rare exonic, non-truncating variants in known cancer susceptibility genes such as BRCA1 and BRCA2 are problematic for genetic counseling and clinical management of relevant families. This study used multifactorial likelihood analysis and/or bioinformatically-directed mRNA assays to assess the pathogenicity of 19 BRCA1 or BRCA2 variants identified following patient referral to clinical genetic services. Two variants were considered to be pathogenic (Class 5). BRCA1:c.4484G>C (p.Arg1495Thr) was shown to result in aberrant mRNA transcripts predicted to encode truncated proteins. The BRCA1:c.122A>G (p.His41Arg) RING-domain variant was found from multifactorial likelihood analysis to have a posterior probability of pathogenicity of 0.995, a result consistent with existing protein functional assay data indicating lost BARD1 binding and ubiquitin ligase activity. Of the remaining variants, seven were determined to be not clinically significant (Class 1), nine were likely not pathogenic (Class 2), and one was uncertain (Class 3). These results have implications for genetic counseling and medical management of families carrying these specific variants. They also provide additional multifactorial likelihood variant classifications as a reference to evaluate the sensitivity and specificity of bioinformatic prediction tools and/or functional assay data in future studies. PMID:24489791
Maximum Likelihood and Bayesian Parameter Estimation in Item Response Theory.
ERIC Educational Resources Information Center
Lord, Frederic M.
There are currently three main approaches to parameter estimation in item response theory (IRT): (1) joint maximum likelihood, exemplified by LOGIST, yielding maximum likelihood estimates; (2) marginal maximum likelihood, exemplified by BILOG, yielding maximum likelihood estimates of item parameters (ability parameters can be estimated…
A per-pixel Log2ADC for high dynamic range, 1000FPS digital focal plane arrays (DFPA)
NASA Astrophysics Data System (ADS)
Petilli, Eugene
2016-09-01
Intrinsix has developed a Digital Focal Plane Array (DFPA) architecture based on a novel piecewise linear Log2 ADC (LADC) with "lossless" analog compression, which enables ultra-high dynamic range ROICs that use less power than other extended dynamic range technologies. The LADC provides a dynamic range of 126 dB with a constant 75 dB SNR over the entire frame. The companding per-pixel LADCs, with a 13-bit mantissa and 3-bit radix, compress the 21-bit signals into efficient 16-bit data words. The Read Out IC (ROIC) is compatible with most IR and LWIR detectors, including two-color SLS (photodiode) and microbolometers. The DFPA architecture leverages two (staggered frame prime and redundant) MIPI CSI-3 interfaces to achieve full HD DFPA at 1000 frames/sec, an equivalent uncompressed data rate of 100 Gb/sec. The LADC uses direct injection into a moderately sized integrating capacitor, and several comparators create a stream of multi-bit data values. These values are accumulated in an SRAM-based log2ALU, and the radix of the ALU is combined with the data to generate a feedback current to the integrating capacitor, closing the delta loop. The integration time and a single-pole low-pass IIR filter are configurable using control signals to the log2ALU. The feedback current is at least partially generated using PWM for high linearity.
Correlation of lithologic and sonic logs from the COST No. B-2 well with seismic reflection data
King, K.C.
1979-01-01
The purpose of this study was to correlate events recorded on seismic records with changes in lithology recorded from sample descriptions from the Continental Offshore Stratigraphic Test (COST) No. B-2 well. The well is located on the U.S. mid-Atlantic Outer Continental Shelf about 146 km east of Atlantic City, N.J. (see location map). Lithologic data are summarized from the sample descriptions of Smith and others (1976). Sonic travel times were read at 0.15 m intervals in the well using a long-space sonic logging tool. Interval velocities, reflection coefficients and a synthetic seismogram were calculated from the sonic log.
Grid Logging: Best Practices Guide
Tierney, Brian L; Tierney, Brian L; Gunter, Dan
2008-04-01
The purpose of this document is to help developers of Grid middleware and application software generate log files that will be useful to Grid administrators, users, developers, and the Grid middleware itself. Currently, most generated log files are useful only to the author of the program. Good logging practices are instrumental to performance analysis, problem diagnosis, and security auditing tasks such as incident tracing and damage assessment. This document does not discuss the issue of a logging API; it is assumed that a standard log API such as syslog (C), log4j (Java), or logger (Python) is being used, though a custom logging API or even printf could also serve. The key point is that the logs must contain the required information in the required format. At a high level of abstraction, the best practices for Grid logging are: (1) consistently structured, typed log events; (2) a standard high-resolution timestamp; (3) use of logging levels and categories to separate logs by detail and purpose; (4) consistent use of global and local identifiers; and (5) use of some regular, newline-delimited ASCII text format. The rest of this document describes each of these recommendations in detail.
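Taken together, these five practices can be sketched as a single event-formatting helper (a hypothetical illustration, not code from the guide; the field names and the `format_event` helper are invented for this example):

```python
import time
import uuid

def format_event(level, event, **fields):
    """One newline-free ASCII line: typed key=value pairs, ISO timestamp, level."""
    now = time.time()
    ts = time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(now))
    ts += ".%06dZ" % (int(now * 1e6) % 1_000_000)  # microsecond resolution
    parts = ["ts=%s" % ts, "level=%s" % level, "event=%s" % event]
    parts += ["%s=%s" % (k, v) for k, v in sorted(fields.items())]
    return " ".join(parts)

# A global job identifier ties events from different components together.
line = format_event("INFO", "transfer.start",
                    job_id=uuid.uuid4().hex, bytes=1048576)
```

Each emitted line is self-describing and trivially parseable by both humans and middleware, which is the point of recommendations (1) and (5).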
Atwood, J.W.; Killpack, T.J.; Glenn, W.E.; Nutter, C.
1980-11-01
WELLOG is a software system that has been developed to plot digitized well log data in a manner suitable for analysis. Multiple logs can be plotted side by side for correlation analysis, and up to three logs can be plotted on a cross plot. Data entry, editing, and modification functions are also provided by the program. Digitizing is accomplished by a TEKTRONIX 4954 (on-line) digitizing tablet, and plotting is done on a TEKTRONIX 4014 graphics terminal, a STATOS 42 electrostatic plotter, or a CALCOMP pen plotter using a device-independent plotting system. This program (WELLOG.REV2) is not as system-dependent as the former version (WELLOG.REV1): the user must supply a program to digitize the data and subroutines to interface the program with the file-manipulation and plotting routines of their system. One major improvement is the use of an on-line digitizing system whereby the program accesses disk files rather than reading the data from tape. In REV2 the merge file has been automated so that it is initialized automatically upon creation and is also delete-protected. The randomly spaced data capabilities have been greatly improved, allowing averaging and cross plotting of the data. Routines have been added that allow all of the cross plots except the Z-plot to be printed on a line printer. Dresser Atlas' A-K plot has also been added. The program is almost completely self-contained, needing only a few interfacing and system subroutines.
EthoLog 2.2: a tool for the transcription and timing of behavior observation sessions.
Ottoni, E B
2000-08-01
EthoLog is a tool that aids in the transcription and timing of behavior observation sessions, whether experimental or naturalistic, from video/audio tapes or in real time. It was created with Visual Basic and runs on Windows (3.x/9x). The user types the key codes for the predefined behavioral categories, and EthoLog registers their sequence and timing and saves the resulting data in ASCII output files. A sequential analysis matrix can be generated from the sequential data. The output files may be edited, converted to plain text files for printing, or exported to a spreadsheet program, such as MS Excel, for further analyses.
Collett, T.S.
1999-01-01
The JAPEX/JNOC/GSC Mallik 2L-38 gas hydrate research well project was designed to investigate the occurrence of in situ natural gas hydrate in the Mallik area of the Mackenzie Delta of Canada. Because gas hydrate is unstable at surface pressure and temperature conditions, a major emphasis was placed on the downhole logging program to determine the in situ physical properties of the gas-hydrate-bearing sediments. Downhole logging tool strings deployed in the Mallik 2L-38 well included the Schlumberger Platform Express with a high resolution laterolog, Array Induction Imager Tool, Dipole Shear Sonic Imager, and a Fullbore Formation MicroImager. The downhole log data obtained from the log- and core-inferred gas-hydrate-bearing sedimentary interval (897.25-1109.5 m log depth) in the Mallik 2L-38 well is depicted in a series of well displays. Also shown are numerous reservoir parameters, including gas hydrate saturation and sediment porosity log traces, calculated from available downhole well-log and core data. The gas hydrate accumulation delineated by the Mallik 2L-38 well has been determined to contain as much as 4.15 × 10^9 m3 of gas in the 1 km2 area surrounding the drill site.
Maximum Likelihood Fusion Model
2014-08-09
Keywords: data fusion, hypothesis testing, maximum likelihood estimation, mobile robot navigation.
Figure 1.1: Illustration of mobile robotic agents; land rovers such as (left) Pioneer robots and (center) Segways.
Barrett, Harrison H.; White, Timothy; Parra, Lucas C.
2010-01-01
As photon-counting imaging systems become more complex, there is a trend toward measuring more attributes of each individual event. In various imaging systems the attributes can include several position variables, time variables, and energies. If more than about four attributes are measured for each event, it is not practical to record the data in an image matrix. Instead it is more efficient to use a simple list where every attribute is stored for every event. It is the purpose of this paper to discuss the concept of likelihood for such list-mode data. We present expressions for list-mode likelihood with an arbitrary number of attributes per photon and for both preset counts and preset time. Maximization of this likelihood can lead to a practical reconstruction algorithm with list-mode data, but that aspect is covered in a separate paper [IEEE Trans. Med. Imaging (to be published)]. An expression for lesion detectability for list-mode data is also derived and compared with the corresponding expression for conventional binned data. PMID:9379247
An O(log sup 2 N) parallel algorithm for computing the eigenvalues of a symmetric tridiagonal matrix
NASA Technical Reports Server (NTRS)
Swarztrauber, Paul N.
1989-01-01
An O(log sup 2 N) parallel algorithm is presented for computing the eigenvalues of a symmetric tridiagonal matrix using a parallel algorithm for computing the zeros of the characteristic polynomial. The method is based on a quadratic recurrence in which the characteristic polynomial is constructed on a binary tree from polynomials whose degree doubles at each level. Intervals that contain exactly one zero are determined by the zeros of polynomials at the previous level which ensures that different processors compute different zeros. The exact behavior of the polynomials at the interval endpoints is used to eliminate the usual problems induced by finite precision arithmetic.
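As a point of comparison, the serial zero-counting idea underlying such bisection-style eigenvalue methods can be sketched as follows (a minimal illustration only; the paper's O(log² N) binary-tree construction of the characteristic polynomial is not reproduced here):

```python
# Sturm-sequence zero counting: q tracks the sign pattern of the leading
# principal minors of (T - x I); each negative q contributes one eigenvalue
# below x. `d` holds the diagonal, `e` the off-diagonal of tridiagonal T.
def count_eigs_below(d, e, x):
    count, q = 0, 1.0
    for i in range(len(d)):
        off = e[i - 1] ** 2 / q if i > 0 else 0.0
        q = d[i] - x - off
        if q == 0.0:
            q = 1e-300  # nudge off exact zeros to keep the recurrence defined
        if q < 0.0:
            count += 1
    return count

def eigenvalue(d, e, k, lo, hi, tol=1e-10):
    """k-th smallest eigenvalue (k = 0, 1, ...) isolated by bisection."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if count_eigs_below(d, e, mid) > k:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

For the 2×2 matrix with diagonal [2, 2] and off-diagonal [1], this isolates the eigenvalues 1 and 3; because the count function assigns each zero to a unique interval, different processors can compute different zeros independently, which is the property the parallel algorithm exploits.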
SPT Lensing Likelihood: South Pole Telescope CMB lensing likelihood code
NASA Astrophysics Data System (ADS)
Feeney, Stephen M.; Peiris, Hiranya V.; Verde, Licia
2014-11-01
The SPT lensing likelihood code, written in Fortran90, evaluates a Gaussian likelihood of the lensing potential power spectrum, using a file from CAMB (ascl:1102.026) that contains the normalization required to obtain the power spectrum the likelihood call expects.
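A generic Gaussian band-power likelihood of this kind can be sketched as follows (a simplified Python illustration rather than the package's Fortran90; the actual SPT binning, covariance, and CAMB normalization file are not reproduced):

```python
import numpy as np

def gaussian_loglike(data, theory, cov):
    """ln L = -0.5 [ (d-t)^T C^-1 (d-t) + ln det C + n ln(2 pi) ]."""
    resid = data - theory
    _, logdet = np.linalg.slogdet(cov)          # stable log-determinant
    chi2 = resid @ np.linalg.solve(cov, resid)  # avoids forming C^-1 explicitly
    return -0.5 * (chi2 + logdet + len(data) * np.log(2.0 * np.pi))
```

Here `data` would hold the measured lensing band powers, `theory` the CAMB-normalized model spectrum, and `cov` the band-power covariance; all three names are placeholders for this sketch.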
DALI: Derivative Approximation for LIkelihoods
NASA Astrophysics Data System (ADS)
Sellentin, Elena
2015-07-01
DALI (Derivative Approximation for LIkelihoods) is a fast approximation of non-Gaussian likelihoods. It extends the Fisher Matrix in a straightforward way and allows for a wider range of posterior shapes. The code is written in C/C++.
NASA Astrophysics Data System (ADS)
Kong, Gyuyeol; Choi, Sooyong
2012-08-01
A turbo equalization scheme for holographic data storage (HDS) systems is proposed. The proposed turbo equalization procedure is conducted between a one-dimensional (1D) partial response maximum likelihood (PRML) detector and the joint Viterbi decoder by exchanging a priori and extrinsic information. In the joint Viterbi decoder, the modulation and convolutional decoding are performed simultaneously by mapping a 2/4 modulation symbol onto the trellis of the convolutional code, reducing the complexity of the decoding procedure and improving the decoding capability of the iterative equalization and decoding. In addition, since the channel model is described as a two-dimensional convolution in HDS systems, the 1D PRML detector operates in the vertical direction and the joint Viterbi decoder in the horizontal direction to maximize the performance gains. Simulation results show that the proposed turbo equalization scheme achieves better bit-error-rate performance as the number of iterations increases.
Real Time Mud Gas Logging During Drilling of DFDP-2B
NASA Astrophysics Data System (ADS)
Mathewson, L. A.; Toy, V.; Menzies, C. D.; Zimmer, M.; Erzinger, J.; Niedermann, S.; Cox, S.
2015-12-01
The Deep Fault Drilling Project (DFDP) aims to improve our understanding of the Alpine Fault Zone, a tectonically active mature fault system in New Zealand known to rupture in large events, by deep scientific drilling. The borehole DFDP-2B approached the Alpine Fault at depth, reaching a final depth of 892 m (820 m true vertical depth). Online gas analysis (OLGA) while drilling tracked changes in the composition of gases extracted from the circulating drill mud. The composition of fluids from fault zones can provide information about their origins, flow rates and paths, fluid-rock interactions along these paths, and the permeability structure of the faulted rock mass. Apart from an atmospheric input, the gases in drilling mud derive from the pore space of rock crushed at the drill bit and from permeable layers intersected by the borehole. The rapid formation of mud wall cake seals the borehole from further fluid inflow, hence formation-derived gases enter mostly at the depth of the drill bit. OLGA analyzes N2, O2, Ar, CO2, CH4, He, and H2 on a mass spectrometer; the hydrocarbons CH4, C2H6, C3H8, i-C4H10, and n-C4H10 on a gas chromatograph; and Rn using a Lucas cell detector. Gas was sampled for offline analyses of noble gas and stable isotopes to complement the OLGA dataset. The principal formation-derived gases found in the drilling mud during drilling of DFDP-2 were CO2 and CH4, with smaller components of H2 and He. High radon activity is interpreted to reflect intervals of active fluid flow through highly fractured and faulted rock. 3He/4He values in many samples were extremely air-contaminated, i.e. there was almost no excess of non-atmospheric He. The 3He/4He values measured at 236 m and 610 m, which are the only analyses with uncertainties <100%, are very similar to those measured in hot springs along the Alpine Fault, e.g. Fox River (0.64 Ra), Copland (0.42 Ra), Lower Wanganui (0.81 Ra). We will compare these data to those gathered using OLGA and discuss the
Hertel, Thomas W.; Ramankutty, Navin; Baldos, Uris Lantz C.
2014-01-01
There has been a resurgence of interest in the impacts of agricultural productivity on land use and the environment. At the center of this debate is the assertion that agricultural innovation is land sparing. However, numerous case studies and global empirical studies have found little evidence of higher yields being accompanied by reduced area. We find that these studies overlook two crucial factors: estimation of a true counterfactual scenario and a tendency to adopt a regional, rather than a global, perspective. This paper introduces a general framework for analyzing the impacts of regional and global innovation on long run crop output, prices, land rents, land use, and associated CO2 emissions. In so doing, it facilitates a reconciliation of the apparently conflicting views of the impacts of agricultural productivity growth on global land use and environmental quality. Our historical analysis demonstrates that the Green Revolution in Asia, Latin America, and the Middle East was unambiguously land and emissions sparing, compared with a counterfactual world without these innovations. In contrast, we find that the environmental impacts of a prospective African Green Revolution are potentially ambiguous. We trace these divergent outcomes to relative differences between the innovating region and the rest of the world in yields, emissions efficiencies, cropland supply response, and intensification potential. Globalization of agriculture raises the potential for adverse environmental consequences. However, if sustained for several decades, an African Green Revolution will eventually become land sparing. PMID:25201962
NASA Astrophysics Data System (ADS)
Xiao, Kun; Zou, Changchun; Xiang, Biao; Yue, Xuyuan; Zhou, Xinpeng; Li, Jianguo; Zhao, Bin
2014-10-01
The hole NLSD-2, one of the deepest scientific drilling projects in the metallic ore districts of China, is the second deep scientific drilling hole in the Nanling district; its ultimate depth is 2012.12 m. The hole was drilled with continuous coring, and a variety of geophysical well logs were measured over the course of drilling. This paper analyzes the characteristic responses of fractures and fractured zones in ultrasonic imaging log data and characterizes how fracture parameters change with drilling depth. It then discusses what the log results indicate about polymetallic mineralization layers. Formation fractures develop most readily at depths of 100~200 m, 600~850 m, and 1450~1550 m in hole NLSD-2, and high-angle fractures develop most prominently. The strike direction of the fractures is mainly NW-SE, reflecting the orientation of maximum horizontal principal stress. For the polymetallic mineralization layers occurring in fractured zones, the characteristic response of the ultrasonic imaging log is a wide dark zone, and the characteristic responses of conventional logs are high polarizability, high density, high acoustic velocity, and low resistivity. All the main polymetallic mineralization layers develop in fractures or fractured zones, and since fractures and fractured zones can be identified by an ultrasonic imaging log, the log results indirectly indicate the occurrence of polymetallic mineralization layers. Additionally, the relationship between the dip direction of fractures and well deviation provides guidance for straightening the borehole.
SpaceOps 2012 Plus 2: Social Tools to Simplify ISS Flight Control Communications and Log Keeping
NASA Technical Reports Server (NTRS)
Cowart, Hugh S.; Scott, David W.
2014-01-01
A paper written for the SpaceOps 2012 Conference (Simplify ISS Flight Control Communications and Log Keeping via Social Tools and Techniques) identified three innovative concepts for real-time flight control communications tools based on social mechanisms: a) Console Log Tool (CoLT) - a log-keeping application at Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) that provides "anywhere" access and comment and notification features similar to those found in Social Networking Systems (SNS); b) Cross-Log Communication via Social Techniques - a concept from Johnson Space Center's (JSC) Mission Control Center Houston (MCC-H) that would use microblogging's @tag and #tag protocols to make information/requests visible and/or discoverable in logs owned by @Destination addressees; and c) Communications Dashboard (CommDash) - an MSFC concept for a Facebook-like interface to visually integrate and manage basic console log content, text chat streams analogous to voice loops, text chat streams dedicated to particular conversations, generic and position-specific status displays/streams, and a graphically based hailing display. CoLT was deployed operationally at nearly the same time as SpaceOps 2012, the Cross-Log Communications idea is currently waiting for a champion to carry it forward, and CommDash was approved as a NASA Information Technology (IT) Labs project. This paper discusses lessons learned from two years of actual CoLT operations, updates CommDash prototype development status, discusses the potential for using Cross-Log Communications in the MCC-H and/or POIC environments, and considers other ways of synergizing console applications.
NASA Astrophysics Data System (ADS)
Suh, Youngjoo; Kim, Hoirin
2014-12-01
In this paper, a new discriminative likelihood score weighting technique is proposed for speaker identification. The proposed method employs a discriminative weighting of frame-level log-likelihood scores with acoustic-phonetic classification in the Gaussian mixture model (GMM)-based speaker identification. Experiments performed on the Aurora noise-corrupted TIMIT database showed that the proposed approach provides meaningful performance improvement with an overall relative error reduction of 15.8% over the maximum likelihood-based baseline GMM approach.
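The scoring pipeline can be illustrated with a toy 1-D sketch (hypothetical code, not the paper's implementation; the `weights` argument merely stands in for the proposed discriminative frame weighting, and uniform weights reduce to the maximum-likelihood baseline):

```python
import math

def gmm_loglik(x, gmm):
    """Log-likelihood of a scalar frame x under a 1-D GMM [(weight, mean, var), ...]."""
    p = sum(w * math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)
            for w, mu, var in gmm)
    return math.log(p)

def identify(frames, speaker_gmms, weights=None):
    """Pick the speaker whose GMM maximizes the weighted sum of frame log-likelihoods."""
    weights = weights if weights is not None else [1.0] * len(frames)
    scores = {spk: sum(wt * gmm_loglik(x, gmm) for wt, x in zip(weights, frames))
              for spk, gmm in speaker_gmms.items()}
    return max(scores, key=scores.get)
```

For example, with speaker models {"A": [(1.0, 0.0, 1.0)], "B": [(1.0, 5.0, 1.0)]}, the frames [0.1, -0.2, 0.3] select speaker "A"; real systems score multivariate cepstral frames, but the accumulate-and-argmax structure is the same.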
Cetron, Martin
2016-03-18
Since May 2015, when Zika virus, a flavivirus transmitted primarily by Aedes aegypti mosquitoes, was reported in Brazil, the virus has rapidly spread across the Region of the Americas and the Caribbean. The association between maternal Zika virus infection and adverse fetal and reproductive outcomes, including microcephaly, prompted CDC to issue a Level 2 alert travel notice* for the 37 countries and U.S. territories (at the national and territorial level) that have reported recent Zika virus transmission as of March 11, 2016. In addition to mosquito bite precautions for all travelers, CDC advises that pregnant women postpone travel to affected countries and U.S. territories. Within a nation's borders, ecologic characteristics, which determine the distribution of mosquito vectors, can vary considerably. CDC conducted a spatial analysis, focusing on the probability of occurrence of Ae. aegypti, to support the demarcation for subnational travel alerts. Based on results of this analysis, travel that is limited to elevations higher than 2,000 m (6,562 ft) above sea level is considered to have minimal (approximately 1%) likelihood for mosquito-borne Zika virus transmission, even within countries reporting active transmission. Women who are pregnant should avoid travel to elevations <2,000 m in countries with active Zika virus transmission.
NASA Astrophysics Data System (ADS)
Kudryashova, L.; Belozerov, V.; Pushtakov, E.
2015-11-01
Well logging results are one way to study the genesis of buried terrigenous rocks. To characterize the rock objectively and identify electrofacies, a comprehensive suite of geological and geophysical surveys is necessary. Comprehensive investigations of environmental conditions based on well logging have been performed for the horizon JK2 of the Tumenskoe formation in the Em-Egovskoe area, Krasnoleninskoe field (Western Siberia). The defined electrofacies were compared with the results of earlier granulometric and mineralogical analyses. Taken together, this research supports the conclusion that the investigated sediments of horizon JK2 were formed within a destructive tidal delta. Thus, objective facies prediction can only be ensured by analyzing core and well logging data comprehensively.
Profile Likelihood and Incomplete Data.
Zhang, Zhiwei
2010-04-01
According to the law of likelihood, statistical evidence is represented by likelihood functions and its strength measured by likelihood ratios. This point of view has led to a likelihood paradigm for interpreting statistical evidence, which carefully distinguishes evidence about a parameter from error probabilities and personal belief. Like other paradigms of statistics, the likelihood paradigm faces challenges when data are observed incompletely, due to non-response or censoring, for instance. Standard methods to generate likelihood functions in such circumstances generally require assumptions about the mechanism that governs the incomplete observation of data, assumptions that usually rely on external information and cannot be validated with the observed data. Without reliable external information, the use of untestable assumptions driven by convenience could potentially compromise the interpretability of the resulting likelihood as an objective representation of the observed evidence. This paper proposes a profile likelihood approach for representing and interpreting statistical evidence with incomplete data without imposing untestable assumptions. The proposed approach is based on partial identification and is illustrated with several statistical problems involving missing data or censored data. Numerical examples based on real data are presented to demonstrate the feasibility of the approach.
Sinha, Ritwik; Luo, Yuqun
2007-01-01
Construction of precise confidence sets of disease gene locations after initial identification of linked regions can improve the efficiency of the ensuing fine mapping effort. We adopted the confidence set inference (CSI) framework, previously proposed and implemented using the Mean test statistic (CSI-Mean), and improved its efficiency substantially by using a likelihood ratio test statistic (CSI-MLS). The CSI framework requires knowledge of some disease-model-related parameters. In the absence of prior knowledge of these parameters, a two-step procedure may be employed: 1) the parameters are estimated using a coarse map of markers; 2) CSI-Mean or CSI-MLS is applied to construct the confidence sets of the disease gene locations using a finer map of markers, assuming the estimates from Step 1 for the required parameters. In this article we show that the advantages of CSI-MLS over CSI-Mean, previously demonstrated when the required parameters are known, are preserved in this two-step procedure, using both the simulated and real data contributed to Problems 2 and 3 of Genetic Analysis Workshop 15. In addition, our results suggest that microsatellite data, when available, should be used in Step 1. Also explored in detail is the effect of the absence of parental genotypes on the performance of CSI-MLS.
Toyooka, Kiminori; Sato, Mayuko; Wakazaki, Mayumi; Matsuoka, Ken
2016-01-01
We developed a wide-range and high-resolution transmission electron microscope acquisition system and obtained giga-pixel images of tobacco BY-2 cells during the log and stationary phases of cell growth. We demonstrated that the distribution and ultrastructure of compartments involved in membrane traffic (i.e., Golgi apparatus, multivesicular body, and vesicle cluster) change during the log-to-stationary transition. Mitochondria, peroxisomes, and plastids were also enumerated. Electron densities of mitochondria and peroxisomes were altered during the growth-phase shift, while their numbers were reduced by nearly half. Plastid structure dramatically changed from atypical to spherical with starch granules. Nearly the same number of plastids was observed in both log and stationary phases. These results indicate that mechanisms regulating organelle populations differ from organelle to organelle. PMID:26855065
Geological well log analysis. Third ed
Pirson, S.J.
1983-01-01
Until recently, well logs have mainly been used for correlation, structural mapping, and quantitative evaluation of hydrocarbon-bearing formations. This third edition of Geological Well Log Analysis, however, describes how well logs can be used for geological studies and mineral exploration. This is done by analyzing well logs for numerous parameters and indices of significant mineral accumulation, primarily in sediments. Contents are: SP and Eh curves as redoxomorphic logs; sedimentological studies by log curve shapes; exploration for stratigraphic traps; continuous dipmeter as a structural tool; continuous dipmeter as a sedimentation tool; paleo-facies logging and mapping; hydrogeology 1--hydrodynamics of compaction; hydrogeology 2--geostatic equilibrium; and hydrogeology 3--hydrodynamics of infiltration. Appendixes cover: a computer program for calculating the dip magnitude, azimuth, and the degree and orientation of the resistivity anisotropy; a lithology computer program for calculating the curvature of a structure; and a basic log analysis package for the HP-41CV programmable calculator.
ERIC Educational Resources Information Center
Hidalgo, Mª Dolores; Gómez-Benito, Juana; Zumbo, Bruno D.
2014-01-01
The authors analyze the effectiveness of the R[superscript 2] and delta log odds ratio effect size measures when using logistic regression analysis to detect differential item functioning (DIF) in dichotomous items. A simulation study was carried out, and the Type I error rate and power estimates under conditions in which only statistical testing…
Bowlin, W.P.; Kneer, M.P.; Ballance, J.D.
1989-11-07
This patent describes an improvement in a computer-controlled processing system for lumber production. It comprises a computer and a sequence of processing stations for processing a log segment, including: an excess-material-removing station for generating opposed flat side surfaces on the log segment, the flat side surfaces determined by the computer to become sides of boards to be severed from the log segment; a profiling station for forming profiled edges above and below the flat side surfaces to become the side edges of the boards to be severed from the log segment; and a severing station for severing the boards from the log segment. A conveyance means establishes a path of conveyance and maintains continuous control of the log segment while conveying it along the path and through the above-defined sequence of processing stations.
Porosity Log Prediction Using Artificial Neural Network
NASA Astrophysics Data System (ADS)
Dwi Saputro, Oki; Lazuardi Maulana, Zulfikar; Dzar Eljabbar Latief, Fourier
2016-08-01
Well logging is important in oil and gas exploration. Many physical parameters of a reservoir are derived from well logging measurements; geophysicists often use well logs to obtain reservoir properties such as porosity, water saturation, and permeability. Direct measurement of these reservoir properties is usually considered expensive. One way to substitute for the measurement is to predict it using an artificial neural network. In this paper, an artificial neural network is used to predict porosity log data from other log data. Three wells from the 'yy' field are used in the prediction experiment. The log data are sonic, gamma ray, and porosity logs. One of the three wells is used as training data for the artificial neural network, which employs the Levenberg-Marquardt backpropagation algorithm. Through several trials, we determined that the most optimal training input is sonic log data and gamma ray log data with 10 hidden nodes. The prediction in well 1 has a correlation of 0.92 and a mean squared error of 5.67 × 10^-4. The trained network was then applied to the other wells' data. The results show that the correlations in well 2 and well 3 are 0.872 and 0.9077, respectively, and the mean squared errors are 11 × 10^-4 and 9.539 × 10^-4. From these results we conclude that sonic and gamma ray logs can be a good combination for predicting porosity with a neural network.
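The workflow described above can be sketched with a minimal one-hidden-layer network on synthetic stand-in data (the real sonic/gamma-ray/porosity logs are not reproduced, and plain gradient descent is substituted for the Levenberg-Marquardt algorithm the paper used):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for the predictors (sonic, gamma ray) and target (porosity).
X = rng.normal(size=(200, 2))
y = 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * rng.normal(size=200)

# One hidden layer of 10 tanh nodes, matching the configuration reported above.
W1 = rng.normal(scale=0.5, size=(2, 10)); b1 = np.zeros(10)
W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)

losses = []
for _ in range(1000):
    H = np.tanh(X @ W1 + b1)              # hidden activations
    err = (H @ W2 + b2).ravel() - y       # prediction error
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate mean-squared-error gradients through the two layers.
    gW2 = H.T @ err[:, None] / len(X); gb2 = err.mean(keepdims=True)
    dH = (err[:, None] @ W2.T) * (1 - H ** 2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= 0.2 * gW2; b2 -= 0.2 * gb2; W1 -= 0.2 * gW1; b1 -= 0.2 * gb1
```

With the seeded generator the training loss falls steadily; evaluating correlation against held-out wells, as the paper reports for wells 2 and 3, would follow the same forward pass on unseen data.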
NASA Astrophysics Data System (ADS)
Hill, D. G.
1993-10-01
A five-mineral multiple-log potash assay technique has been successfully applied to evaluate potash-rich intervals in evaporite sequences. The technique is able to distinguish economic potash minerals from non-economic potash minerals and from other non-potash radioactive minerals. It can be applied on location, using a programmable calculator or microcomputer, providing near real-time logs of potash mineral concentrations. Log assay values show good agreement with core wet chemistry analyses.
Selective Logging, Fire, and Biomass in Amazonia
NASA Technical Reports Server (NTRS)
Houghton, R. A.
1999-01-01
Biomass and rates of disturbance are major factors in determining the net flux of carbon between terrestrial ecosystems and the atmosphere, and neither of them is well known for most of the earth's surface. Satellite data over large areas are beginning to be used systematically to measure rates of two of the most important types of disturbance, deforestation and reforestation, but these are not the only types of disturbance that affect carbon storage. Other examples include selective logging and fire. In northern mid-latitude forests, logging and subsequent regrowth of forests have, in recent decades, contributed more to the net flux of carbon between terrestrial ecosystems and the atmosphere than any other type of land use. In the tropics logging is also becoming increasingly important. According to the FAO/UNEP assessment of tropical forests, about 25% of the total area of productive forests had been logged one or more times in the 60-80 years before 1980, and the fraction must be considerably greater at present. Thus, deforestation by itself accounts for only a portion of the emissions of carbon from land. Furthermore, as rates of deforestation become more accurately measured with satellites, uncertainty in biomass will become the major factor accounting for the remaining uncertainty in estimates of carbon flux. An approach is therefore needed for determining the biomass of terrestrial ecosystems. Selective logging is increasingly important in Amazonia, yet it has not been included in region-wide, satellite-based assessments of land-cover change, in part because it is not as striking as deforestation. Nevertheless, logging affects terrestrial carbon storage both directly and indirectly. Besides the losses of carbon directly associated with selective logging, logging also increases the likelihood of fire.
Bayesian computation via empirical likelihood
Mengersen, Kerrie L.; Pudlo, Pierre; Robert, Christian P.
2013-01-01
Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models. PMID:23297233
NASA Astrophysics Data System (ADS)
Shang, Yilun
2016-08-01
How complex a network is crucially impacts its function and performance. In many modern applications, the networks involved have a growth property and sparse structures, which pose challenges to physicists and applied mathematicians. In this paper, we introduce the forest likelihood as a plausible measure to gauge how difficult it is to construct a forest in a non-preferential attachment way. Based on the notions of admittable labeling and path construction, we propose algorithms for computing the forest likelihood of a given forest. Concrete examples as well as the distributions of forest likelihoods for all forests with some fixed numbers of nodes are presented. Moreover, we illustrate the ideas on real-life networks, including a benzenoid tree, a mathematical family tree, and a peer-to-peer network.
NASA Astrophysics Data System (ADS)
Cherkasova, T.; Timkin, T.; Savinova, O.
2015-02-01
Nanomineralogical investigation of ore minerals in metasomatites (garnet skarns) of the Ribny Log-2 gold ore in the Topolninsk ore deposit (Gorny Altai) revealed native silica impurities (Si) of 1-5 nm within grains of native bismuth (Bi). Polished sections were examined using a Tescan Vega 3 scanning electron microscope (SEM) with an Oxford energy-dispersive spectrometer at the Department of Geology and Mineral Exploration, Institute of Natural Resources, Tomsk Polytechnic University.
NASA Technical Reports Server (NTRS)
1977-01-01
ELDEC Corp., Lynwood, Wash., built a weight-recording system for logging trucks based on electronic technology the company acquired as a subcontractor on space programs such as Apollo and the Saturn launch vehicle. ELDEC employed its space-derived expertise to develop a computerized weight-and-balance system for Lockheed's TriStar jetliner. ELDEC then adapted the airliner system to a similar product for logging trucks. Electronic equipment computes tractor weight, trailer weight and overall gross weight, and this information is presented to the driver by an instrument in the cab. The system costs $2,000 but it pays for itself in a single year. It allows operators to use a truck's hauling capacity more efficiently since the load can be maximized without exceeding legal weight limits for highway travel. Approximately 2,000 logging trucks now use the system.
Well log and 2D seismic data character of the Wilcox Group in south-central Louisiana
Enomoto, Catherine B.
2014-01-01
The Wilcox Group is productive in updip areas of Texas and Louisiana from fluvial, deltaic, and near-shore marine shelf sandstones. The reported presence of porous sandstones at 29,000 feet within the Wilcox Group containing about 200 feet of gas in the Davy Jones 1 discovery well in the offshore Louisiana South Marsh Island area illustrates a sand-rich system developed during the Paleocene and early Eocene. This study describes some of the well log and reflection seismic data characteristics of the slope and basin-floor reservoirs with gas-discovery potential that may be in the area between the producing trend onshore Louisiana and the offshore discovery.
Abulencia, A.; Acosta, D.; Adelman, Jahred A.; Affolder, Anthony A.; Akimoto, T.; Albrow, M.G.; Ambrose, D.; Amerio, S.; Amidei, D.; Anastassov, A.; Anikeev, K.; /Taiwan, Inst. Phys. /Argonne /Barcelona, IFAE /Baylor U. /INFN, Bologna /Bologna U. /Brandeis U. /UC, Davis /UCLA /UC, San Diego /UC, Santa Barbara
2005-12-01
This report describes a measurement of the top quark mass, M{sub top}, with the dynamical likelihood method (DLM) using the CDF II detector at the Fermilab Tevatron. The Tevatron produces top/anti-top (t{bar t}) pairs in p{bar p} collisions at a center-of-mass energy of 1.96 TeV. The data sample used in this analysis was accumulated from March 2002 through August 2004, which corresponds to an integrated luminosity of 318 pb{sup -1}. They use the t{bar t} candidates in the ''lepton+jets'' decay channel, requiring at least one jet identified as a b quark by finding a displaced secondary vertex. The DLM defines a likelihood for each event based on the differential cross section as a function of M{sub top} per unit phase space volume of the final partons, multiplied by the transfer functions from jet to parton energies. The method takes into account all possible jet combinations in an event, and the likelihood is multiplied event by event to derive the top quark mass by the maximum likelihood method. Using 63 t{bar t} candidates observed in the data, with 9.2 events expected from background, they measure the top quark mass to be 173.2{sub -2.4}{sup +2.6}(stat.) {+-} 3.2(syst.) GeV/c{sup 2}, or 173.2{sub -4.0}{sup +4.1} GeV/c{sup 2}.
The Dud-Alternative Effect in Likelihood Judgment
ERIC Educational Resources Information Center
Windschitl, Paul D.; Chambers, John R.
2004-01-01
The judged likelihood of a focal outcome should generally decrease as the list of alternative possibilities increases. For example, the likelihood that a runner will win a race goes down when 2 new entries are added to the field. However, 6 experiments demonstrate that the presence of implausible alternatives (duds) often increases the judged…
Mail LOG: Program operating instructions
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The operating instructions for the software package, MAIL LOG, developed for the Scout Project Automatic Data System, SPADS, are provided. The program is written in FORTRAN for the PRIME 300 computer system. The MAIL LOG program has the following four modes of operation: (1) INPUT - putting new records into the data base; (2) REVISE - changing or modifying existing records in the data base; (3) SEARCH - finding special records existing in the data base; and (4) ARCHIVE - storing or putting away existing records in the data base. The output includes special printouts of records in the data base and results from the INPUT and SEARCH modes. The MAIL LOG data base consists of three main subfiles: incoming and outgoing mail correspondence; Design Information Releases and Reports; and Drawings and Engineering Orders.
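The four modes listed above map naturally onto a small record store. The following Python sketch is purely illustrative (the original MAIL LOG is a FORTRAN program on a PRIME 300; all class and method names here are hypothetical):

```python
class MailLog:
    """Minimal sketch of MAIL LOG's four modes of operation
    (INPUT, REVISE, SEARCH, ARCHIVE) over an in-memory record store."""

    def __init__(self):
        self.records = {}   # active records, keyed by record id
        self.archive = {}   # records put away by ARCHIVE mode

    def input(self, rid, record):       # (1) INPUT: add a new record
        self.records[rid] = dict(record)

    def revise(self, rid, **changes):   # (2) REVISE: modify an existing record
        self.records[rid].update(changes)

    def search(self, **criteria):       # (3) SEARCH: find matching records
        return [r for r in self.records.values()
                if all(r.get(k) == v for k, v in criteria.items())]

    def archive_record(self, rid):      # (4) ARCHIVE: put a record away
        self.archive[rid] = self.records.pop(rid)
```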
NASA Astrophysics Data System (ADS)
Scandurra, D.; Albertini, F.; van der Meer, R.; Meier, G.; Weber, D. C.; Bolsi, A.; Lomax, A.
2016-02-01
Pencil beam scanning (PBS) proton therapy requires the delivery of many thousand proton beams, each modulated for position, energy and monitor units, to provide a highly conformal patient treatment. The quality of the treatment is dependent on the delivery accuracy of each beam and at each fraction. In this work we describe the use of treatment log files, which are a record of the machine parameters for a given field delivery on a given fraction, to investigate the integrity of treatment delivery compared to the nominal planned dose. The dosimetry-relevant log file parameters are used to reconstruct the 3D dose distribution on the patient anatomy, using a TPS-independent dose calculation system. The analysis was performed for patients treated at Paul Scherrer Institute on Gantry 2, both for individual fields and per series (or plan), and delivery quality was assessed by determining the percentage of voxels in the log file dose distribution within +/- 1% of the nominal dose. It was seen that, for all series delivered, the mean pass rate is 96.4%. Furthermore, this work establishes a correlation between the delivery quality of a field and the beam position accuracy. This correlation is evident for all delivered fields regardless of individual patient or plan characteristics. We have also detailed further usefulness of log file analysis within our clinical workflow. In summary, we have highlighted that the integrity of PBS treatment delivery is dependent on daily machine performance and is specifically highly correlated with the accuracy of beam position. We believe this information will be useful for driving machine performance improvements in the PBS field.
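The per-voxel quality metric described above (the percentage of voxels in the log-file-reconstructed dose lying within ±1% of the nominal planned dose) reduces to a one-line computation. This hypothetical sketch assumes both dose distributions are given as flat sequences of voxel values:

```python
def pass_rate(log_dose, nominal_dose, tol=0.01):
    """Percentage of voxels whose log-file-reconstructed dose lies within
    +/- tol (relative) of the nominal planned dose."""
    ok = sum(abs(l - n) <= tol * n for l, n in zip(log_dose, nominal_dose))
    return 100.0 * ok / len(nominal_dose)
```

A series-level figure such as the reported 96.4% mean pass rate would then be the average of this quantity over all delivered fields in the series.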
NASA Astrophysics Data System (ADS)
Massiot, C.; Doan, M. L.; Celerier, B. P.; Mcnamara, D. D.; Little, T. A.; Schmitt, D. R.; Pezard, P. A.; Remaud, L.; Townend, J.; Sutherland, R.; Toy, V.
2015-12-01
The Alpine Fault is the major dextral transpressive fault forming the Australia-Pacific plate boundary in the South Island of New Zealand. The DFDP-2B borehole, drilled in late 2014, reached a depth of 893 m (measured depth) and provides an opportunity to describe the protolith and structure of ductilely deformed schists in the hangingwall of this active fault. A total of 19 km of wireline logging data were acquired in the borehole, including 4.8 km of borehole televiewer (BHTV) acoustic image logs. The high resolution of the BHTV logs and the borehole's inclination, which reached a maximum of 46° from vertical near the base of the hole, provide good conditions for intersecting the predominant foliation, which dips SE at a high angle to the hole, as well as other feature sets. Here, we present a detailed structural analysis and lithological interpretation of the BHTV logs from the 264-886 m interval. The borehole's inclination and the attenuation caused by the drilling mud reduce the image quality in places. However, some zones of poor image quality also coincide with indications of permeable fractures and are related to fractured zones with localized borehole wall damage. Foliation can be regularly identified to a depth of 808 m (n=406). The mean orientation of foliation planes (dip of 57° towards 145°) remains quite consistent throughout the drilled interval and is similar to that inferred for the structurally underlying Alpine Fault. Larger-scale changes in image amplitude suggest lithological or petrophysical variations not yet identified on cuttings. The majority of the 1650 planar features identified as fractures are subparallel to the foliation and may represent deformed veins or other mineral segregations that contribute to the foliation's internal structure. Other planar features have dominantly NW-SE strikes, approximately orthogonal to the foliation, and are inferred to be fractures. Truncations or offsets are observed on ~20 features suggesting
47 CFR 73.1840 - Retention of logs.
Code of Federal Regulations, 2014 CFR
2014-10-01
47 CFR Part 73, Rules Applicable to All Broadcast Stations, § 73.1840 Retention of logs. (a) Any log required to be kept by station licensees shall be retained by them for a period of 2 years. However, logs...
CRYPTOSPORIDIUM LOG INACTIVATION CALCULATION METHODS
Appendix O of the Surface Water Treatment Rule (SWTR) Guidance Manual introduces the CeffT10 (i.e., reaction zone outlet C value and T10 time) method for calculating ozone CT value and Giardia and virus log inactivation. The LT2ESWTR Pre-proposal Draft Regulatory Language for St...
Ehrmann, Brett J.; Anderson, Robert M.; Piatt, Gretchen A.; Funnell, Martha M.; Rashid, Hira; Shedden, Kerby; Douyon, Liselle
2014-01-01
Purpose The purpose of this pilot study is to investigate the utility of, and areas of refinement for, digital photography as an educational tool for food logging in obese patients with type 2 diabetes (T2DM). Methods Thirty-three patients aged 18-70 with T2DM, BMI at least 30 kg/m2, and A1C 7.5-9% were recruited from an endocrinology clinic and randomized to a week of food logging using a digital camera (DC) or paper diary (PD), crossing over for week two. Patients then viewed a presentation about dietary effects on blood glucose, using patient DC and blood glucose entries. Outcomes of adherence (based on number of weekly entries), changes in mean blood glucose and frequency of blood glucose checks, and patient satisfaction were compared between methods. Patient feedback on the DC intervention and presentation was also analyzed. Results Thirty patients completed the study. Adherence was identical across methods. The mean difference in number of entries was not significant between methods. This difference increased and neared statistical significance (favoring DC) among patients who were adherent for at least one week (21 entries, with 2 entries per day for 5 of 7 days, n=25). Mean blood glucose did not significantly decrease in either method. Patient satisfaction was similar between interventions. Feedback indicated concerns over photograph accuracy, forgetting to use the cameras, and embarrassment using them in public. Conclusion Though comparable to PD in adherence, blood glucose changes, and patient satisfaction in this pilot trial, patient feedback suggested specific areas of refinement to maximize utility of DC-based food logging as an educational tool in T2DM. PMID:24168836
Medlin, W.L.; Manzi, S.J.
1990-10-09
This patent describes an acoustic borehole logging method. It comprises: traversing a borehole with a borehole logging tool containing a transmitter of acoustic energy, having a free-field frequency spectrum with at least one characteristic resonant frequency of vibration, and a spaced-apart receiver; repeatedly exciting the transmitter with a swept-frequency tone burst of a duration sufficiently greater than the travel time of acoustic energy between the transmitter and the receiver to allow borehole cavity resonances to be established within the borehole cavity formed between the borehole logging tool and the borehole wall; detecting acoustic energy amplitude-modulated by the borehole cavity resonances with the spaced-apart receiver; and recording an amplitude-versus-frequency output of the receiver, in correlation with depth, as a log of the borehole frequency spectrum representative of the subsurface formation comprising the borehole wall.
Anderson, David W.
1993-12-15
Two logs of EE-3A were performed during the last couple of weeks. The first was a Temperature/Casing-Collar Locator (CCL) log, which took place on Friday, December 10, 1993. The second was a Caliper log, done in cooperation with the Dia-Log Company of Odessa, TX, on Monday, December 13, 1993.
6. Log calving barn. Interior view showing log post-and-beam support ...
6. Log calving barn. Interior view showing log post-and-beam support system and animal stalls. - William & Lucina Bowe Ranch, Log Calving Barn, 230 feet south-southwest of House, Melrose, Silver Bow County, MT
Rivenes, A S
2004-12-08
The purpose of this document is to investigate Oracle database log buffer queuing and its effect on the ability to load data using a specialized data loading system. Experiments were carried out on a Linux system using an Oracle 9.2 database. Previous experiments on a Sun 4800 running Solaris had shown that 100,000 entities per minute was an achievable rate. The question was then asked, can we do this on Linux, and where are the bottlenecks? A secondary question was also lurking: how can the loading be further scaled to handle even higher throughput requirements? Testing was conducted using a Dell PowerEdge 6650 server with four CPUs and a Dell PowerVault 220s RAID array with 14 36GB drives and 128 MB of cache. Oracle Enterprise Edition 9.2.0.4 was used for the database and Red Hat Linux Advanced Server 2.1 was used for the operating system. This document details the maximum observed throughputs using the same test suite that was used for the Sun tests, along with a detailed description of the testing performed and an analysis of the bottlenecks encountered. Issues related to Oracle and Linux are also detailed, and some recommendations are made based on the findings.
Factors Associated with Young Adults’ Pregnancy Likelihood
Kitsantas, Panagiota; Lindley, Lisa L.; Wu, Huichuan
2014-01-01
OBJECTIVES While progress has been made to reduce adolescent pregnancies in the United States, rates of unplanned pregnancy among young adults (18–29 years) remain high. In this study, we assessed factors associated with perceived likelihood of pregnancy (likelihood of getting pregnant/getting partner pregnant in the next year) among sexually experienced young adults who were not trying to get pregnant and had ever used contraceptives. METHODS We conducted a secondary analysis of 660 young adults, 18–29 years old in the United States, from the cross-sectional National Survey of Reproductive and Contraceptive Knowledge. Logistic regression and classification tree analyses were conducted to generate profiles of young adults most likely to report anticipating a pregnancy in the next year. RESULTS Nearly one-third (32%) of young adults indicated they believed they had at least some likelihood of becoming pregnant in the next year. Young adults who believed that avoiding pregnancy was not very important were most likely to report pregnancy likelihood (odds ratio [OR], 5.21; 95% CI, 2.80–9.69), as were young adults for whom avoiding a pregnancy was important but not satisfied with their current contraceptive method (OR, 3.93; 95% CI, 1.67–9.24), attended religious services frequently (OR, 3.0; 95% CI, 1.52–5.94), were uninsured (OR, 2.63; 95% CI, 1.31–5.26), and were likely to have unprotected sex in the next three months (OR, 1.77; 95% CI, 1.04–3.01). DISCUSSION These results may help guide future research and the development of pregnancy prevention interventions targeting sexually experienced young adults. PMID:25782849
NASA Technical Reports Server (NTRS)
Balch, D. T.; Lombardi, J.
1985-01-01
A model scale hover test was conducted in the Sikorsky Aircraft Model Rotor Hover Facility to identify and quantify the impact of the tail rotor on the demonstrated advantages of advanced geometry tip configurations. The existence of mutual interference between a hovering main rotor and a tail rotor was acknowledged in the test. The test was conducted using the Basic Model Test Rig and two scaled main rotor systems, one representing a 1/5.727 scale UH-60A BLACK HAWK and the other a 1/4.71 scale S-76. Eight alternate rotor tip configurations were tested, 3 on the BLACK HAWK rotor and 6 on the S-76 rotor. Four of these tips were then selected for testing in close proximity to an operating tail rotor (operating in both tractor and pusher modes) to determine if the performance advantages that could be obtained from the use of advanced geometry tips in a main rotor only environment would still exist in the more complex flow field involving a tail rotor. This volume contains the test run log and tabulated data.
Acoustic paramagnetic logging tool
Vail, III, William B.
1988-01-01
New methods and apparatus are disclosed which allow measurement of the presence of oil and water in geological formations using a new physical effect called the Acoustic Paramagnetic Logging Effect (APLE). The presence of petroleum in a formation causes a slight increase in the earth's magnetic field in the vicinity of the reservoir. This is the phenomenon of paramagnetism. Application of an acoustic source to a geological formation at the Larmor frequency of the nucleons present causes the paramagnetism of the formation to disappear. This results in a decrease in the earth's magnetic field in the vicinity of the oil-bearing formation. Repetitively frequency-sweeping the acoustic source through the Larmor frequency of the nucleons present (approx. 2 kHz) causes an amplitude modulation of the earth's magnetic field which is a consequence of the APLE. The amplitude modulation of the earth's magnetic field is measured with an induction coil gradiometer and provides a direct measure of the amount of oil and water in the excitation zone of the formation. The phase of the signal is used to infer the longitudinal relaxation times of the fluids present, which results in the ability in general to separate oil and water and to measure the viscosity of the oil present. Such measurements may be performed in open boreholes and in cased well bores.
The Sherpa Maximum Likelihood Estimator
NASA Astrophysics Data System (ADS)
Nguyen, D.; Doe, S.; Evans, I.; Hain, R.; Primini, F.
2011-07-01
A primary goal for the second release of the Chandra Source Catalog (CSC) is to include X-ray sources with as few as 5 photon counts detected in stacked observations of the same field, while maintaining acceptable detection efficiency and false source rates. Aggressive source detection methods will result in detection of many false positive source candidates. Candidate detections will then be sent to a new tool, the Maximum Likelihood Estimator (MLE), to evaluate the likelihood that a detection is a real source. MLE uses the Sherpa modeling and fitting engine to fit a model of a background and source to multiple overlapping candidate source regions. A background model is calculated by simultaneously fitting the observed photon flux in multiple background regions. This model is used to determine the quality of the fit statistic for a background-only hypothesis in the potential source region. The statistic for a background-plus-source hypothesis is calculated by adding a Gaussian source model convolved with the appropriate Chandra point spread function (PSF) and simultaneously fitting the observed photon flux in each observation in the stack. Since a candidate source may be located anywhere in the field of view of each stacked observation, a different PSF must be used for each observation because of the strong spatial dependence of the Chandra PSF. The likelihood of a valid source being detected is a function of the two statistics (for background alone, and for background-plus-source). The MLE tool is an extensible Python module with potential for use by the general Chandra user.
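The background-only versus background-plus-source comparison that MLE performs can be illustrated with a toy binned Poisson likelihood ratio. This sketch is illustrative only, not the Sherpa implementation: it ignores PSF convolution and multi-observation stacking, and all numbers in the usage below are hypothetical.

```python
import math

def poisson_loglike(counts, model):
    """Binned Poisson log-likelihood, dropping the constant log(n!) terms."""
    return sum(n * math.log(m) - m for n, m in zip(counts, model))

def source_significance(counts, background, source):
    """Twice the log-likelihood ratio between a background-plus-source model
    and a flat background-only model; under the null hypothesis of no
    source this statistic is asymptotically chi-squared (Wilks' theorem)."""
    bg_only = poisson_loglike(counts, [background] * len(counts))
    bg_src = poisson_loglike(counts, [background + s for s in source])
    return 2.0 * (bg_src - bg_only)
```

A large positive statistic favors the presence of a source in the candidate region; values near zero or negative indicate the background alone already explains the observed counts.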
Walsh, David O; Turner, Peter
2014-05-27
Technologies including NMR logging apparatus and methods are disclosed. Example NMR logging apparatus may include surface instrumentation and one or more downhole probes configured to fit within an earth borehole. The surface instrumentation may comprise a power amplifier, which may be coupled to the downhole probes via one or more transmission lines, and a controller configured to cause the power amplifier to generate a NMR activating pulse or sequence of pulses. Impedance matching means may be configured to match an output impedance of the power amplifier through a transmission line to a load impedance of a downhole probe. Methods may include deploying the various elements of disclosed NMR logging apparatus and using the apparatus to perform NMR measurements.
4. Log chicken house (far left foreground), log bunkhouse (far ...
4. Log chicken house (far left foreground), log bunkhouse (far left background), one-room log cabin (left of center background), log root cellar (center), post-and-beam center in foreground, and blacksmith shop (far right foreground). View to southeast. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Section 9: Ground Water - Likelihood of Release
HRS training. The ground water pathway likelihood of release factor category reflects the likelihood that there has been, or will be, a release of hazardous substances in any of the aquifers underlying the site.
Structural basis for cytokinin production by LOG from Corynebacterium glutamicum
Seo, Hogyun; Kim, Sangwoo; Sagong, Hye-Young; Son, Hyeoncheol Francis; Jin, Kyeong Sik; Kim, Il-Kwon; Kim, Kyung-Jin
2016-01-01
“Lonely guy” (LOG) has been identified as a cytokinin-producing enzyme in plants and plant-interacting fungi. The gene product of Cg2612 from the soil-dwelling bacterium Corynebacterium glutamicum was annotated as an LDC. However, the facts that C. glutamicum lacks an LDC and Cg2612 has high amino acid similarity with LOG proteins suggest that Cg2612 is possibly an LOG protein. To investigate the function of Cg2612, we determined its crystal structure at a resolution of 2.3 Å. Cg2612 functions as a dimer and shows an overall structure similar to other known LOGs, such as LOGs from Arabidopsis thaliana (AtLOG), Claviceps purpurea (CpLOG), and Mycobacterium marinum (MmLOG). Cg2612 also contains a “PGGXGTXXE” motif that contributes to the formation of an active site similar to other LOGs. Moreover, biochemical studies on Cg2612 revealed that the protein has phosphoribohydrolase activity but not LDC activity. Based on these structural and biochemical studies, we propose that Cg2612 is not an LDC family enzyme, but instead belongs to the LOG family. In addition, the prenyl-binding site of Cg2612 (CgLOG) comprised residues identical to those seen in AtLOG and CpLOG, albeit dissimilar to those in MmLOG. The work provides structural and functional implications for LOG-like proteins from other microorganisms. PMID:27507425
Convolutional codes. II - Maximum-likelihood decoding. III - Sequential decoding
NASA Technical Reports Server (NTRS)
Forney, G. D., Jr.
1974-01-01
Maximum-likelihood decoding is characterized as the determination of the shortest path through a topological structure called a trellis. Aspects of code structure are discussed along with questions regarding maximum-likelihood decoding on memoryless channels. A general bounding technique is introduced. The technique is used to obtain asymptotic bounds on the probability of error for maximum-likelihood decoding and list-of-2 decoding. The basic features of sequential algorithms are discussed along with a stack algorithm, questions of computational distribution, and the martingale approach to computational bounds.
CosmoSlik: Cosmology sampler of likelihoods
NASA Astrophysics Data System (ADS)
Millea, Marius
2017-01-01
CosmoSlik quickly puts together, runs, and analyzes an MCMC chain for analysis of cosmological data. It is highly modular and comes with plugins for CAMB (ascl:1102.026), CLASS (ascl:1106.020), the Planck likelihood, the South Pole Telescope likelihood, other cosmological likelihoods, emcee (ascl:1303.002), and more. It offers ease-of-use, flexibility, and modularity.
Owen, R. K.
2007-04-04
A perl module designed to read and parse the voluminous set of event or accounting log files produced by a Portable Batch System (PBS) server. This module can filter on date-time and/or record type. The data can be returned in a variety of formats.
ERIC Educational Resources Information Center
National Aeronautics and Space Administration, Washington, DC.
The major events of the first manned moon landing mission, Apollo 11, are presented in chronological order from launch time until arrival of the astronauts aboard the U.S.S. Hornet. The log is descriptive, non-technical, and includes numerous color photographs of the astronauts on the moon. (PR)
ERIC Educational Resources Information Center
Millward, Robert E.
1999-01-01
A visit to Ketchikan, Alaska, reveals a floating, one-teacher logging-camp school that uses multiage grouping and interdisciplinary teaching. There are 10 students. The school gym and playground, bunkhouse, fuel tanks, mess hall, and students' homes bob up and down and are often moved to other sites. (MLH)
ERIC Educational Resources Information Center
Butler, Kevin
2010-01-01
A classroom lecture at Capistrano Connections Academy in Southern California involves booting up the home computer, logging on to a Web site, and observing a teacher conducting a PowerPoint presentation of that day's lesson entirely online. Through microphone headsets, students can watch on their home computers, respond to the teacher's questions,…
ERIC Educational Resources Information Center
Deaton, Cynthia Minchew; Deaton, Benjamin E.; Leland, Katina
2010-01-01
The authors created an interactive reflective log (IRL) to provide teachers with an opportunity to use a journal approach to record, evaluate, and communicate student understanding of science concepts. Unlike a traditional journal, the IRL incorporates prompts to encourage students to discuss their understanding of science content and science…
Petrographic image logging system
Payne, C.J.; Ulrich, M.R.; Maxwell, G.B. ); Adams, J.P. )
1991-03-01
The Petrographic Image Logging System (PILS) is a logging system data base for Macintosh computers that allows the merging of traditional wire-line, core, and mud log data with petrographic images. The system is flexible; it allows the user to record, manipulate, and display almost any type of character, graphic, and image information. Character and graphic data are linked, and entry in either mode automatically generates the alternate mode. Character/graphic data may include such items as ROP, wire-line log data, interpreted lithologies, ditch cutting lith-percentages, porosity grade and type, grain size, core/DST information, and sample descriptions. Image data may include petrographic and SEM images of cuttings, core, and thin sections. All data are tied to depth. Data are entered quickly and easily in an interactive manner with a mouse, keyboard, and digitizing tablet, or may be imported and immediately autoplotted from a variety of environments via modem, network, or removable disk. Color log displays, including petrographic images, are easily available on CRT or as hardcopy. The system consists of a petrographic microscope, video camera, Macintosh computer, video framegrabber, and digitizing tablet. Hardcopy is scaleable and can be generated by a variety of color printing devices. The software is written in SuperTalk, a color superset of HyperTalk, the standard Apple HyperCard programming language. This system is being tested by Mobil in the lab and at the well site. Implementation has provided near 'real-time' core and cuttings images from drilling wells to the geologist back at the office.
The fluid-compensated cement bond log
Nayfeh, T.H.; Leslie, H.D.; Wheelis, W.B.
1984-09-01
An experimental and numerical wave mechanics study of cement bond logs demonstrated that wellsite computer processing can now segregate wellbore fluid effects from the sonic signal response to changing cement strength. Traditionally, cement logs have been interpreted as if water were in the wellbore, without consideration of wellbore fluid effects. These effects were assumed to be negligible. However, with the increasing number of logs being run in completion fluids such as CaCl₂, ZnBr₂, and CaBr₂, large variations in cement bond logs became apparent. A Schlumberger internal paper showing that bond log amplitude is related to the acoustic impedance of the fluid in which the tool is run led to a comprehensive study of wellbore fluid effects. Numerical and experimental models were developed simulating wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings by varying the wellbore fluid densities, viscosities, and fluid types (acoustic impedance). Parallel numerical modeling was undertaken using similar parameters. The results showed that the bond log amplitude varied dramatically with the wellbore fluid's acoustic impedance; for example, there was a 70 percent increase in the signal amplitude for 11.5-lb/gal CaCl₂ over the signal amplitude in water. This led to the development of a Fluid-Compensated Bond log that corrects the amplitude for the acoustic impedance of varying wellbore fluids, thereby making the measurements more directly related to the cement quality.
Comparison between Pre-log and Post-log Statistical Models in Ultra-Low-Dose CT Reconstruction.
Fu, Lin; Lee, Tzu-Cheng; Kim, Soo Mee; Alessio, Adam; Kinahan, Paul; Chang, Zhiqian; Sauer, Ken; Kalra, Mannudeep; De Man, Bruno
2016-11-09
X-ray detectors in clinical computed tomography (CT) usually operate in current-integrating mode. Their complicated signal statistics often lead to intractable likelihood functions for practical use in model-based image reconstruction (MBIR). It is therefore desirable to design simplified statistical models without losing the essential factors. Depending on whether the CT transmission data are logarithmically transformed, pre-log and post-log models are two major categories of choices in CT MBIR. Both being approximations, it remains an open question whether one model can notably improve image quality over the other on real scanners. In this study, we develop and compare several pre-log and post-log MBIR algorithms under a unified framework. Their reconstruction accuracy is evaluated on simulation and clinical datasets. The results show that pre-log MBIR can achieve notably better quantitative accuracy than post-log MBIR in ultra-low-dose CT, although in less extreme cases, post-log MBIR with handcrafted pre-processing remains a competitive alternative. Pre-log MBIR could play a growing role in emerging ultra-low-dose CT applications.
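The pre-log/post-log distinction can be illustrated on a single ray. This is a toy sketch, not the authors' MBIR algorithms: the incident intensity I0, the repeat counts, and the grid search are all invented. The pre-log model scores the raw Poisson counts directly, while the post-log model does weighted least squares on log-transformed data, which breaks down (and here effectively discards the zero-count measurement) at ultra-low dose.

```python
import math

I0 = 100.0        # assumed incident photon count per ray (illustrative)

def prelog_nll(mu, y):
    """Pre-log: Poisson negative log-likelihood of raw count y, mean I0*exp(-mu)."""
    lam = I0 * math.exp(-mu)
    return lam - y * math.log(lam)

def postlog_wls(mu, y):
    """Post-log: weighted least squares on log data; the transform needs a
    floor for y = 0, and the usual weight ~ y zeroes out that measurement."""
    line_integral = math.log(I0 / max(y, 0.5))
    return 0.5 * y * (line_integral - mu) ** 2

counts = [0, 1, 5]                      # ultra-low-dose repeats of the same ray
grid = [i * 0.01 for i in range(1, 500)]
mu_pre = min(grid, key=lambda m: sum(prelog_nll(m, y) for y in counts))
mu_post = min(grid, key=lambda m: sum(postlog_wls(m, y) for y in counts))
print(mu_pre, mu_post)                  # the two fitted line integrals disagree
```

The pre-log estimate lands at ln(I0/mean(counts)) = ln 50, while the post-log estimate is pulled away by the log transform and weighting, illustrating why the two model families can differ at very low counts.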
Log-Concavity and Strong Log-Concavity: a review
Saumard, Adrien; Wellner, Jon A.
2016-01-01
We review and formulate results concerning log-concavity and strong log-concavity in both discrete and continuous settings. We show how preservation of log-concavity and strong log-concavity on ℝ under convolution follows from a fundamental monotonicity result of Efron (1969). We provide a new proof of Efron's theorem using the recent asymmetric Brascamp-Lieb inequality due to Otto and Menz (2013). Along the way we review connections between log-concavity and other areas of mathematics and statistics, including concentration of measure, log-Sobolev inequalities, convex geometry, MCMC algorithms, Laplace approximations, and machine learning. PMID:27134693
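The preservation-under-convolution result reviewed above can be checked numerically in the discrete setting, where log-concavity of a sequence means c_k² ≥ c_{k−1}·c_{k+1} for all interior k. A minimal sketch, using binomial pmfs (which are log-concave) as the inputs:

```python
from math import comb

def binom_pmf(n, p):
    """Binomial(n, p) probability mass function as a list."""
    return [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

def convolve(a, b):
    """Discrete convolution, i.e. the pmf of the sum of independent variables."""
    c = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

def is_log_concave(seq):
    """Check c_k^2 >= c_{k-1} * c_{k+1} at every interior index (small tolerance)."""
    return all(seq[k] ** 2 >= seq[k - 1] * seq[k + 1] - 1e-15
               for k in range(1, len(seq) - 1))

a, b = binom_pmf(6, 0.3), binom_pmf(4, 0.7)
c = convolve(a, b)
print(is_log_concave(a), is_log_concave(b), is_log_concave(c))   # True True True
```

The convolution of two log-concave sequences is again log-concave, as the theorem guarantees; the numerical check merely illustrates it for one pair of inputs.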
Maximum likelihood density modification by pattern recognition of structural motifs
Terwilliger, Thomas C.
2004-04-13
An electron density for a crystallographic structure having protein regions and solvent regions is improved by maximizing the log likelihood of a set of structure factors {F_h} using a local log-likelihood function LL(x) = log[p(ρ(x)|PROT)p_PROT(x) + p(ρ(x)|SOLV)p_SOLV(x) + p(ρ(x)|H)p_H(x)], where p_PROT(x) is the probability that x is in the protein region, p(ρ(x)|PROT) is the conditional probability for ρ(x) given that x is in the protein region, and p_SOLV(x) and p(ρ(x)|SOLV) are the corresponding quantities for the solvent region; p_H(x) refers to the probability that there is a structural motif at a known location, with a known orientation, in the vicinity of the point x; and p(ρ(x)|H) is the probability distribution for the electron density at this point given that the structural motif actually is present. One appropriate structural motif is a helical structure within the crystallographic structure.
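The local log-likelihood above is a log of a mixture: the density at each map point is scored under protein, solvent, and motif hypotheses, weighted by the probability of each hypothesis at that point. A hedged sketch follows; the Gaussian component densities and their parameters are invented placeholders, not Terwilliger's calibrated distributions.

```python
import math

def normal_pdf(r, mean, sd):
    return math.exp(-0.5 * ((r - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def local_log_likelihood(rho, p_prot, p_solv, p_h):
    """log[ p(rho|PROT)p_PROT + p(rho|SOLV)p_SOLV + p(rho|H)p_H ] at one point.
    Component densities are illustrative stand-ins:
      p(rho|PROT): broad density typical of ordered protein regions
      p(rho|SOLV): narrow density centred at the flat solvent level
      p(rho|H):    sharp density expected where the motif is present."""
    mixture = (p_prot * normal_pdf(rho, 1.0, 0.8) +
               p_solv * normal_pdf(rho, 0.0, 0.3) +
               p_h    * normal_pdf(rho, 1.5, 0.4))
    return math.log(mixture)

# A near-zero density deep in a region believed to be solvent scores higher
# than a large density would at the same point.
print(local_log_likelihood(0.05, 0.2, 0.8, 0.0) >
      local_log_likelihood(1.40, 0.2, 0.8, 0.0))   # True
```

Summing this quantity over map points (and maximizing over candidate densities consistent with the structure factors) is the shape of the optimization the patent describes.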
West, H.I. Jr.; Glasgow, J.E.
1983-12-01
The feasibility of using oil-well bore-hole logging techniques for assaying the carbonate content of a potential shot site has been investigated. The procedure makes use of the detection of the 4439-keV γ ray of carbon and the 6130-keV γ ray of oxygen produced by the inelastic scattering of 14-MeV neutrons in the bore-hole medium. For the needs of the containment program, a sensitivity of detection of CO₂ to less than or equal to 0.02 weight fraction must be achieved. Laboratory measurements indicate that only with considerable development effort could a tool be made that would come close to achieving this goal.
Maximum Likelihood Analysis in the PEN Experiment
NASA Astrophysics Data System (ADS)
Lehman, Martin
2013-10-01
The experimental determination of the π⁺ → e⁺ν(γ) decay branching ratio currently provides the most accurate test of lepton universality. The PEN experiment at PSI, Switzerland, aims to improve the present world-average experimental precision of 3.3×10⁻³ to 5×10⁻⁴ using a stopped-beam approach. During runs in 2008-10, PEN has acquired over 2×10⁷ πe2 events. The experiment includes active beam detectors (degrader, mini TPC, target), central MWPC tracking with plastic scintillator hodoscopes, and a spherical pure CsI electromagnetic shower calorimeter. The final branching ratio will be calculated using a maximum likelihood analysis. This analysis assigns each event a probability for 5 processes (π⁺ → e⁺ν, π⁺ → μ⁺ν, decay-in-flight, pile-up, and hadronic events) using Monte Carlo verified probability distribution functions of our observables (energies, times, etc.). A progress report on the PEN maximum likelihood analysis will be presented. Work supported by NSF grant PHY-0970013.
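The per-event probability assignment described above is, in essence, a mixture-model fit: each event is scored under every process's PDF and the process fractions are found by maximizing the sample log-likelihood. The two-component sketch below is a schematic analogue only; the Gaussian energy PDFs, the event counts, and the true fraction are all invented, not PEN's observables.

```python
import math, random

random.seed(1)

def gauss(x, mean, sd):
    return math.exp(-0.5 * ((x - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Stand-in PDFs of one observable (total energy) for two processes.
pdf_sig = lambda e: gauss(e, 70.0, 3.0)   # e.g. the direct pi -> e nu peak
pdf_bkg = lambda e: gauss(e, 30.0, 8.0)   # e.g. the pi -> mu -> e chain

# Simulated event sample with a true signal fraction of 0.25.
events = [random.gauss(70, 3) if random.random() < 0.25 else random.gauss(30, 8)
          for _ in range(5000)]

def nll(f):
    """Negative log-likelihood of the mixture with signal fraction f."""
    return -sum(math.log(f * pdf_sig(e) + (1 - f) * pdf_bkg(e)) for e in events)

# Crude grid-search MLE for the signal fraction.
f_hat = min((i / 100 for i in range(1, 100)), key=nll)
print(f_hat)
```

PEN's real analysis has five processes and multidimensional Monte Carlo PDFs, but the structure of the likelihood, a sum over events of the log of a probability-weighted mixture, is the same.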
Kim, Jong Hum; Kim, Woo Taek
2013-01-01
Really Interesting New Gene (RING) E3 ubiquitin ligases have been implicated in cellular responses to the stress hormone abscisic acid (ABA) as well as to environmental stresses in higher plants. Here, an ABA-insensitive RING protein3 (atairp3) loss-of-function mutant line in Arabidopsis (Arabidopsis thaliana) was isolated due to its hyposensitivity to ABA during its germination stage as compared with wild-type plants. AtAIRP3 contains a single C3HC4-type RING motif, a putative myristoylation site, and a domain associated with RING2 (DAR2) domain. Unexpectedly, AtAIRP3 was identified as LOSS OF GDU2 (LOG2), which was recently shown to participate in an amino acid export system via interaction with GLUTAMINE DUMPER1. Thus, AtAIRP3 was renamed as AtAIRP3/LOG2. Transcript levels of AtAIRP3/LOG2 were up-regulated by drought, high salinity, and ABA, suggesting a role for this factor in abiotic stress responses. The atairp3/log2-2 knockout mutant and 35S:AtAIRP3-RNAi knockdown transgenic plants displayed impaired ABA-mediated seed germination and stomata closure. Cosuppression and complementation studies further supported a positive role for AtAIRP3/LOG2 in ABA responses. Suppression of AtAIRP3/LOG2 resulted in marked hypersensitive phenotypes toward high salinity and water deficit relative to wild-type plants. These results suggest that Arabidopsis RING E3 AtAIRP3/LOG2 is a positive regulator of the ABA-mediated drought and salt stress tolerance mechanism. Using yeast (Saccharomyces cerevisiae) two-hybrid, in vitro, and in vivo immunoprecipitation, cell-free protein degradation, and in vitro ubiquitination assays, RESPONSIVE TO DEHYDRATION21 was identified as a substrate protein of AtAIRP3/LOG2. Collectively, our data suggest that AtAIRP3/LOG2 plays dual functions in ABA-mediated drought stress responses and in an amino acid export pathway in Arabidopsis. PMID:23696092
NASA Astrophysics Data System (ADS)
Salehin, Z.; Woobaidullah, A. S. M.; Snigdha, S. S.
2015-12-01
The Bengal Basin, a prolific gas-rich province, supplies much-needed energy to Bangladesh, and the present energy situation demands further hydrocarbon exploration. Only 'Semutang' has been discovered in the high-amplitude structures; the rest lie in the gentle to moderate structures of the western part of the Chittagong-Tripura Fold Belt, which is cut by major thrust faults that have strongly breached the reservoir zone. The major objectives of this research are interpretation of gas horizons and faults, followed by velocity modeling and structural and property modeling to obtain reservoir properties; the faults and reservoir heterogeneities must be properly identified. 3D modeling is widely used to reveal the subsurface structure in faulted zones where planning and development drilling are a major challenge. Thirteen 2D seismic lines and six well logs have been used to identify six gas-bearing horizons and a network of faults and to map the structure at reservoir level. Variance attributes were used to identify faults. A velocity model was built for domain conversion. Synthetics were prepared from the two wells where sonic and density logs are available. Well-to-seismic tie at the reservoir zone shows a good match with the Direct Hydrocarbon Indicator on the seismic section. Vsh, porosity, water saturation, and permeability have been calculated, and various cross-plots among porosity logs are shown. Structural modeling was used to define zones and layering in accordance with minimum sand thickness. The fault model shows the possible fault network responsible for several dry wells. Facies models were constrained with the Sequential Indicator Simulation method to show the facies distribution along the depth surfaces. Petrophysical models were prepared with Sequential Gaussian Simulation to estimate petrophysical parameters away from the existing wells in other parts of the field and to observe heterogeneities in the reservoir. An average porosity map was constructed for each gas zone. The outcomes of the research
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Agricultural Employment of H-2A Aliens in the United States, 75 FR 6884 (Feb. 12, 2010). The effect of... rulemaking (NPRM). 74 FR 45906 (Sept. 4, 2009). After considering comments from the public on the...
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.; Goodrich, J.H.; Kumar, R.M.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO₂ injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core water saturation and computed log porosities agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log versus core derived water saturations. However, both core and log analysis indicated the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet, and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 and the lack of production from 3692 to 3718 feet indicated the zone to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flood.
12. Upstream view showing the lower log pond log chute in ...
12. Upstream view showing the lower log pond log chute in the main channel of the Hudson River. The log chute in the dam can be seen in the background. Facing southwest. - Glens Falls Dam, 100' to 450' West of U.S. Route 9 Bridge Spanning Hudson River, Glens Falls, Warren County, NY
NASA Astrophysics Data System (ADS)
Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.
2015-12-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having a -Dst > 880 nT (greater than Carrington) but with a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete
2015-01-01
An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for years 1957-2012; bootstrap analysis is used to establish confidence limits on forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst≥850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42,2.41] times per century; a 100-yr magnetic storm is identified as having a −Dst≥880 nT (greater than Carrington) but with a wide 95% confidence interval of [490,1187] nT.
Optimized Large-scale CMB Likelihood and Quadratic Maximum Likelihood Power Spectrum Estimation
NASA Astrophysics Data System (ADS)
Gjerløw, E.; Colombo, L. P. L.; Eriksen, H. K.; Górski, K. M.; Gruppuso, A.; Jewell, J. B.; Plaszczynski, S.; Wehus, I. K.
2015-11-01
We revisit the problem of exact cosmic microwave background (CMB) likelihood and power spectrum estimation with the goal of minimizing computational costs through linear compression. This idea was originally proposed for CMB purposes by Tegmark et al., and here we develop it into a fully functioning computational framework for large-scale polarization analysis, adopting WMAP as a working example. We compare five different linear bases (pixel space, harmonic space, noise covariance eigenvectors, signal-to-noise covariance eigenvectors, and signal-plus-noise covariance eigenvectors) in terms of compression efficiency, and find that the computationally most efficient basis is the signal-to-noise eigenvector basis, which is closely related to the Karhunen-Loève and Principal Component transforms, in agreement with previous suggestions. For this basis, the information in 6836 unmasked WMAP sky map pixels can be compressed into a smaller set of 3102 modes, with a maximum error increase of any single multipole of 3.8% at ℓ ≤ 32 and a maximum shift in the mean values of a joint distribution of an amplitude-tilt model of 0.006σ. This compression reduces the computational cost of a single likelihood evaluation by a factor of 5, from 38 to 7.5 CPU seconds, and it also results in a more robust likelihood by implicitly regularizing nearly degenerate modes. Finally, we use the same compression framework to formulate a numerically stable and computationally efficient variation of the Quadratic Maximum Likelihood implementation, which requires less than 3 GB of memory and 2 CPU minutes per iteration for ℓ ≤ 32, rendering low-ℓ QML CMB power spectrum analysis fully tractable on a standard laptop.
Markov chain Monte Carlo without likelihoods.
Marjoram, Paul; Molitor, John; Plagnol, Vincent; Tavare, Simon
2003-12-23
Many stochastic simulation approaches for generating observations from a posterior distribution depend on knowing a likelihood function. However, for many complex probability models, such likelihoods are either impossible or computationally prohibitive to obtain. Here we present a Markov chain Monte Carlo method for generating observations from a posterior distribution without the use of likelihoods. It can also be used in frequentist applications, in particular for maximum-likelihood estimation. The approach is illustrated by an example of ancestral inference in population genetics. A number of open problems are highlighted in the discussion.
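The mechanism described above can be shown in a toy likelihood-free (ABC-style) Metropolis chain: instead of evaluating a likelihood, each proposed parameter is accepted only if data simulated under it falls within a tolerance of the observed summary statistic. The binomial model, tolerance, and tuning constants below are illustrative choices, not the paper's population-genetics example; with a uniform prior and symmetric proposals the prior and proposal ratios are 1, so the simulation check is the entire acceptance rule.

```python
import random

random.seed(0)

N_OBS, K_OBS = 100, 37            # observed data: 37 successes in 100 trials

def simulate(p):
    """Draw a synthetic dataset under parameter p -- the only model access needed."""
    return sum(random.random() < p for _ in range(N_OBS))

def abc_mcmc(n_steps=20000, tol=2, step=0.05):
    """Metropolis chain, uniform(0,1) prior, symmetric proposals: accept a
    proposal iff simulated data lands within tol of the observed count,
    so no likelihood is ever evaluated."""
    p, samples = 0.5, []
    for _ in range(n_steps):
        prop = p + random.uniform(-step, step)
        if 0.0 < prop < 1.0 and abs(simulate(prop) - K_OBS) <= tol:
            p = prop
        samples.append(p)
    return samples

samples = abc_mcmc()
posterior_mean = sum(samples[2000:]) / len(samples[2000:])   # drop burn-in
print(posterior_mean)
```

The chain's stationary distribution approximates the posterior of p given the observed count (exactly so as tol → 0), and the estimated posterior mean lands near the observed success fraction of 0.37.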
NASA Astrophysics Data System (ADS)
Isyaku, Aminu A.; Rust, Derek; Teeuw, Richard; Whitworth, Malcolm
2016-09-01
Structural and stratigraphic mapping within the Bornu Basin in north-east Nigeria has commonly been carried out using traditional field geological methods. However, such traditional approaches remain inadequate in this semi-arid region, characterised by topographically flat terrain and a lack of continuous bedrock outcrops, which are mostly concealed beneath sand cover. Previous studies in the north-eastern part of the basin, carried out using ditch cuttings from a few wells and disconnected seismic data, were largely inadequate, and the resulting stratigraphic analyses were more often generalised. This paper presents an integrated structural and stratigraphic study of the basin using combined subsurface geophysical datasets. The Combined Log Pattern (CLP) method is a well log analysis approach that utilises various well log data, including gamma ray, resistivity, bulk density and sonic logs, to identify lithology and stratigraphic boundaries of subsurface formations. This method is applied to constrain the subsurface stratigraphy of the north-eastern part of the Bornu Basin bordering Lake Chad. In addition to qualitative combined well log analysis, the time-depth relationship of the sonic log and seismic data was quantitatively determined by tying a well to an intersecting seismic section to validate the stratigraphic facies horizons identified. Four well log facies and their environments of deposition were characterised from the combined well log analysis of the different log types. The Cretaceous basement structural features are found to have controlled the deposition of the overlying formations in the basin. Without intact core data, the shallower wells were found to have bottomed over subsurface horst features, while deeper wells penetrated the basal facies contained mainly within the grabens. The main subsurface structural lineaments in the area include NW-SE, NE-SW and NNW-SSE trending faults, which mainly formed the horst and graben features. Some stratigraphic formations
Vestige: Maximum likelihood phylogenetic footprinting
Wakefield, Matthew J; Maxwell, Peter; Huttley, Gavin A
2005-01-01
Background Phylogenetic footprinting is the identification of functional regions of DNA by their evolutionary conservation. This is achieved by comparing orthologous regions from multiple species and identifying the DNA regions that have diverged less than neutral DNA. Vestige is a phylogenetic footprinting package built on the PyEvolve toolkit that uses probabilistic molecular evolutionary modelling to represent aspects of sequence evolution, including the conventional divergence measure employed by other footprinting approaches. In addition to measuring the divergence, Vestige allows the expansion of the definition of a phylogenetic footprint to include variation in the distribution of any molecular evolutionary processes. This is achieved by displaying the distribution of model parameters that represent partitions of molecular evolutionary substitutions. Examination of the spatial incidence of these effects across regions of the genome can identify DNA segments that differ in the nature of the evolutionary process. Results Vestige was applied to a reference dataset of the SCL locus from four species and provided clear identification of the known conserved regions in this dataset. To demonstrate the flexibility to use diverse models of molecular evolution and dissect the nature of the evolutionary process Vestige was used to footprint the Ka/Ks ratio in primate BRCA1 with a codon model of evolution. Two regions of putative adaptive evolution were identified illustrating the ability of Vestige to represent the spatial distribution of distinct molecular evolutionary processes. Conclusion Vestige provides a flexible, open platform for phylogenetic footprinting. Underpinned by the PyEvolve toolkit, Vestige provides a framework for visualising the signatures of evolutionary processes across the genome of numerous organisms simultaneously. By exploiting the maximum-likelihood statistical framework, the complex interplay between mutational processes, DNA repair and
ERIC Educational Resources Information Center
Hurst, Beth
2005-01-01
Learning logs, or reading response logs, have long been established as an effective reading strategy that helps students learn from text (Atwell, 1987; Blough & Berman, 1991; Calkins, 1986; Commander & Smith, 1996; Kuhrt & Farris, 1990; Reed, 1988; Sanders, 1985). In this paper, the author describes her experiences using learning logs as a…
Log-Linear Models for Gene Association
Hu, Jianhua; Joshi, Adarsh; Johnson, Valen E.
2009-01-01
We describe a class of log-linear models for the detection of interactions in high-dimensional genomic data. This class of models leads to a Bayesian model selection algorithm that can be applied to data that have been reduced to contingency tables using ranks of observations within subjects, and discretization of these ranks within gene/network components. Many normalization issues associated with the analysis of genomic data are thereby avoided. A prior density based on Ewens’ sampling distribution is used to restrict the number of interacting components assigned high posterior probability, and the calculation of posterior model probabilities is expedited by approximations based on the likelihood ratio statistic. Simulation studies are used to evaluate the efficiency of the resulting algorithm for known interaction structures. Finally, the algorithm is validated in a microarray study for which it was possible to obtain biological confirmation of detected interactions. PMID:19655032
Code of Federal Regulations, 2012 CFR
2012-10-01
... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...
Code of Federal Regulations, 2013 CFR
2013-10-01
... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...
Code of Federal Regulations, 2011 CFR
2011-10-01
... issued a demand letter for protective action. (b) Since the time required to drill and produce a... evaluate the characteristics and performance of the draining well; (2) Rig availability; (3) Well depth; (4... to drill; and (6) Weather conditions. (c) If BLM determines that you did not take protection...
Glassmire, David M; Tarescavage, Anthony M; Gottfried, Emily D
2016-12-01
The Miller Forensic Assessment of Symptoms Test (M-FAST) was designed as a screening measure for feigned psychiatric symptoms. When M-FAST Total Scores are elevated (raw score ≥6), the test manual recommends follow-up with a more comprehensive measure of feigning, such as the widely used and researched Structured Interview of Reported Symptoms (SIRS) or the revised version of the test (SIRS-2). The purpose of the current study was to evaluate how often M-FAST screening elevations are associated with subsequent elevations on the SIRS or SIRS-2. The sample included archival data from 100 forensic psychiatric inpatients who obtained M-FAST Total Score elevations ≥6 during screening and were subsequently administered the SIRS (which was also rescored using SIRS-2 criteria). Among examinees who elevated the M-FAST over the recommended cutoff, 66.0% met standard SIRS feigning criteria, 42% met SIRS-2 criteria for feigning, and 81.0% obtained at least 1 SIRS/SIRS-2 elevation in the Probable Feigning range or higher. These results are consistent with the M-FAST manual guidelines, which support the use of the ≥6 M-FAST cutoff score to screen for potential feigning (but not as an independent marker of feigning). A higher M-FAST cutoff score of ≥16 was associated with subsequently meeting full SIRS criteria for feigning in 100.0% of protocols. Because the SIRS criteria were designed to have very low false positive rates, these findings indicate that more confident assertions about feigning can be made when elevations reach this level on the M-FAST.
Schweder, Tore
2003-12-01
Maximum likelihood estimates of abundance are obtained from repeated photographic surveys of a closed stratified population with naturally marked and unmarked individuals. Capture intensities are assumed log-linear in stratum, year, and season. In the chosen model, an approximate confidence distribution for the total abundance of bowhead whales, with an accompanying likelihood from which nuisance parameters have been eliminated, is found from a parametric bootstrap experiment. The confidence distribution depends on the assumed study protocol. A confidence distribution that is exact (except for the effect of discreteness) is found by conditioning in the unstratified case without unmarked individuals.
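The stratified log-linear capture model of the abstract is too involved for a short sketch, but the mark-resight principle it builds on can be illustrated with the classical Lincoln-Petersen estimator. All numbers below are hypothetical and the estimator is a simplified stand-in, not the authors' model:

```python
def lincoln_petersen(n_marked, n_resighted, n_recaptured):
    """Classical closed-population abundance estimate from two surveys:
    n_marked individuals identified in survey 1, n_resighted photographed
    in survey 2, of which n_recaptured match known marks. (Chapman's
    bias-corrected variant is preferred when recaptures are few.)"""
    return n_marked * n_resighted / n_recaptured

# Hypothetical surveys: 100 naturally marked whales, 50 photographed later,
# 10 of them matching known marks -> abundance estimate of 500
estimate = lincoln_petersen(100, 50, 10)
```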
NASA Astrophysics Data System (ADS)
Halabe, Udaya B.; Pyakurel, Sandeep
2007-03-01
There has been a lack of effective NDE techniques to locate internal defects within wooden logs. The few available elastic-wave-propagation-based techniques are limited to predicting modulus of elasticity (E) values. Other techniques, such as X-rays, have not been very successful in detecting internal defects in logs. If defects such as embedded metals could be identified before the sawing process, saw mills could significantly increase their production by reducing the probability of damage to the saw blade and the associated downtime and repair cost. Also, if internal defects such as knots and decayed areas could be identified in logs, the sawing blade could be oriented to exclude the defective portion and optimize the volume of high-valued lumber obtained from the logs. In this research, GPR has been successfully used to locate internal defects (knots, decays and embedded metals) within logs. This paper discusses GPR imaging and mapping of the internal defects using both 2D and 3D interpretation methodologies. Metal pieces were inserted in a log and the reflection patterns from these metals were interpreted from the radargrams acquired using a 900 MHz antenna. GPR was also able to accurately identify the location of knots and decays. Scans from several orientations of the log were collected to generate a 3D cylindrical volume. The actual locations of the defects showed good correlation with the interpreted defects in the 3D volume. The time/depth slices from the 3D cylindrical volume data were useful in understanding the extent of defects inside the log.
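For context, GPR converts a reflector's two-way travel time to depth using the wave speed in the material, which depends on its relative permittivity. A minimal sketch of that conversion (the permittivity value is illustrative, not taken from the paper):

```python
def gpr_reflector_depth(two_way_time_s, rel_permittivity):
    """Depth of a GPR reflector: wave speed v = c / sqrt(eps_r), and the
    pulse travels down and back, hence the factor of 2."""
    c = 3.0e8  # speed of light in vacuum, m/s
    v = c / rel_permittivity ** 0.5
    return v * two_way_time_s / 2.0

# e.g. a reflection arriving at 10 ns in wood with an assumed eps_r of 4:
depth_m = gpr_reflector_depth(10e-9, 4.0)
```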
Log(s) physics results from CDF
Not Available
1989-11-08
The Collider Detector at Fermilab (CDF) is a large, azimuthally symmetric detector designed to study p̄p interactions at the Fermilab Tevatron Collider. Results are presented from data taken with a minimum bias trigger at √s = 630 and 1800 GeV during the 1987 run. The topics include the current analysis of dn/dη and some very preliminary results on short-range pseudorapidity correlations and Bose-Einstein correlations. 7 refs., 5 figs., 2 tabs.
Hynes, Scott M; Wickremsinhe, Enaksha; Zhang, Wei; Decker, Rodney; Ott, Jennifer; Chandler, Jason; Mitchell, Malcolm
2015-01-01
LY2603618 is a selective inhibitor of the deoxyribonucleic acid damage checkpoint kinase 1 (CHK1) and has been in development for the enhancement of chemotherapeutic agents. The study described here assessed the potential interaction between LY2603618 and the cytochrome P450 isoform 2D6 (CYP2D6) substrate desipramine in patients with cancer. Before clinical investigation, in silico simulations (using Simcyp®) were conducted. An open-label, two-period, fixed-sequence study was planned in 30 patients with advanced or metastatic cancers, in which a 50 mg oral dose of desipramine was administered alone and in combination with 275 mg of LY2603618 (i.v. infusion). An interim analysis was planned after 15 patients completed both periods. Ratios of geometric least squares means (LSMs) of primary pharmacokinetic (PK) parameters and 90% repeated confidence intervals (RCIs) between desipramine plus LY2603618 and desipramine alone were calculated. Lack of an interaction was declared if the 90% RCI fell between 0.8 and 1.25. The LSM ratios (90% RCIs) for the areas under the plasma concentration-time curve from time zero to tlast (AUC[0-tlast]) and to infinity (AUC[0-∞]) and the maximum plasma concentration (Cmax) were 1.14 (1.04, 1.25), 1.09 (0.99, 1.21) and 1.16 (1.05, 1.29), respectively. The in silico simulations were predictive of the clinical results. Single doses of 275 mg LY2603618 administered with 50 mg desipramine were generally well tolerated. In conclusion, no clinically significant interaction was observed between LY2603618 and desipramine in patients with cancer. The in silico predictions demonstrated that mechanistic and physiologically based PK approaches may inform clinical study design in cancer patients.
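The geometric-mean-ratio analysis can be sketched on paired log-scale data. This toy version uses a plain normal-approximation 90% interval rather than the trial's repeated confidence intervals, and all PK values are hypothetical:

```python
import math
import statistics

def geometric_mean_ratio(test, ref, z90=1.645):
    """Geometric mean ratio of paired PK values (test vs. reference), with
    a normal-approximation 90% CI computed on the log scale. A simplified
    sketch of an LSM-ratio analysis, not the study's exact RCI procedure."""
    d = [math.log(t) - math.log(r) for t, r in zip(test, ref)]
    m = statistics.mean(d)
    se = statistics.stdev(d) / math.sqrt(len(d)) if len(d) > 1 else 0.0
    return math.exp(m), math.exp(m - z90 * se), math.exp(m + z90 * se)

# Hypothetical paired AUC values (combination vs. desipramine alone);
# no interaction would be declared if (lo, hi) lies within (0.80, 1.25)
gmr, lo, hi = geometric_mean_ratio([22.8, 43.6, 61.0], [20.0, 40.0, 55.0])
```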
Coal log pipeline: Development status of the first commercial system
Marrero, T.R.
1996-12-31
The coal log pipeline (CLP) is an innovative means for long-distance transportation of coal. In the CLP concept, coal is pressed into the form of cylinders--coal logs--that are propelled by water flowing through underground pipe. A coal log pipeline has many advantages when compared to coal transport by unit train, slurry pipeline and long-distance trucking: low cost, low energy consumption, low water consumption, simple dewatering at pipeline exit, safe, and environmentally friendly. The coal logs travel butted together, as trains. Between the coal log "trains," some space is allowed for valve switching. The optimum diameter of a coal log is approximately 90 to 95% of the inside diameter of the pipe. The coal-to-water ratio is about 4 to 1. A 200 mm diameter CLP can transport about 2 million tonnes of coal per year. The coal logs at their destination come out of the pipeline onto a moving conveyer which transports the logs to a crusher or stock pile. Coal logs are crushed to match the size of existing fuel. The water effluent is treated and reused at the power plant; there is no need for its discharge. Coal logs can be manufactured with and without the use of binder. By using less than 2 percent emulsified asphalt as binder, no heat is required to compact coal logs. Binderless coal logs can be compacted at less than 90°C. Compaction pressures, for coal logs made with or without binder, are about 70 MPa. The coal particle size distribution and moisture content must be controlled. The economics of the coal log pipeline system have been studied. Results indicate that a new coal log pipeline is cost-competitive with existing railroads for distances greater than approximately 80 km. CLP is much more economical than a coal slurry pipeline of the same diameter. This paper describes the current R&D and commercialization plan for CLP. 4 refs.
NASA Astrophysics Data System (ADS)
Townend, J.; Sutherland, R.; Toy, V.; Doan, M. L.; Celerier, B. P.; Massiot, C.; Coussens, J.; Capova, L.; Jeppson, T.
2015-12-01
Fault rock assemblages reflect the combined effects of interaction between deformation, stress, temperature, fluid, and chemical regimes on distinct spatial and temporal scales at various positions in the crust. Here we summarize the petrophysical, structural, and hydrogeological characteristics of the Alpine Fault's hanging wall based on measurements made during the second stage of the Deep Fault Drilling Project (DFDP-2) in late 2014. In total, more than 19 km of wireline logging data including 4.8 km of borehole televiewer imagery were acquired during the three-month drilling operation, which reached a maximum depth of 893 m (817 m true vertical depth). In conjunction with manual and automatic measurements of hydraulic parameters, on-site analysis of rock cuttings and fluid geochemistry, and post-drilling temperature measurements, the wireline data permit detailed analysis of bulk rock properties, fracturing, and fault zone structure. We focus in particular on understanding how the hydraulic conductivity of individual fractures relates to the high bulk conductivity of the hanging-wall, and thus to the transport of fluids and heat through the shallow crust.
Swift, T.E.; Goodrich, J.H.; Kumar, R.M.; McCoy, R.L.; Wilhelm, M.H.; Glascock, M.R.
1982-01-01
The coring, logging and testing of Bennett Ranch Unit well No. 310 was a cooperative effort between Texas Pacific, owner of the well, and Gruy Federal, Inc. The requirements of the contract, which are summarized in Enclosure 1, Appendix A, include drilling and coring activities. The pressure-coring and associated logging and testing programs in selected wells are intended to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report presents detailed information on the first such project. This project demonstrates the usefulness of integrating pressure core, log and production data to realistically evaluate a reservoir for carbon dioxide flood. The engineering of tests and analysis of such experimental data requires original thinking, but the reliability of the results is higher than data derived from conventional tests.
3. Log bunkhouse (far left), log chicken house (left of ...
3. Log bunkhouse (far left), log chicken house (left of center), equipment shed (center), and workshop (far right). View to northwest. - William & Lucina Bowe Ranch, County Road 44, 0.1 mile northeast of Big Hole River Bridge, Melrose, Silver Bow County, MT
Censored Median Regression and Profile Empirical Likelihood
Subramanian, Sundarraman
2007-01-01
We implement profile empirical likelihood based inference for censored median regression models. Inference for any specified sub-vector is carried out by profiling out the nuisance parameters from the "plug-in" empirical likelihood ratio function proposed by Qin and Tsao. To obtain the critical value of the profile empirical likelihood ratio statistic, we first investigate its asymptotic distribution. The limiting distribution is a sum of weighted chi square distributions. Unlike the full empirical likelihood, however, the derived asymptotic distribution has an intractable covariance structure. We therefore employ the bootstrap to obtain the critical value, and compare the resulting confidence intervals with the ones obtained through Basawa and Koul's minimum dispersion statistic. Furthermore, we obtain confidence intervals for the age and treatment effects in a lung cancer data set. PMID:19112527
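The bootstrap calibration step can be sketched generically. This is not the authors' weighted chi-square scheme, just the common percentile idea: recompute the statistic on resamples and read off an empirical quantile as the critical value:

```python
import random

def bootstrap_critical_value(stat, data, level=0.95, n_boot=2000, seed=0):
    """Percentile critical value for a statistic whose null distribution
    is analytically awkward: evaluate `stat` on n_boot resamples (drawn
    with replacement) and return the `level` empirical quantile."""
    rng = random.Random(seed)
    vals = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    return vals[min(n_boot - 1, int(level * n_boot))]
```

In practice one would bootstrap the profile empirical likelihood ratio itself; here `stat` is any callable taking a resampled data list.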
Logging Concessions Enable Illegal Logging Crisis in the Peruvian Amazon
Finer, Matt; Jenkins, Clinton N.; Sky, Melissa A. Blue; Pine, Justin
2014-01-01
The Peruvian Amazon is an important arena in global efforts to promote sustainable logging in the tropics. Despite recent efforts to achieve sustainability, such as provisions in the US–Peru Trade Promotion Agreement, illegal logging continues to plague the region. We present evidence that Peru's legal logging concession system is enabling widespread illegal logging via the regulatory documents designed to ensure sustainable logging. Analyzing official government data, we found that 68.3% of all concessions supervised by authorities were suspected of major violations. Of the 609 total concessions, nearly 30% have been cancelled for violations, and we expect this percentage to increase as investigations continue. Moreover, the nature of the violations indicates that the permits associated with legal concessions are used to harvest trees in unauthorized areas, thus threatening all forested areas. Many of the violations pertain to the illegal extraction of CITES-listed timber species outside authorized areas. These findings highlight the need for additional reforms. PMID:24743552
Maximum-Likelihood Detection Of Noncoherent CPM
NASA Technical Reports Server (NTRS)
Divsalar, Dariush; Simon, Marvin K.
1993-01-01
Simplified detectors are proposed for use in maximum-likelihood sequence detection of symbols in an alphabet of size M transmitted by uncoded, full-response continuous phase modulation (CPM) over a radio channel with additive white Gaussian noise. The receiver structures are derived from a particular interpretation of the maximum-likelihood metrics. The receivers include front ends, the structures of which depend only on M, analogous to those in receivers of coherent CPM. The parts of the receivers following the front ends have structures whose complexity depends on N.
Quasi-likelihood for Spatial Point Processes
Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus
2014-01-01
Fitting regression models for intensity functions of spatial point processes is of great interest in ecological and epidemiological studies of association between spatially referenced events and geographical or environmental covariates. When Cox or cluster process models are used to accommodate clustering not accounted for by the available covariates, likelihood based inference becomes computationally cumbersome due to the complicated nature of the likelihood function and the associated score function. It is therefore of interest to consider alternative more easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation which in practice is solved numerically. The derivation of the optimal estimating function has close similarities to the derivation of quasi-likelihood for standard data sets. The approximate solution is further equivalent to a quasi-likelihood score for binary spatial data. We therefore use the term quasi-likelihood for our optimal estimating function approach. We demonstrate in a simulation study and a data example that our quasi-likelihood method for spatial point processes is both statistically and computationally efficient. PMID:26041970
Prioritizing Rare Variants with Conditional Likelihood Ratios
Li, Weili; Dobbins, Sara; Tomlinson, Ian; Houlston, Richard; Pal, Deb K.; Strug, Lisa J.
2016-01-01
Background Prioritizing individual rare variants within associated genes or regions often consists of an ad hoc combination of statistical and biological considerations. From the statistical perspective, rare variants are often ranked using Fisher’s exact p values, which can lead to different rankings of the same set of variants depending on whether 1- or 2-sided p values are used. Results We propose a likelihood ratio-based measure, maxLRc, for the statistical component of ranking rare variants under a case-control study design that avoids the hypothesis-testing paradigm. We prove analytically that the maxLRc is always well-defined, even when the data has zero cell counts in the 2×2 disease-variant table. Via simulation, we show that the maxLRc outperforms Fisher’s exact p values in most practical scenarios considered. Using next-generation sequence data from 27 rolandic epilepsy cases and 200 controls in a region previously shown to be linked to and associated with rolandic epilepsy, we demonstrate that rankings assigned by the maxLRc and exact p values can differ substantially. Conclusion The maxLRc provides reliable statistical prioritization of rare variants using only the observed data, avoiding the need to specify parameters associated with hypothesis testing that can result in ranking discrepancies across p value procedures; and it is applicable to common variant prioritization. PMID:25659987
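The authors' maxLRc is not specified in enough detail here to reproduce, but the underlying binomial likelihood-ratio idea for a 2×2 disease-variant table can be sketched, including the zero-cell case that the abstract highlights (the `0·log 0 = 0` convention keeps the statistic finite):

```python
import math

def log_lr(a, n1, b, n0):
    """Binomial log-likelihood ratio for a 2x2 variant table: a of n1 cases
    and b of n0 controls carry the variant; unconstrained frequencies vs. a
    shared frequency. Generic sketch, not the authors' maxLRc measure."""
    def ll(k, n, p):
        out = 0.0                       # 0 * log(0) treated as 0
        if k > 0:
            out += k * math.log(p)
        if n - k > 0:
            out += (n - k) * math.log(1.0 - p)
        return out
    p1, p0, pooled = a / n1, b / n0, (a + b) / (n1 + n0)
    return (ll(a, n1, p1) + ll(b, n0, p0)
            - ll(a, n1, pooled) - ll(b, n0, pooled))
```

A zero cell (e.g. `log_lr(5, 10, 0, 10)`) is still well-defined, in line with the abstract's point that the ranking remains usable where exact-test conventions diverge.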
Likelihood analysis of the Local Group acceleration
NASA Astrophysics Data System (ADS)
Schmoldt, I.; Branchini, E.; Teodoro, L.; Efstathiou, G.; Frenk, C. S.; Keeble, O.; McMahon, R.; Maddox, S.; Oliver, S.; Rowan-Robinson, M.; Saunders, W.; Sutherland, W.; Tadros, H.; White, S. D. M.
1999-04-01
We compute the acceleration of the Local Group using 11 206 IRAS galaxies from the recently completed all-sky PSCz redshift survey. Measuring the acceleration vector in redshift space generates systematic uncertainties caused by the redshift-space distortions in the density field. We therefore assign galaxies to their real-space positions by adopting a non-parametric model for the velocity field that relies solely on the linear gravitational instability (GI) and linear biasing hypotheses. Remaining systematic contributions to the measured acceleration vector are corrected for by using PSCz mock catalogues from N-body experiments. The resulting acceleration vector points ~15° away from the CMB dipole apex, with a remarkable alignment between small- and large-scale contributions. A considerable fraction (~65 per cent) of the measured acceleration is generated within 40 h^-1 Mpc, with a non-negligible contribution from scales between 90 and 140 h^-1 Mpc, after which the acceleration amplitude seems to have converged. The Local Group acceleration from PSCz appears to be consistent with the one determined from the IRAS 1.2-Jy galaxy catalogue once the different contributions from shot noise have been taken into account. The results are consistent with the gravitational instability hypothesis and do not indicate any strong deviations from the linear biasing relation on large scales. A maximum-likelihood analysis of the cumulative PSCz dipole is performed within a radius of 150 h^-1 Mpc, in which we account for non-linear effects, shot noise and finite sample size. The aim is to constrain the β = Ω^0.6/b parameter and the power spectrum of density fluctuations. We obtain β = 0.70^{+0.35}_{-0.20} at the 1σ confidence level. The likelihood analysis is not very sensitive to the shape of the power spectrum, because of the rise in the amplitude of the dipole beyond 40 h^-1 Mpc and the increase in shot noise on large scales. There is, however, a weak indication that within the
Well Logging with Californium-252
Boulogne, A.R.
2003-01-06
Californium-252 is an intense neutron emitter that has only recently become available for experimental well logging. The purpose of this research is to investigate the application of well logging to groundwater hydrology; however, most of the techniques and purposes are quite similar to applications in the petroleum industry.
NASA Astrophysics Data System (ADS)
Schuchardt, A.; Morche, D.
2015-12-01
Rivers cover only a small part of the Earth's surface, yet they transfer sediment in globally significant quantities. In mountainous regions, the majority of the total channel length occurs in headwater streams. Those mountain channels are influenced in terms of sediment connectivity by processes on the slopes. For example, in such a sediment routing system, sediment originating from debris flows on the slopes is delivered along sediment pathways to the channel system and can be transported further downstream as solid load. Interruption of instream coarse sediment connectivity is closely related to the existence of channel-blocking barriers, which can also be formed by biota. By storing sediment, large wood (LW) log jams disrupt in-channel sediment connectivity. We present a study design to decipher the short- to long-term (c. 10^-2 to 10^2 years) sediment (dis)connectivity effects of large wood. The study areas are two basins in mountain ranges in Germany and Austria. In Austria, the drainage area of the river Fugnitz was chosen, which is located in the National Park Thayatal. The other drainage area, of the river Sieber in Saxony-Anhalt, Germany, is located in the Harz National Park. Since studies on LW and its geomorphological effects in Central European rivers are still rare, the main goals of the project are:
• to identify important triggers for LW transport from slopes into the channels
• to examine the spatial distribution and characterization of LW in main and slope channels by mapping and dGPS measurements
• to determine the effects of LW on channel hydraulic parameters (e.g. slope, width, grain size composition, roughness) by field measurements of channel long profiles and cross sections with dGPS and Wolman particle counts
• to quantify the direct effects of LW on discharge and bed load transport by measuring flow velocity with an Ott-Nautilus current meter and to measure bed load up- and downstream of log jams using a portable Helley
Well log evaluation of gas hydrate saturations
Collett, Timothy S.
1998-01-01
The amount of gas sequestered in gas hydrates is probably enormous, but estimates are highly speculative due to the lack of previous quantitative studies. Gas volumes that may be attributed to a gas hydrate accumulation within a given geologic setting are dependent on a number of reservoir parameters; one of which, gas-hydrate saturation, can be assessed with data obtained from downhole well logging devices. The primary objective of this study was to develop quantitative well-log evaluation techniques which will permit the calculation of gas-hydrate saturations in gas-hydrate-bearing sedimentary units. The "standard" and "quick look" Archie relations (resistivity log data) yielded accurate gas-hydrate and free-gas saturations within all of the gas hydrate accumulations assessed in the field verification phase of the study. Compressional wave acoustic log data have been used along with the Timur, modified Wood, and the Lee weighted average acoustic equations to calculate accurate gas-hydrate saturations in this study. The well log derived gas-hydrate saturations calculated in the field verification phase of this study, which range from as low as 2% to as high as 97%, confirm that gas hydrates represent a potentially important source of natural gas.
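The "standard" Archie approach can be sketched as follows; the parameter values a, m, n are common illustrative defaults, not those calibrated in the study:

```python
def archie_hydrate_saturation(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation from the Archie relation,
    Sw = ((a * Rw) / (phi**m * Rt))**(1/n),
    with gas hydrate assumed to fill the remaining pore space."""
    sw = ((a * rw) / (phi ** m * rt)) ** (1.0 / n)
    return 1.0 - min(sw, 1.0)

# e.g. formation resistivity Rt = 100 ohm-m, pore-water resistivity
# Rw = 1 ohm-m, porosity 50% -> hydrate saturation of 0.8
s_hydrate = archie_hydrate_saturation(100.0, 1.0, 0.5)
```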
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C.; Zimmermann, A.; Korup, O.; Iroume, A.; Francke, T.; Bronstert, A.
2013-12-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but overlook potentially important process responses on shorter intervals immediately following timber harvest. We resolve such dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3-min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km²) forested Chilean catchments that were logged during either the rainy or the dry season. The QRF method builds on the Random Forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors, which in turn provides model uncertainties. We find that, where no logging occurred, ~80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. Dry-season logging in particular dampened the role of these rare, extreme sediment-transport events by increasing transport efficiency during moderate events. We conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging responses of water and sediment fluxes at high temporal resolution.
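A real QRF aggregates empirical response distributions over many randomized trees; the single-split toy below (all data hypothetical) shows only the core idea of reading a conditional quantile off the responses that share a leaf with the query point:

```python
def leaf_quantile(xs, ys, split, x_query, q):
    """Nearest-rank q-quantile of the responses falling on the same side
    of a single split as the query point -- a one-node stand-in for a
    quantile regression forest."""
    same_leaf = sorted(y for x, y in zip(xs, ys)
                       if (x <= split) == (x_query <= split))
    idx = min(len(same_leaf) - 1, int(q * len(same_leaf)))
    return same_leaf[idx]

# Hypothetical discharge (x) vs. sediment load (y) observations:
xs = [1, 2, 3, 10, 11, 12]
ys = [5, 6, 4, 100, 110, 120]
median_high_flow = leaf_quantile(xs, ys, split=5, x_query=11, q=0.5)
```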
NASA Technical Reports Server (NTRS)
Piccinotti, G.; Mushotzky, R. F.; Boldt, E. A.; Holt, S. S.; Marshall, F. E.; Serlemitsos, P. J.; Shafer, R. A.
1982-01-01
An all-sky survey of X-ray sources was performed, complete to a limiting sensitivity of 3.1 x 10^-11 ergs cm^-2 s^-1 in the 2-10 keV band. The complete sample has allowed construction of luminosity functions based on a flux-limited sample for clusters of galaxies and active galactic nuclei. Integration of the best-fit luminosity functions indicates that clusters of galaxies contribute about 4% of the 2-10 keV DXRB, and active galactic nuclei about 20%. It is predicted that many of the objects seen in the deep survey should be local, relatively low luminosity active galactic nuclei and clusters of galaxies.
[Using log-binomial model for estimating the prevalence ratio].
Ye, Rong; Gao, Yan-hui; Yang, Yi; Chen, Yue
2010-05-01
To estimate prevalence ratios, we used a log-binomial model with or without continuous covariates. Prevalence ratios for individuals' attitudes towards smoking-ban legislation associated with smoking status, estimated using a log-binomial model, were compared with odds ratios estimated by a logistic regression model. In the log-binomial modelling, the maximum likelihood method was used when there were no continuous covariates, and the COPY approach was used if the model did not converge, for example due to the presence of continuous covariates. We examined the association between individuals' attitudes towards smoking-ban legislation and smoking status in men and women. Prevalence ratio and odds ratio estimation provided similar results for the association in women, since smoking was not common. In men, however, the odds ratio estimates were markedly larger than the prevalence ratios due to the higher prevalence of the outcome. The log-binomial model did not converge when age was included as a continuous covariate, and the COPY method was used to deal with this situation. All analyses were performed in SAS. The prevalence ratio seemed to measure the association better than the odds ratio when prevalence was high. SAS programs are provided to calculate prevalence ratios with or without continuous covariates in log-binomial regression analysis.
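The contrast the authors describe can be seen directly from a 2×2 table (counts hypothetical); with a common outcome the odds ratio drifts away from the prevalence ratio. A full log-binomial fit with covariates would instead be a GLM with a binomial family and log link:

```python
def prevalence_and_odds_ratio(a, b, c, d):
    """2x2 table: a/b = exposed with/without the outcome,
    c/d = unexposed with/without. Returns (prevalence ratio, odds ratio)."""
    p_exposed = a / (a + b)
    p_unexposed = c / (c + d)
    return p_exposed / p_unexposed, (a * d) / (b * c)

# High-prevalence outcome: the OR (2.25) overstates the PR (1.5)
pr, odds = prevalence_and_odds_ratio(60, 40, 40, 60)
```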
Likelihood Principle and Maximum Likelihood Estimator of Location Parameter for Cauchy Distribution.
1986-05-01
The consistency (or strong consistency) of the maximum likelihood estimator has been studied by many researchers, for example, Wald (1949) and Wolfowitz (1953, 1965). [24] Wald, A. (1949). Note on the consistency of maximum likelihood estimates. Ann. Math. Statist., Vol. 20, 595-601. [25] Wolfowitz, J. (1953). The method of maximum likelihood and Wald theory of decision functions. Indag. Math., Vol. 15, 114-119. [26] ... Probability Letters, Vol. 1, No. 3, 197-202.
Bonner, B; Hutchings, L; Kasameyer, P
2006-06-14
We devise a strategy for analysis of Vp and Vs microearthquake tomography results in the Salton Sea geothermal field to identify important features of the geothermal reservoir. We first interpret rock properties in the State 2-14 borehole based upon logged core through the reservoir. Then, we interpret seismic recordings in the well (Daley et al., 1988) to develop the strategy. We hypothesize that mapping Poisson's ratio has two applications for the Salton Sea geothermal reservoir: (1) to map the top of the reservoir, and (2) as a diagnostic for permeable zones. Poisson's ratio can be obtained from Vp and Vs. In the State 2-14 borehole, Poisson's ratio calculated from large-scale averages (~150 m) shows a monotonic decrease with depth to about 1300 m, at which point it increases with depth. Our model is that the monotonic decrease is due to compaction, and the increase below 1300 m is due to the rocks being hydrothermally altered. We hypothesize we can map the depth to alteration by identifying the transition from decreasing to increasing values, and thus map the top of the reservoir, which is associated with a known increase in sulfite, chlorite, and epidote alteration that may be indicative of hydrothermal activity. We also observe (from the Daley et al. plots) an anomalous drop in Poisson's ratio at a depth of about 900 m, within a sandstone formation. The sandstone has a P-wave velocity significantly higher than the siltstone above it, but a lower velocity in the lower half of the formation relative to the upper half. We interpret the relative decrease in velocity to be due to fracturing and chemical alteration caused by permeability. We conclude that using Vp and Vs tomography results to obtain images of Poisson's ratio has the potential to identify significant features in the geothermal reservoir in this geologic setting. Seismic attenuation tomography results (mapped as Qp and Qs) should also be useful for evaluating geothermal reservoirs, but that is not
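Under isotropic elasticity, Poisson's ratio follows from the Vp/Vs ratio alone; a minimal sketch of the relation used in such analyses:

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave velocities (isotropic medium):
    nu = (Vp^2 - 2 Vs^2) / (2 (Vp^2 - Vs^2))."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# A Vp/Vs ratio of sqrt(3) corresponds to nu = 0.25:
nu = poissons_ratio(3.0 ** 0.5, 1.0)
```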
MAIL LOG, program summary and specifications
NASA Technical Reports Server (NTRS)
Harris, D. K.
1979-01-01
The summary and specifications for obtaining the software package MAIL LOG, developed for the Scout Project Automatic Data System (SPADS), are provided. The MAIL LOG program has four modes of operation: (1) input - putting new records into the data base; (2) revise - changing or modifying existing records in the data base; (3) search - finding special records existing in the data base; and (4) archive - store or put away existing records in the data base. The output includes special printouts of records in the data base and results from the input and search modes.
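The four modes could be mirrored by a small record store such as the following; this is a hypothetical reconstruction for illustration, since the actual MAIL LOG interface is not described beyond its modes:

```python
class MailLog:
    """Toy record store mirroring the four MAIL LOG modes of operation."""
    def __init__(self):
        self.records, self.archived = {}, {}

    def input(self, key, record):       # (1) put a new record into the base
        self.records[key] = dict(record)

    def revise(self, key, **changes):   # (2) change an existing record
        self.records[key].update(changes)

    def search(self, predicate):        # (3) find matching records
        return [k for k, r in self.records.items() if predicate(r)]

    def archive(self, key):             # (4) store away an existing record
        self.archived[key] = self.records.pop(key)

log = MailLog()
log.input("m1", {"subject": "status", "read": False})
log.revise("m1", read=True)
found = log.search(lambda r: r["read"])
log.archive("m1")
```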
Park, Chung Gyoo; Son, Joon-Ki; Lee, Byung-Ho; Cho, Jin Hoon; Ren, Yonglin
2014-12-01
Bursaphelenchus xylophilus, commonly known as the pinewood nematode, is a quarantine pest in Japan and is most often associated with beetles of the genus Monochamus, the pine sawyers, particularly Monochamus alternatus. Long-distance dispersal of the nematode and its vectors has led to widespread losses in pine forests. Two fumigation trials were conducted on logs naturally infested with both M. alternatus and B. xylophilus. The logs were treated with ethanedinitrile or metam sodium at low temperature (-7 to 25.7°C and -3.7 to 23.1°C) for a 3-d exposure in winter and early spring. Fumigation with ethanedinitrile at concentrations of 48, 68, 97 and 158 g/m³ resulted in 34.6-58.3, 91.5-97.2, 100, and 100% mortality for M. alternatus and 88.4, 77.9, 96.4, and 98.0% mortality for B. xylophilus, respectively. With metam sodium fumigation at a dose rate of 1162 g/m³, 100% of M. alternatus and 97.4% of B. xylophilus were killed. These results suggest that 97 g/m³ of ethanedinitrile is adequate for complete control of M. alternatus in pine wood, whereas 158 g/m³ is required to control >98% of B. xylophilus in winter or spring fumigation. Ethanedinitrile therefore has great potential for the treatment of fresh pine logs to manage the nematode and its vector insects at low temperature.
Likelihood Analysis of Supersymmetric SU(5) GUTs
Bagnaschi, E.; Costa, J. C.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M. J.; Ellis, J. R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K. A.; Richards, A.; de Vries, K. J.; Weiglein, G.
2016-10-31
We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has 7 parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously-identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $\tilde u_R/\tilde c_R - \tilde\chi^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde\nu_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.
Likelihood analysis of supersymmetric SU(5) GUTs
NASA Astrophysics Data System (ADS)
Bagnaschi, E.; Costa, J. C.; Sakurai, K.; Borsato, M.; Buchmueller, O.; Cavanaugh, R.; Chobanova, V.; Citron, M.; De Roeck, A.; Dolan, M. J.; Ellis, J. R.; Flächer, H.; Heinemeyer, S.; Isidori, G.; Lucio, M.; Martínez Santos, D.; Olive, K. A.; Richards, A.; de Vries, K. J.; Weiglein, G.
2017-02-01
We perform a likelihood analysis of the constraints from accelerator experiments and astrophysical observations on supersymmetric (SUSY) models with SU(5) boundary conditions on soft SUSY-breaking parameters at the GUT scale. The parameter space of the models studied has seven parameters: a universal gaugino mass $m_{1/2}$, distinct masses for the scalar partners of matter fermions in five- and ten-dimensional representations of SU(5), $m_5$ and $m_{10}$, and for the $\mathbf{5}$ and $\mathbf{\bar 5}$ Higgs representations $m_{H_u}$ and $m_{H_d}$, a universal trilinear soft SUSY-breaking parameter $A_0$, and the ratio of Higgs vevs $\tan\beta$. In addition to previous constraints from direct sparticle searches, low-energy and flavour observables, we incorporate constraints based on preliminary results from 13 TeV LHC searches for jets + MET events and long-lived particles, as well as the latest PandaX-II and LUX searches for direct Dark Matter detection. In addition to previously identified mechanisms for bringing the supersymmetric relic density into the range allowed by cosmology, we identify a novel $\tilde u_R/\tilde c_R - \tilde\chi^0_1$ coannihilation mechanism that appears in the supersymmetric SU(5) GUT model and discuss the role of $\tilde\nu_\tau$ coannihilation. We find complementarity between the prospects for direct Dark Matter detection and SUSY searches at the LHC.
Maintained Individual Data Distributed Likelihood Estimation (MIDDLE)
Boker, Steven M.; Brick, Timothy R.; Pritikin, Joshua N.; Wang, Yang; von Oertzen, Timo; Brown, Donald; Lach, John; Estabrook, Ryne; Hunter, Michael D.; Maes, Hermine H.; Neale, Michael C.
2015-01-01
Maintained Individual Data Distributed Likelihood Estimation (MIDDLE) is a novel paradigm for research in the behavioral, social, and health sciences. The MIDDLE approach is based on the seemingly impossible idea that data can be privately maintained by participants and never revealed to researchers, while still enabling statistical models to be fit and scientific hypotheses tested. MIDDLE rests on the assumption that participant data should belong to, be controlled by, and remain in the possession of the participants themselves. Distributed likelihood estimation refers to fitting statistical models by sending an objective function and vector of parameters to each participant's personal device (e.g., smartphone, tablet, computer), where the likelihood of that individual's data is calculated locally. Only the likelihood value is returned to the central optimizer. The optimizer aggregates likelihood values from responding participants and chooses new vectors of parameters until the model converges. A MIDDLE study provides significantly greater privacy for participants, automatic management of opt-in and opt-out consent, lower cost for the researcher and funding institute, and faster determination of results. Furthermore, if a participant opts into several studies simultaneously and opts into data sharing, these studies automatically have access to individual-level longitudinal data linked across all studies. PMID:26717128
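The distributed-likelihood loop described above can be sketched in a few lines. Everything below is a toy stand-in, not MIDDLE's actual implementation: Gaussian data with known variance, ten simulated "participants", and a coarse grid search in place of the real optimizer are all assumptions. The key property illustrated is that only scalar likelihood values ever cross the device boundary.

```python
import math
import random

random.seed(0)

# Hypothetical per-participant private data; in MIDDLE these values
# never leave the participants' own devices.
participants = [[random.gauss(5.0, 1.0) for _ in range(20)] for _ in range(10)]

def local_loglik(data, mu, sigma=1.0):
    # Runs on the participant's device; only this scalar is returned.
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

def total_loglik(mu):
    # Central optimizer aggregates one likelihood value per participant.
    return sum(local_loglik(d, mu) for d in participants)

# A grid search stands in for the real optimizer's parameter updates.
mu_hat = max((i / 100 for i in range(0, 1001)), key=total_loglik)
```

Because the Gaussian log-likelihood is maximized at the sample mean, the aggregated estimate matches what pooled data would give, without the raw data ever being pooled.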
1979-10-01
[Garbled excerpt of a PDP-11 assembly listing (report AFAL-TR-79-1075, amplitude-normalization routine dated 26-AUG-77); only repeated error-handling sections of the form "ERR: JMP @#4 ; ABORT TO MONITOR" are recoverable.]
14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.
Code of Federal Regulations, 2011 CFR
2011-01-01
... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...
14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.
Code of Federal Regulations, 2010 CFR
2010-01-01
... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...
14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.
Code of Federal Regulations, 2012 CFR
2012-01-01
... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...
14 CFR 135.443 - Airworthiness release or aircraft maintenance log entry.
Code of Federal Regulations, 2013 CFR
2013-01-01
... maintenance log entry. 135.443 Section 135.443 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION... release or aircraft maintenance log entry. (a) No certificate holder may operate an aircraft after... (2) An appropriate entry in the aircraft maintenance log. (b) The airworthiness release or log...
The fluid-compensated cement bond log
Nayfeh, T.H.; Wheelis, W.B. Jr.; Leslie, H.D.
1986-08-01
Simulations of cement bond logging (CBL) have shown that wellbore fluid effects can be segregated from the sonic-signal response to changing cement strengths. Traditionally, the effects have been considered negligible and CBLs have been interpreted as if water were in the wellbore. However, large variations in CBLs have become apparent with the increasing number of logs run in completion fluids such as CaCl2, ZnBr2, and CaBr2. To study wellbore fluid effects, physical and numerical models were developed that simulated the wellbore geometry. Measurements were conducted in 5-, 7-, and 9 5/8-in. casings for a range of wellbore fluid types, densities, and viscosities. Parallel numerical modeling used similar parameters. Results show that bond-log amplitudes varied dramatically with the wellbore fluid acoustic impedance; i.e., there was a 70% increase in signal amplitude for 11.5-lbm/gal (1370-kg/m³) CaCl2 over the signal amplitude in water. This led to the development of a fluid-compensated bond log that corrects the amplitude for the acoustic impedance of various wellbore fluids, thereby making the measurements more directly related to cement quality.
Likelihood analysis of earthquake focal mechanism distributions
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.; Jackson, David D.
2015-06-01
In an earlier paper we discussed forecasts of earthquake focal mechanisms and ways to test forecast efficiency. Several verification methods were proposed, but they were based on ad hoc, empirical assumptions, so their performance is questionable. We apply a conventional likelihood method to measure the skill of earthquake focal mechanism orientation forecasts. The advantage of such an approach is that earthquake rate prediction can be adequately combined with focal mechanism forecasts, if both are based on likelihood scores, resulting in a general forecast optimization. We measure the difference between two double-couple sources as the minimum rotation angle that transforms one into the other. We measure the uncertainty of a focal mechanism forecast (the variability), and the difference between observed and forecasted orientations (the prediction error), in terms of these minimum rotation angles. To calculate the likelihood score we need to compare actual forecasts or occurrences of predicted events with the null hypothesis that the mechanism's 3-D orientation is random (i.e., all orientations are equally probable). For 3-D rotations the random rotation angle distribution is not uniform. To better understand the resulting complexities, we calculate the information (likelihood) score for two theoretical rotational distributions (Cauchy and von Mises-Fisher), which are used to approximate earthquake source orientation patterns. We then calculate the likelihood score for earthquake source forecasts and for their validation by future seismicity data. Several issues need to be explored when analyzing observational results: their dependence on forecast and data resolution, the internal dependence of scores on the forecasted angle, and the random variability of likelihood scores. Here we propose a simple tentative solution, but extensive theoretical and statistical analysis is needed.
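The minimum rotation angle between two double-couple sources can be computed by minimizing the ordinary rotation angle over the double-couple symmetry group: the identity plus the three 180° rotations about the principal axes (a Klein four-group). The sketch below is a minimal illustration with hand-built rotation matrices; the axis convention and the example angles are assumptions for demonstration, not the full quaternion machinery used in the literature.

```python
import math

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    return [[a[j][i] for j in range(3)] for i in range(3)]

def rot(axis, deg):
    # Rotation matrix about a principal axis (0 = x, 1 = y, 2 = z).
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    m = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    i, j = (axis + 1) % 3, (axis + 2) % 3
    m[i][i], m[i][j], m[j][i], m[j][j] = c, -s, s, c
    return m

def rotation_angle(r):
    # Angle of a rotation matrix from its trace: cos(t) = (tr - 1) / 2.
    tr = r[0][0] + r[1][1] + r[2][2]
    return math.degrees(math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0))))

# Double-couple symmetry group: identity plus 180-degree turns about
# the three principal axes.
SYMMETRY = [rot(0, 0), rot(0, 180), rot(1, 180), rot(2, 180)]

def min_dc_angle(r1, r2):
    # Minimum rotation carrying one double-couple orientation to the other.
    rel = matmul(transpose(r1), r2)
    return min(rotation_angle(matmul(rel, s)) for s in SYMMETRY)

# Mechanisms differing by a 100-degree twist about the vertical axis are
# only 80 degrees apart once the symmetry is accounted for (280 = -80).
angle = min_dc_angle(rot(2, 0), rot(2, 100))
```

This symmetry reduction is why the maximum possible double-couple rotation angle is bounded well below 180°, which in turn shapes the non-uniform random rotation angle distribution mentioned in the abstract.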
Factors Influencing Likelihood of Voice Therapy Attendance.
Misono, Stephanie; Marmor, Schelomo; Roy, Nelson; Mau, Ted; Cohen, Seth M
2017-03-01
Objective: To identify factors associated with the likelihood of attending voice therapy among patients referred for it in the CHEER (Creating Healthcare Excellence through Education and Research) practice-based research network infrastructure. Study Design: Prospectively enrolled cross-sectional study. Setting: CHEER network of community and academic sites. Methods: Data were collected on patient-reported demographics, voice-related diagnoses, voice-related handicap (Voice Handicap Index-10), likelihood of attending voice therapy (VT), and opinions on factors influencing the likelihood of attending VT. The relationships between patient characteristics/opinions and the likelihood of attending VT were investigated. Results: A total of 170 patients with various voice-related diagnoses reported receiving a recommendation for VT. Of those, 85% indicated that they were likely to attend it, regardless of voice-related handicap severity. The most common factors influencing the likelihood of VT attendance were insurance/copay, relief that it was not cancer, and travel. Those who were not likely to attend VT identified, as important factors, unclear potential improvement, not understanding the purpose of therapy, and concern that it would be too hard. In multivariate analysis, factors associated with greater likelihood of attending VT included shorter travel distance, age (40-59 years), and being seen in an academic practice. Conclusions: Most patients reported plans to attend VT as recommended. Patients who intended to attend VT reported different considerations in their decision making from those who did not plan to attend. These findings may inform patient counseling and efforts to increase access to voice care.
New materials for fireplace logs
NASA Technical Reports Server (NTRS)
Kieselback, D. J.; Smock, A. W.
1971-01-01
Fibrous insulation and refractory concrete are used for logs as well as fireproof walls, incinerator bricks, planters, and roof shingles. The insulation is lighter and more shock resistant than fireclay. Lightweight slag bonded with refractory concrete serves as the aggregate.
Numerical likelihood analysis of cosmic ray anisotropies
Carlos Hojvat et al.
2003-07-02
A numerical likelihood approach to the determination of cosmic ray anisotropies is presented which offers many advantages over other approaches. It allows a wide range of statistically meaningful hypotheses to be compared even when full sky coverage is unavailable, can be readily extended in order to include measurement errors, and makes maximum unbiased use of all available information.
Growing local likelihood network: Emergence of communities
NASA Astrophysics Data System (ADS)
Chen, S.; Small, M.
2015-10-01
In many real situations, networks grow only via local interactions. New nodes are added to the growing network with information only pertaining to a small subset of existing nodes. Multilevel marketing, social networks, and disease models can all be depicted as growing networks based on local (network path-length) distance information. In these examples, all nodes whose distance from a chosen center is less than d form a subgraph. Hence, we grow networks with information only from these subgraphs. Moreover, we use a likelihood-based method, where at each step we modify the networks by changing their likelihood to be closer to the expected degree distribution. Combining the local information and the likelihood method, we grow networks that exhibit novel features. We discover that the likelihood method, over certain parameter ranges, can generate networks with highly modulated communities, even when global information is not available. Communities and clusters are abundant in real-life networks, and the method proposed here provides a natural mechanism for the emergence of communities in scale-free networks. In addition, the algorithmic implementation of network growth via local information is substantially faster than global methods and allows for the exploration of much larger networks.
Synthesizing Regression Results: A Factored Likelihood Method
ERIC Educational Resources Information Center
Wu, Meng-Jia; Becker, Betsy Jane
2013-01-01
Regression methods are widely used by researchers in many fields, yet methods for synthesizing regression results are scarce. This study proposes using a factored likelihood method, originally developed to handle missing data, to appropriately synthesize regression models involving different predictors. This method uses the correlations reported…
Assessing the Likelihood of Hand-to-Hand Cross-Transmission of Bacteria: An Experimental Study.
Bellissimo-Rodrigues, Fernando; Pires, Daniela; Soule, Hervé; Gayet-Ageron, Angèle; Pittet, Didier
2017-02-22
BACKGROUND Although the hands of healthcare workers (HCWs) are implicated in most episodes of healthcare-associated infection, the correlation between hand contamination and the likelihood of cross-transmission remains unknown. METHODS We conducted a laboratory-based study involving pairs of HCWs. The hands of one HCW (the transmitter) were contaminated with Escherichia coli ATCC 10536 before holding hands with another HCW (the host) for 1 minute. Meanwhile, the unheld hand of the transmitter was sampled. Afterward, the host's held hand was also sampled. Each experiment consisted of 4 trials with increasing concentrations of E. coli (10^3-10^6 colony-forming units [cfu]/mL). The primary outcome was the likelihood of transmission of at least 1 cfu from transmitter to host. We used a mixed logistic regression model with a random effect on the subject to assess the association between transmission and bacterial count on the transmitter's hands. RESULTS In total, 6 HCWs performed 30 experiments and 120 trials. The bacterial counts recovered from host hands were directly associated with the bacterial counts on transmitter hands; transmission was more likely when transmitter counts were >1 and ≤3 log10 cfu than when they were ≤1 log10 cfu. When transmitter contamination was <1 log10 cfu, no cross-transmission was detected. CONCLUSION There is a direct relationship between the bacterial burden on HCW hands and the likelihood of cross-transmission. Under the described conditions, at least 1 log10 cfu must be present on HCW hands to be potentially transmitted. Further studies are needed at the low contamination range. Infect Control Hosp Epidemiol 2017;1-6.
Predicting reservoir wettability via well logs
NASA Astrophysics Data System (ADS)
Feng, Cheng; Fu, Jinhua; Shi, Yujiang; Li, Gaoren; Mao, Zhiqiang
2016-06-01
Wettability is an important factor in controlling the distribution of oil and water. However, its evaluation has so far been a difficult problem because no log data can directly indicate it. In this paper, a new method is proposed for quantitatively predicting reservoir wettability via well log analysis. Specifically, based on the J function, diagenetic facies classification and piecewise power functions, capillary pressure curves are constructed from conventional logs and from a nuclear magnetic resonance (NMR) log, respectively. Under the influence of wettability, the latter is distorted while the former remains unaffected. Therefore, the ratio of the median radii obtained from the two kinds of capillary pressure curves is calculated to reflect wettability, and a quantitative relationship between this ratio and reservoir wettability is then established. Using low-permeability core sample capillary pressure curves, NMR T2 spectra, and contact angle experimental data from the bottom of the Upper Triassic reservoirs in the western Ordos Basin, China, the two capillary pressure curve construction models and the predictive wettability model are calibrated. The wettability model is verified against the Amott wettability index and the saturation exponent from resistivity measurements; the wettability levels they determine are comparable, indicating that the proposed model is quite reliable. The model also performs well in field application. Thus, the reservoir wettability prediction model proposed in this paper provides an effective tool for formation evaluation, field development, and the improvement of oil recovery.
Royse, D J; Rhodes, T W; Sanchez, J E
2002-01-01
Synthetic logs were vacuum-soaked or regular-soaked to determine the effects of soaking on yield and mushroom size, log weight variability and water distribution within the log. Yields (g/log) from substrates vacuum-soaked were higher by 26.7%, 18.6% and 35.8% (mean = 27.2%) for crops I, II and III, respectively, when compared with regular-soaked. However, mushroom size averaged only 11.2 g for vacuum-soaked logs vs 17 g for regular-soaked logs (51.8% larger for regular-soaked). The time required for vacuum-soaking logs was generally less than 3 min, compared with regular-soaking times ranging over 3-15 h. Water tended to accumulate more in the outside zone in the vacuum-soaked logs, compared with regular-soaked logs. Mean moisture contents for crops I and II for outside, middle and interior zones of vacuum-soaked logs were 66%, 47.5% and 42.2%, respectively, while regular-soaked logs for the same zones were 62.4%, 52.1% and 50.9%, respectively. Vacuum-soaked log weights had lower standard deviations than weights for regular-soaked logs in four out of six soaks, indicating a more uniform soaking process.
Well Logging and Logging Analysis of UHP metamorphic Rocks in CCSD Main Hole (0-2000m)
NASA Astrophysics Data System (ADS)
Pan, H.; Niu, Y.; Wang, W.; Zhu, L.; Xu, D.; Wu, H.; Li, S.; Luo, M.
2004-12-01
CCSD logging engineering brings together many modern technologies and employs various advanced logging tools to survey the sidewall continuously, obtaining physical, chemical, geometrical, and other in-situ information along the borehole profile. Well logging is therefore one of the most important and pivotal technologies in the CCSD project. The main logging methods in the CCSD main hole (CCSD-MH, 0-2000 m) are laterolog (Rd, Rs), gamma ray (GR), natural gamma-ray spectrometry (U, Th, K), density (DEN), photoelectric factor (Pe), compensated neutron (CNL), multipole array acoustic (Vp, Vs, Vst), simultaneous acoustic-resistivity imaging (Star-II), temperature (T), magnetic susceptibility (MS), three-component borehole magnetic, and redox potential logs. The various metamorphic rocks can be classified from the logging curves, and their physical parameters can be acquired by analyzing and statistically summarizing the log responses of the different rock types. Logging cross-plots allow study of the clustering of the metamorphic rocks' physical properties. Five lithologic segments can be identified from the logging curves. The GR, Th, U, and K values of segment 1 are lower than those of segments 3, 4, and 5 but higher than those of segment 2, while its DEN and Pe values are higher than those of segments 3, 4, and 5. The main rocks in segments 1-5 are eclogites, serpentinites, paragneiss, orthogneiss, and eclogites (containing silicon and muscovite), respectively. Eclogite generally contains rutile, silicon, muscovite, and other minerals, which have obvious responses on the log curves. Rutile-, ilmenite-, and pyrite-mineralized layers can be clearly demarcated using the DEN, Pe, and susceptibility log values; on a rutile-mineralized layer, for example, the log responses show obviously high density and Pe. The key data for the synthetic seismic record is wave impedance, which in this paper is calculated from the AC and DEN curves.
NASA Astrophysics Data System (ADS)
Pilon, Paul J.; Adamowski, Kaz
1993-03-01
Maximum likelihood and censored sample theory are applied to the log Pearson Type III (LP3) distribution for flood frequency analysis. The logarithmic likelihood functions are developed and solved in terms of fully specified floods, historical information, and the parameters to be estimated. The asymptotic standard error of estimate of the T-year flood is obtained using the general equation for the variance of estimate of a function. The variances and covariances of the parameters are obtained through inversion of Fisher's information matrix. Monte Carlo studies to verify the accuracy of the derived asymptotic expression for the standard errors of the 10-, 50-, 100-, and 500-year floods indicate that these are accurate for both Type I and Type II censored samples, while the bias is less than 2.5%. Subsequently, the Type II censored data were subjected to a random, multiplicative error. Results indicate that historical information contributes greatly to the accuracy of quantile estimation even when its measurement error becomes excessive.
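The structure of such a censored-sample likelihood can be illustrated compactly. The sketch below substitutes a normal distribution on log-flows for the fitted LP3 distribution purely to keep the example self-contained (the LP3 case swaps in the Pearson Type III density and CDF); the gauged record, perception threshold, historical period length, and exceedance count are all invented. Fully specified floods enter through the density, while historical years known only to lie below (or above) the threshold enter through CDF terms.

```python
import math
import random

random.seed(1)
# Hypothetical gauged record of log-transformed annual peak flows.
gauged = [random.gauss(3.0, 0.5) for _ in range(30)]

def norm_logpdf(x, mu, s):
    return -0.5 * math.log(2 * math.pi * s * s) - (x - mu) ** 2 / (2 * s * s)

def norm_logcdf(x, mu, s):
    return math.log(0.5 * (1.0 + math.erf((x - mu) / (s * math.sqrt(2.0)))))

def censored_loglik(mu, s, gauged, threshold, n_hist, n_exceed):
    # Fully specified floods enter through the density...
    ll = sum(norm_logpdf(x, mu, s) for x in gauged)
    # ...the n_hist - n_exceed historical years known only to lie below
    # the perception threshold enter through the CDF...
    ll += (n_hist - n_exceed) * norm_logcdf(threshold, mu, s)
    # ...and the counted exceedances through the survival function.
    ll += n_exceed * math.log(1.0 - math.exp(norm_logcdf(threshold, mu, s)))
    return ll

# Maximize over mu with the scale held fixed; a grid search stands in
# for a Newton-type solver of the likelihood equations.
mu_hat = max((i / 100 for i in range(200, 401)),
             key=lambda mu: censored_loglik(mu, 0.5, gauged, 4.0, 100, 2))
```

The historical CDF terms pull the estimate toward values consistent with only 2 exceedances in 100 years, which is how historical information sharpens quantile estimates beyond what the short gauged record alone provides.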
Lienkaemper, James J.; Williams, Patrick L.; Sickler, Robert R.; Fumal, Thomas E.
2005-01-01
This publication makes available a detailed trench log (sheets 1 and 2) of a 110-m trench we excavated in 2004 across a tectonic sag pond in the Hayward fault zone. Also included are revised stratigraphic unit descriptions from this fifth field season of subsurface investigation of the Hayward fault at Tyson's Lagoon (Tule Pond). Preliminary findings based on fieldwork done in 2000 have been published (Lienkaemper and others: data archive, 2002a; report, 2002b), as were the logs and data for 2001-2003 (Lienkaemper and others, 2003, L03). A continuous exposure of the geologic section across the entire pond made in 2004 (Fig. 1, 04A) has revealed some critical miscorrelations of units in the original online version of L03; users of those earlier trench data should therefore use only the 2005 revised version 2.0 of L03 for correlation purposes. Lienkaemper, Williams, and Sickler interpreted the geology and logged the trenches. Fumal did most of the trench photography. The Hayward fault is recognized to be among the most hazardous in the United States (Working Group on California Earthquake Probabilities, 2003). Establishing a chronology of prehistoric earthquakes, or paleoearthquakes, is of immediate use in resolving the likelihood of future large earthquakes on the Hayward fault. This document makes available geologic evidence for historical and prehistoric surface-rupturing earthquakes preserved at the site. A second, formal report on our conclusions based on these data is in preparation.
NASA Astrophysics Data System (ADS)
Chen, C. E.; Lorenzelli, F.; Hudson, R. E.; Yao, K.
2007-12-01
We investigate maximum likelihood (ML) direction-of-arrival (DOA) estimation of multiple wideband sources in the presence of unknown nonuniform sensor noise. A new closed-form expression for the direction-estimation Cramér-Rao bound (CRB) is derived. The performance of the conventional wideband uniform ML estimator under nonuniform noise is studied. To mitigate the performance degradation caused by the nonuniformity of the noise, a new deterministic wideband nonuniform ML DOA estimator is derived and two associated processing algorithms are proposed. The first algorithm is based on an iterative procedure which stepwise concentrates the log-likelihood function with respect to the DOAs and the noise nuisance parameters, while the second is a noniterative algorithm that maximizes the derived approximately concentrated log-likelihood function. The performance of the proposed algorithms is tested through extensive computer simulations. Simulation results show that the stepwise-concentrated ML algorithm (SC-ML) requires only a few iterations to converge, and that both the SC-ML and the approximately concentrated ML algorithm (AC-ML) attain a solution close to the derived CRB at high signal-to-noise ratio.
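The idea of stepwise concentration of a log-likelihood over noise nuisance parameters can be shown on a deliberately simplified model: a scalar signal parameter observed by two sensors with unknown, nonuniform noise powers. This is not the wideband DOA estimator itself; the steering gains, noise levels, and closed-form updates below are illustrative assumptions that only mirror the alternating structure of SC-ML.

```python
import random

random.seed(2)
# Two "sensors" with different (unknown) noise powers observing a common
# signal parameter theta (a scalar stand-in for the DOA parameter).
theta_true = 2.0
gains = [1.0, 0.5]          # known steering gains, purely illustrative
sigmas_true = [0.2, 1.0]    # nonuniform noise: sensor 2 is much noisier
data = [[gains[k] * theta_true + random.gauss(0, sigmas_true[k])
         for _ in range(200)] for k in range(2)]

def sc_ml(data, gains, iters=20):
    # Stepwise concentration: alternate closed-form ML updates of the
    # noise nuisance parameters and the signal parameter.
    theta = 0.0
    for _ in range(iters):
        # Concentrate the noise variances given theta (per-sensor ML).
        var = [sum((y - g * theta) ** 2 for y in ys) / len(ys)
               for ys, g in zip(data, gains)]
        # Update theta by variance-weighted least squares given the noise.
        num = sum(g * sum(ys) / v for ys, g, v in zip(data, gains, var))
        den = sum(len(ys) * g * g / v for ys, g, v in zip(data, gains, var))
        theta = num / den
    return theta, var

theta_hat, var_hat = sc_ml(data, gains)
```

The variance weighting is the payoff: the noisy sensor is automatically down-weighted, which is exactly the degradation-mitigation a uniform-noise ML estimator forgoes.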
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2013-09-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux focus on annual timescales, but potentially overlook important geomorphic responses on shorter time scales immediately following timber harvest. Sediment fluxes are commonly estimated from linear regression of intermittent measurements of water and sediment discharge using sediment rating curves (SRCs). However, these often unsatisfactorily reproduce non-linear effects such as discharge-load hystereses. We resolve such important dynamics from non-parametric Quantile Regression Forests (QRF) of high-frequency (3 min) measurements of stream discharge and sediment concentrations in similar-sized (~0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the Random Forest (RF) algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ~80% of the total sediment load was transported during rare but high-magnitude runoff events during only 5% of the monitoring period. The variability of sediment flux of these rare events spans four orders of magnitude. In particular, dry-season logging dampened the role of these rare, extreme sediment-transport events by increasing load efficiency during more moderate events. We show that QRF outperforms traditional SRCs in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment discharge at high temporal resolution.
Seasonal logging, process response, and geomorphic work
NASA Astrophysics Data System (ADS)
Mohr, C. H.; Zimmermann, A.; Korup, O.; Iroumé, A.; Francke, T.; Bronstert, A.
2014-03-01
Deforestation is a prominent anthropogenic cause of erosive overland flow and slope instability, boosting rates of soil erosion and concomitant sediment flux. Conventional methods of gauging or estimating post-logging sediment flux often focus on annual timescales but overlook potentially important process response on shorter intervals immediately following timber harvest. We resolve such dynamics with non-parametric quantile regression forests (QRF) based on high-frequency (3 min) discharge measurements and sediment concentration data sampled every 30-60 min in similar-sized (˜0.1 km2) forested Chilean catchments that were logged during either the rainy or the dry season. The method of QRF builds on the random forest algorithm, and combines quantile regression with repeated random sub-sampling of both cases and predictors. The algorithm belongs to the family of decision-tree classifiers, which allow quantifying relevant predictors in high-dimensional parameter space. We find that, where no logging occurred, ˜80% of the total sediment load was transported during extremely variable runoff events during only 5% of the monitoring period. In particular, dry-season logging dampened the relative role of these rare, extreme sediment-transport events by increasing load efficiency during more efficient moderate events. We show that QRFs outperform traditional sediment rating curves (SRCs) in terms of accurately simulating short-term dynamics of sediment flux, and conclude that QRF may reliably support forest management recommendations by providing robust simulations of post-logging response of water and sediment fluxes at high temporal resolution.
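The core QRF idea, keeping all training responses in each leaf so that conditional quantiles can be read off rather than only a conditional mean, can be sketched with a single tiny regression tree. A real QRF pools leaf samples across many bootstrapped, feature-subsampled trees; the single tree, the greedy variance-reduction splits, and the synthetic discharge-to-concentration data below are all simplifying assumptions for illustration.

```python
import random

random.seed(3)
# Synthetic discharge -> sediment-concentration data with skewed scatter,
# loosely mimicking a hysteresis-prone rating relationship (invented).
xs = [random.uniform(0.0, 10.0) for _ in range(400)]
ys = [0.5 * x ** 2 + random.expovariate(1.0 / (1.0 + x)) for x in xs]

def sse(group):
    # Sum of squared errors about the group mean.
    m = sum(ys[i] for i in group) / len(group)
    return sum((ys[i] - m) ** 2 for i in group)

def best_split(idx):
    # Greedy variance-reduction split, as in a CART regression tree.
    best_c, best_cost = None, float("inf")
    for c in sorted({xs[i] for i in idx})[:-1]:
        left = [i for i in idx if xs[i] <= c]
        right = [i for i in idx if xs[i] > c]
        if len(left) < 10 or len(right) < 10:
            continue
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_c, best_cost = c, cost
    return best_c

def grow(idx, depth):
    if depth == 0 or len(idx) < 20:
        return idx                  # leaf keeps ALL responses, not a mean
    c = best_split(idx)
    if c is None:
        return idx
    return (c,
            grow([i for i in idx if xs[i] <= c], depth - 1),
            grow([i for i in idx if xs[i] > c], depth - 1))

def predict_quantile(node, x, q):
    # Descend to the leaf region containing x, then read off a quantile.
    while isinstance(node, tuple):
        c, left, right = node
        node = left if x <= c else right
    vals = sorted(ys[i] for i in node)
    return vals[min(len(vals) - 1, int(q * len(vals)))]

tree = grow(list(range(len(xs))), depth=4)
median_hi = predict_quantile(tree, 9.0, 0.5)   # median at high discharge
upper_hi = predict_quantile(tree, 9.0, 0.9)    # upper tail at high discharge
median_lo = predict_quantile(tree, 1.0, 0.5)   # median at low discharge
```

Because the upper quantile at high discharge sits well above the median, the tree captures the skewed flux distribution that a mean-based rating curve would smear out, which is the property the study exploits for rare, high-magnitude events.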
Stepwise Signal Extraction via Marginal Likelihood
Du, Chao; Kao, Chu-Lan Michael
2015-01-01
This paper studies the estimation of stepwise signals. To determine the number and locations of change-points of a stepwise signal, we formulate a maximum marginal likelihood estimator, which can be computed with quadratic cost using dynamic programming. We carry out an extensive investigation of the choice of the prior distribution and study the asymptotic properties of the maximum marginal likelihood estimator. We propose to treat each possible set of change-points equally and adopt an empirical Bayes approach to specify the prior distribution of segment parameters. A detailed simulation study is performed to compare the effectiveness of this method with other existing methods. We demonstrate our method on single-molecule enzyme reaction data and on DNA array CGH data. Our study shows that this method is applicable to a wide range of models and offers appealing results in practice. PMID:27212739
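The quadratic-cost dynamic program behind such change-point estimators can be sketched as follows. For brevity this illustration substitutes a penalized least-squares segment cost for the paper's marginal likelihood (the recursion structure is the same); the data and penalty value are assumptions.

```python
# O(n^2) dynamic program for stepwise-signal segmentation, using a
# penalized least-squares cost as a stand-in for a marginal likelihood.
import numpy as np

def fit_stepwise(y, penalty=4.0):
    """Return change-point locations minimizing segment cost + penalty per split."""
    n = len(y)
    # Cumulative sums give O(1) evaluation of each segment's squared error.
    s1 = np.concatenate([[0.0], np.cumsum(y)])
    s2 = np.concatenate([[0.0], np.cumsum(np.square(y))])

    def seg_cost(i, j):  # sum of squared deviations of y[i:j] from its mean
        m = j - i
        return s2[j] - s2[i] - (s1[j] - s1[i]) ** 2 / m

    best = np.full(n + 1, np.inf)
    best[0] = -penalty           # so k segments pay (k - 1) penalties
    prev = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + seg_cost(i, j) + penalty
            if c < best[j]:
                best[j], prev[j] = c, i
    # Backtrack the optimal change-point locations.
    cps, j = [], n
    while j > 0:
        j = prev[j]
        if j > 0:
            cps.append(j)
    return sorted(cps)

y = np.r_[np.zeros(50), 5.0 * np.ones(50)]
print(fit_stepwise(y))  # → [50]
```

Replacing `seg_cost` with a negative log marginal likelihood of the segment, as the paper does, leaves the recursion and its quadratic cost unchanged.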
cosmoabc: Likelihood-free inference for cosmology
NASA Astrophysics Data System (ADS)
Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.
2015-05-01
Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python ABC sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow the incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy cluster number counts without computing the likelihood function.
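The ABC principle the sampler builds on can be illustrated with plain rejection sampling on a toy Gaussian model. cosmoabc itself uses the more efficient Population Monte Carlo variant; the model, summary statistic, distance, and tolerance below are all illustrative assumptions.

```python
# Minimal rejection-ABC sketch: accept a prior draw whenever its forward
# simulation lands close to the observed catalog in summary-statistic space.
import numpy as np

rng = np.random.default_rng(1)
observed = rng.normal(3.0, 1.0, size=200)      # "catalog" from an unknown mean

def simulate(theta, n=200):
    """Forward simulator: mock catalog given parameter theta."""
    return rng.normal(theta, 1.0, size=n)

def distance(a, b):
    """Distance between catalogs, compared via a summary statistic (the mean)."""
    return abs(a.mean() - b.mean())

accepted = []
while len(accepted) < 500:
    theta = rng.uniform(-10.0, 10.0)           # draw from a flat prior
    if distance(simulate(theta), observed) < 0.2:
        accepted.append(theta)

posterior_mean = np.mean(accepted)             # approximate posterior mean, ~3
```

Population Monte Carlo replaces the flat prior draws with an adaptively re-weighted importance distribution and a shrinking tolerance, which is what makes the approach viable for expensive cosmological simulators.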
Spectral likelihood expansions for Bayesian inference
NASA Astrophysics Data System (ADS)
Nagel, Joseph B.; Sudret, Bruno
2016-03-01
A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
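A minimal sketch of such a spectral likelihood expansion, assuming a uniform reference density on [-1, 1] and a toy Gaussian likelihood (both assumptions, not the paper's applications): by Legendre orthogonality, the evidence equals the zeroth expansion coefficient and the posterior mean is c1/(3 c0).

```python
# Spectral likelihood expansion sketch: fit the likelihood in an orthogonal
# (Legendre) basis by least squares, then read moments off the coefficients.
import numpy as np
from numpy.polynomial import legendre

theta = np.linspace(-1.0, 1.0, 2001)
like = np.exp(-0.5 * ((theta - 0.3) / 0.3) ** 2)   # toy Gaussian likelihood

# Linear least-squares fit in the Legendre basis (no MCMC involved).
coef = legendre.legfit(theta, like, deg=25)

# With a uniform reference density pi(theta) = 1/2 on [-1, 1]:
#   Z = integral of L * pi = c0          (integral of P_0 is 2, others 0)
#   E[theta] = c1 / (3 c0)               (integral of theta * P_1 is 2/3)
evidence = coef[0]
post_mean = coef[1] / (3.0 * coef[0])
```

This is the sense in which the posterior is emulated semi-analytically: every moment is a fixed linear or rational function of the expansion coefficients, so the only numerical work is one least-squares solve.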
Method for induced polarization logging
Vinegar, H.J.; Waxman, M.H.
1987-04-14
A method is described for generating a log of the formation phase shift, resistivity and spontaneous potential of an earth formation from data obtained from the earth formation with a multi-electrode induced polarization logging tool. The method comprises obtaining data samples from the formation at measurement points equally spaced in time of the magnitude and phase of the induced voltage and the magnitude and phase of the current supplied by a circuit through a reference resistance R_0 to a survey current electrode associated with the tool.
Likelihood-Based Climate Model Evaluation
NASA Technical Reports Server (NTRS)
Braverman, Amy; Cressie, Noel; Teixeira, Joao
2012-01-01
Climate models are deterministic, mathematical descriptions of the physics of climate. Confidence in predictions of future climate is increased if the physics are verifiably correct. A necessary (but not sufficient) condition is that past and present climate be simulated well. We quantify the likelihood that a summary statistic computed from a set of observations arises from a physical system with the characteristics captured by a model-generated time series. Given a prior on models, we can go further and obtain the posterior distribution of models given the observations.
Likelihood-based modification of experimental crystal structure electron density maps
Terwilliger, Thomas C.
2005-04-16
A maximum-likelihood method improves an electron density map of an experimental crystal structure. A likelihood of a set of structure factors {F_h} is formed for the experimental crystal structure as (1) the likelihood of having obtained an observed set of structure factors {F_h^OBS} if the structure factor set {F_h} were correct, and (2) the likelihood that an electron density map resulting from {F_h} is consistent with selected prior knowledge about the experimental crystal structure. The set of structure factors {F_h} is then adjusted to maximize its likelihood for the experimental crystal structure. An improved electron density map is constructed from the maximized structure factors.
Maximum likelihood continuity mapping for fraud detection
Hogden, J.
1997-05-01
The author describes a novel time-series analysis technique called maximum likelihood continuity mapping (MALCOM), and focuses on one application of MALCOM: detecting fraud in medical insurance claims. Given a training data set composed of typical sequences, MALCOM creates a stochastic model of sequence generation, called a continuity map (CM). A CM maximizes the probability of sequences in the training set given the model constraints. CMs can be used to estimate the likelihood of sequences not found in the training set, enabling anomaly detection and sequence prediction, two important aspects of data mining. Since MALCOM can be used on sequences of categorical data (e.g., sequences of words) as well as real-valued data, MALCOM is also a potential replacement for database search tools such as N-gram analysis. In a recent experiment, MALCOM was used to evaluate the likelihood of patient medical histories, where "medical history" is used to mean the sequence of medical procedures performed on a patient. Physicians whose patients had anomalous medical histories (according to MALCOM) were evaluated for fraud by an independent agency. Of the small sample (12 physicians) that has been evaluated, 92% have been determined fraudulent or abusive. Despite the small sample, these results are encouraging.
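MALCOM itself learns a continuous "continuity map"; as a hedged stand-in, the same likelihood-based anomaly-scoring idea can be illustrated with a smoothed bigram sequence model. The token vocabulary and procedure sequences below are invented for illustration.

```python
# Bigram sequence-likelihood scoring as a simple stand-in for MALCOM-style
# anomaly detection: low-likelihood sequences are candidate anomalies.
from collections import Counter
from math import log

train = [["exam", "xray", "cast"], ["exam", "lab", "rx"],
         ["exam", "xray", "cast"], ["exam", "lab", "rx"]]

bigrams = Counter((a, b) for seq in train for a, b in zip(seq, seq[1:]))
contexts = Counter(tok for seq in train for tok in seq[:-1])

def log_likelihood(seq, alpha=0.1, vocab=5):
    """Add-alpha smoothed bigram log-likelihood of a procedure sequence."""
    return sum(
        log((bigrams[(a, b)] + alpha) / (contexts[a] + alpha * vocab))
        for a, b in zip(seq, seq[1:]))

typical = log_likelihood(["exam", "xray", "cast"])    # high (seen in training)
anomalous = log_likelihood(["cast", "rx", "xray"])    # low (unseen transitions)
```

Ranking histories by such a likelihood, whatever the underlying sequence model, is the mechanism by which anomalous physicians were surfaced for review.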
Likelihood reinstates Archaeopteryx as a primitive bird.
Lee, Michael S Y; Worthy, Trevor H
2012-04-23
The widespread view that Archaeopteryx was a primitive (basal) bird has been recently challenged by a comprehensive phylogenetic analysis that placed Archaeopteryx with deinonychosaurian theropods. The new phylogeny suggested that typical bird flight (powered by the front limbs only) either evolved at least twice, or was lost/modified in some deinonychosaurs. However, this parsimony-based result was acknowledged to be weakly supported. Maximum-likelihood and related Bayesian methods applied to the same dataset yield a different and more orthodox result: Archaeopteryx is restored as a basal bird with bootstrap frequency of 73 per cent and posterior probability of 1. These results are consistent with a single origin of typical (forelimb-powered) bird flight. The Archaeopteryx-deinonychosaur clade retrieved by parsimony is supported by more characters (which are on average more homoplasious), whereas the Archaeopteryx-bird clade retrieved by likelihood-based methods is supported by fewer characters (but on average less homoplasious). Both positions for Archaeopteryx remain plausible, highlighting the hazy boundary between birds and advanced theropods. These results also suggest that likelihood-based methods (in addition to parsimony) can be useful in morphological phylogenetics.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 4 2013-10-01 2013-10-01 false Logs. 73.781 Section 73.781 Telecommunication... International Broadcast Stations § 73.781 Logs. The licensee or permittee of each international broadcast station must maintain the station log in the following manner: (a) In the program log: (1) An entry of...
29 CFR 1918.88 - Log operations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 7 2013-07-01 2013-07-01 false Log operations. 1918.88 Section 1918.88 Labor Regulations...) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Handling Cargo § 1918.88 Log operations. (a) Working in holds. When loading logs into the holds of vessels and using dumper devices to roll logs into the...
Using Web Logs in the Science Classroom
ERIC Educational Resources Information Center
Duplichan, Staycle C.
2009-01-01
As educators we must ask ourselves if we are meeting the needs of today's students. The science world is adapting to our ever-changing society; are the methodology and philosophy of our educational system keeping up? In this article, you'll learn why web logs (also called blogs) are an important Web 2.0 tool in your science classroom and how they…
Statistical log analysis made practical
Mitchell, W.K.; Nelson, R.J.
1991-06-01
This paper discusses the advantages of a statistical approach to log analysis. Statistical techniques use inverse methods to calculate formation parameters. The use of statistical techniques has been limited, however, by the complexity of the mathematics and lengthy computer time required to minimize traditionally used nonlinear equations.
Lysne, P.
1992-01-01
Logging technologies developed for hydrocarbon resource evaluation have not migrated into geothermal applications, even though data so obtained would strengthen reservoir characterization efforts. Two causative issues have impeded progress: (i) there is a general lack of vetted, high-temperature instrumentation, and (ii) the interpretation of log data generated in a geothermal formation is in its infancy. Memory-logging tools provide a path around the first obstacle by providing quality data at low cost. These tools feature on-board computers that process and store data, and newer systems may be programmed to make "decisions." Since memory tools are completely self-contained, they are readily deployed using the slick line found on most drilling locations. They have proven to be rugged, and a minimal training program is required for operator personnel. Present tools measure properties such as temperature and pressure, and the development of noise, deviation, and fluid conductivity logs based on existing hardware is relatively easy. A more complex geochemical tool aimed at a quantitative analysis of potassium, uranium, and thorium will be available in about one year, and it is expandable into all nuclear measurements common in the hydrocarbon industry. A second tool designed to sample fluids at conditions exceeding 400 °C is in the proposal stage. Partnerships are being formed between the geothermal industry, scientific drilling programs, and the national laboratories to define and develop inversion algorithms relating raw tool data to more pertinent information. 8 refs.
ERIC Educational Resources Information Center
Miles, Donna
2001-01-01
In response to high numbers of preventable fatal accidents in the logging industry, the Occupational Safety and Health Administration (OSHA) developed a week-long logger safety training program that includes hands-on learning of safety techniques in the woods. Reaching small operators has been challenging; outreach initiatives in Maine, North…
Maximal likelihood correspondence estimation for face recognition across pose.
Li, Shaoxin; Liu, Xin; Chai, Xiujuan; Zhang, Haihong; Lao, Shihong; Shan, Shiguang
2014-10-01
Due to the misalignment of image features, the performance of many conventional face recognition methods degrades considerably in the across-pose scenario. To address this problem, many image matching-based methods have been proposed to estimate semantic correspondence between faces in different poses. In this paper, we aim to solve two critical problems of previous image matching-based correspondence learning methods: 1) they fail to fully exploit face-specific structure information in correspondence estimation, and 2) they fail to learn a personalized correspondence for each probe image. To this end, we first build a model, termed morphable displacement field (MDF), to encode face-specific structure information of semantic correspondence from a set of real samples of correspondences calculated from 3D face models. Then, we propose a maximal likelihood correspondence estimation (MLCE) method to learn personalized correspondence based on a maximal-likelihood frontal face assumption. After obtaining the semantic correspondence encoded in the learned displacement, we can synthesize virtual frontal images of the profile faces for subsequent recognition. Using the linear discriminant analysis method with pixel-intensity features, state-of-the-art performance is achieved on three multipose benchmarks, i.e., the CMU-PIE, FERET, and MultiPIE databases. Owing to the rational MDF regularization and the use of the novel maximal likelihood objective, the proposed MLCE method can reliably learn correspondence between faces in different poses even in complex wild environments, i.e., the Labeled Faces in the Wild database.
Royle, J. Andrew; Sutherland, Christopher S.; Fuller, Angela K.; Sun, Catherine C.
2015-01-01
We develop a likelihood analysis framework for fitting spatial capture-recapture (SCR) models to data collected on class structured or stratified populations. Our interest is motivated by the necessity of accommodating the problem of missing observations of individual class membership. This is particularly problematic in SCR data arising from DNA analysis of scat, hair or other material, which frequently yields individual identity but fails to identify the sex. Moreover, this can represent a large fraction of the data and, given the typically small sample sizes of many capture-recapture studies based on DNA information, utilization of the data with missing sex information is necessary. We develop the class structured likelihood for the case of missing covariate values, and then we address the scaling of the likelihood so that models with and without class structured parameters can be formally compared regardless of missing values. We apply our class structured model to black bear data collected in New York in which sex could be determined for only 62 of 169 uniquely identified individuals. The models containing sex-specificity of both the intercept of the SCR encounter probability model and the distance coefficient, and including a behavioral response, are strongly favored by log-likelihood. The estimated population sex ratio is strongly influenced by sex structure in model parameters, illustrating the importance of rigorous modeling of sex differences in capture-recapture models.
Precision pressure/temperature logging tool
Henfling, J.A.; Normann, R.A.
1998-01-01
Past memory logging tools have provided excellent pressure/temperature data when used in a geothermal environment, and they are easier to maintain and deploy than tools requiring an electric wireline connection to the surface. However, they are deficient since the tool operator is unaware of downhole conditions that could require changes in the logging program. Tools that make "decisions" based on preprogrammed scenarios can partially overcome this difficulty, and a suite of such memory tools has been developed at Sandia National Laboratories. The first tool, which forms the basis for future instruments, measures pressure and temperature. Design considerations include minimization of cost while ensuring quality data, size compatibility with diamond-cored holes, operation in holes to 425 C (800 F), transportability by ordinary passenger air service, and ease of operation. This report documents the development and construction of the pressure/temperature tool. It includes: (1) description of the major components; (2) calibration; (3) typical logging scenario; (4) tool data examples; and (5) conclusions. The mechanical and electrical drawings, along with the tool's software, will be furnished upon request.
Chemical logging of geothermal wells
Allen, Charles A.; McAtee, Richard E.
1981-01-01
The presence of geothermal aquifers can be detected while drilling in geothermal formations by maintaining a chemical log of the ratio of the concentrations of calcium to carbonate and bicarbonate ions in the return drilling fluid. A continuous increase in the ratio of the concentrations of calcium to carbonate and bicarbonate ions is indicative of the existence of a warm or hot geothermal aquifer at some increased depth.
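The described chemical log reduces to tracking one concentration ratio over time and flagging a sustained rise. A toy sketch of that bookkeeping (all concentration values and the window length are hypothetical):

```python
# Toy chemical-log monitor: compute the Ca / (CO3 + HCO3) ratio in return
# drilling fluid and flag a continuous increase as a possible aquifer signal.
def ratio_log(ca, co3, hco3):
    """Per-sample ratio of calcium to combined carbonate/bicarbonate."""
    return [c / (a + b) for c, a, b in zip(ca, co3, hco3)]

def sustained_increase(r, window=3):
    """True if the ratio rose over each of the last `window` samples."""
    return len(r) > window and all(
        r[i] > r[i - 1] for i in range(len(r) - window, len(r)))

ca   = [1.0, 1.1, 1.3, 1.6, 2.0]   # hypothetical concentrations with depth
co3  = [0.5] * 5
hco3 = [0.5] * 5
print(sustained_increase(ratio_log(ca, co3, hco3)))  # → True
```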
Audit Log for Forensic Photography
NASA Astrophysics Data System (ADS)
Neville, Timothy; Sorell, Matthew
We propose an architecture for an audit log system for forensic photography, which ensures that the chain of evidence of a photograph taken by a photographer at a crime scene is maintained from the point of image capture to its end application at trial. The requirements for such a system are specified and the results of experiments are presented which demonstrate the feasibility of the proposed approach.
LIKEDM: Likelihood calculator of dark matter detection
NASA Astrophysics Data System (ADS)
Huang, Xiaoyuan; Tsai, Yue-Lin Sming; Yuan, Qiang
2017-04-01
With the large progress in searches for dark matter (DM) particles with indirect and direct methods, we develop a numerical tool that enables fast calculations of the likelihoods of specified DM particle models given a number of observational data, such as charged cosmic rays from space-borne experiments (e.g., PAMELA, AMS-02), γ-rays from the Fermi space telescope, and underground direct detection experiments. The purpose of this tool - LIKEDM, likelihood calculator for dark matter detection - is to bridge the gap between a particle model of DM and the observational data. The intermediate steps between these two, including the astrophysical backgrounds, the propagation of charged particles, the analysis of Fermi γ-ray data, as well as the DM velocity distribution and the nuclear form factor, have been dealt with in the code. We release the first version (v1.0) focusing on the constraints from indirect detection of DM with charged cosmic and gamma rays. Direct detection will be implemented in the next version. This manual describes the framework, usage, and related physics of the code.
Maximum likelihood decoding of Reed Solomon Codes
Sudan, M.
1996-12-31
We present a randomized algorithm which takes as input n distinct points (x_i, y_i), i = 1, ..., n, from F x F (where F is a field) and integer parameters t and d, and returns a list of all univariate polynomials f over F in the variable x of degree at most d which agree with the given set of points in at least t places (i.e., y_i = f(x_i) for at least t values of i), provided t = Ω(√(nd)). The running time is bounded by a polynomial in n. This immediately provides a maximum likelihood decoding algorithm for Reed-Solomon codes, which works in a setting with a larger number of errors than any previously known algorithm. To the best of our knowledge, this is the first efficient (i.e., polynomial time bounded) algorithm which provides some maximum likelihood decoding for any efficient (i.e., constant or even polynomial rate) code.
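The list-decoding task solved here can be illustrated by brute force over a small prime field: enumerate every polynomial of degree at most d and keep those agreeing with at least t of the points. This enumeration is exponential in d and serves only to make the problem statement concrete; Sudan's algorithm produces the same list in polynomial time.

```python
# Brute-force list decoding over GF(p) (illustration of the problem only,
# not Sudan's algorithm). The points and parameters below are invented.
from itertools import product

def list_decode(points, p, d, t):
    """All coefficient tuples (c0, ..., cd) agreeing with >= t points mod p."""
    hits = []
    for coeffs in product(range(p), repeat=d + 1):   # f(x) = c0 + c1*x + ...
        agree = sum(
            sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p == y
            for x, y in points)
        if agree >= t:
            hits.append(coeffs)
    return hits

# Points mostly on y = 1 + 2x over GF(5), with the x = 2 value corrupted.
pts = [(0, 1), (1, 3), (2, 3), (3, 2), (4, 4)]
print(list_decode(pts, p=5, d=1, t=4))  # → [(1, 2)]
```

Here the decoder recovers the unique line 1 + 2x despite one erroneous point; the abstract's contribution is doing this with running time polynomial in n whenever t = Ω(√(nd)).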
Multiplicative earthquake likelihood models incorporating strain rates
NASA Astrophysics Data System (ADS)
Rhoades, D. A.; Christophersen, A.; Gerstenberger, M. C.
2017-01-01